**Alternating and Symmetric Polynomials**

Consider the following recipe for building symmetric polynomials, using alternating polynomials. Start with the Vandermonde determinant

$$\Delta = \det \begin{pmatrix} 1 & x_1 & x_1^2 & \cdots & x_1^{n-1} \\ 1 & x_2 & x_2^2 & \cdots & x_2^{n-1} \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & x_n & x_n^2 & \cdots & x_n^{n-1} \end{pmatrix} = \prod_{1 \le i < j \le n} (x_j - x_i).$$

To see the last equality, note that the determinant is zero if $x_i = x_j$ for any $i \neq j$, so it is divisible by $x_j - x_i$ for every pair $i < j$, and all these factors are distinct. By degree-counting (the polynomial is evidently homogeneous of degree $0 + 1 + \cdots + (n-1) = \binom{n}{2}$), this is the whole thing, up to a scalar. Finally, to get the scalar, we do some computation (e.g. plugging in convenient values like $x_i = i$).
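For concreteness, here is the $n = 3$ case checked symbolically (the use of sympy here is my own choice, just for illustration):

```python
import sympy as sp

n = 3
xs = sp.symbols(f"x1:{n + 1}")  # (x1, x2, x3)

# Vandermonde matrix: row i is (1, x_i, x_i^2, ..., x_i^(n-1)).
V = sp.Matrix([[x**j for j in range(n)] for x in xs])

det = sp.expand(V.det())
prod = sp.expand(sp.Mul(*[xs[j] - xs[i]
                          for i in range(n) for j in range(i + 1, n)]))
assert det == prod  # the scalar really is 1
```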

If, instead of the row $(1, x_i, x_i^2, \ldots, x_i^{n-1})$, we used some other sequence of polynomials, like $(p_0(x_i), p_1(x_i), \ldots, p_{n-1}(x_i))$, the result would still be alternating in the $x_i$‘s, so it would still be divisible by the product above. However, the degree-counting argument might no longer be relevant. (For example, if we use $(x_i, x_i^2, \ldots, x_i^n)$, then the result is the original Vandermonde determinant times $x_1 x_2 \cdots x_n$.)
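The parenthetical example is easy to check symbolically (again a sympy sketch, for $n = 3$):

```python
import sympy as sp

xs = sp.symbols("x1:4")

V = sp.Matrix([[x**j for j in range(3)] for x in xs])       # rows (1, x_i, x_i^2)
W = sp.Matrix([[x**j for j in range(1, 4)] for x in xs])    # rows (x_i, x_i^2, x_i^3)

# Each row of W is x_i times the corresponding row of V, so the
# determinant picks up an extra factor of x1*x2*x3.
assert sp.expand(W.det() - xs[0] * xs[1] * xs[2] * V.det()) == 0
```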

Still, we can divide out the Vandermonde determinant, and (surprise!) the result will be a *symmetric* polynomial, since the sign-change is “divided out” as well.

For example, we might as well consider

$$a_{(\mu_1, \ldots, \mu_n)} = \det \big( x_i^{\mu_j} \big)_{1 \le i, j \le n},$$

for any sequence of exponents $\mu_1, \ldots, \mu_n$. Of course, the determinant will only be nonzero if they’re all distinct, and by rearranging we can assume $\mu_1 > \mu_2 > \cdots > \mu_n \ge 0$. It's worth noting that the "Vandermonde case" of $\mu = \delta = (n-1, n-2, \ldots, 1, 0)$ is the simplest one; in fact, $a_\delta$ is the unique(!) alternating polynomial of minimal degree, up to scaling. We may as well describe the others in terms of it: let's define

$$a_{\lambda + \delta} = \det \big( x_i^{\lambda_j + n - j} \big)_{1 \le i, j \le n},$$

for any partition $\lambda = (\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n \ge 0)$. (So the “zero partition” $\lambda = (0, \ldots, 0)$ corresponds to the Vandermonde determinant itself.) Now we get a class of symmetric polynomials

$$\frac{a_{\lambda + \delta}}{a_\delta},$$

each of which is homogeneous of degree $|\lambda| = \lambda_1 + \cdots + \lambda_n$. We have the following theorem:

**Theorem**. The ratio of alternants is the Schur polynomial:

$$\frac{a_{\lambda + \delta}}{a_\delta} = s_\lambda(x_1, \ldots, x_n).$$

In fact, this was the original definition of the Schur polynomials, before Young tableaux were used. The proof is by clearing the denominator and comparing monomial coefficients. As usual, it boils down to one particular computation, namely:

**Lemma**. Let $\lambda$ be a partition with at most $n$ parts, $\delta = (n-1, \ldots, 1, 0)$, and let $\mu$ be another such partition. Then

$$[x^{\mu + \delta}] \big( a_\delta \cdot s_\lambda \big) = \begin{cases} 1 & \text{if } \mu = \lambda, \\ 0 & \text{otherwise}. \end{cases}$$

This is proved using a sign-canceling involution on $\mathrm{SSYT}(\lambda) \setminus \{T_0\}$, where $T_0$ is the “highest-weight tableau”, whose $i$-th row is all $i$‘s. It’s in the same spirit as the Bender-Knuth involution: it changes the weight of each tableau from $\alpha$ to $t \cdot \alpha$, where $t = (i, i+1)$ is a transposition ($i$ is the first row where $\mathrm{wt}(T)$ differs from $\lambda$.) This makes all the terms cancel, except possibly the $T_0$ term itself (if it appears at all; it has weight $\lambda$). Finally, we check that $x^{\mu+\delta}$ can only show up in the $T_0$ term, and then only if $\mu = \lambda$.
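As a concrete check of the theorem, take $\lambda = (2,1)$ in three variables, where the tableau definition gives $s_{(2,1)} = m_{(2,1)} + 2\,m_{(1,1,1)}$. A sympy sketch:

```python
import sympy as sp
from itertools import permutations

xs = sp.symbols("x1:4")

def alternant(exponents):
    # det of the matrix with (i, j) entry x_i ** exponents[j]
    return sp.Matrix([[x**e for e in exponents] for x in xs]).det()

delta = (2, 1, 0)                                    # (n-1, ..., 1, 0) for n = 3
lam = (2, 1, 0)
shifted = tuple(l + d for l, d in zip(lam, delta))   # lambda + delta = (4, 2, 0)

ratio = sp.cancel(alternant(shifted) / alternant(delta))

# s_(2,1) from its tableau definition: m_(2,1) + 2 m_(1,1,1).
schur = sum(xs[0]**a * xs[1]**b * xs[2]**c
            for (a, b, c) in set(permutations((2, 1, 0)))) \
        + 2 * xs[0] * xs[1] * xs[2]

assert sp.expand(ratio - schur) == 0
```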

**Jacobi-Trudi and Pieri**

It’s time to discuss a couple of big formulas involving the Schur polynomials. The first, the Jacobi-Trudi formula, is theoretically useful, as it expresses $s_\lambda$ as a polynomial in the $h_k$‘s, and it has a beautiful proof from graph theory. The second, called the Pieri rule, is a useful formula for multiplying complicated Schur polynomials by simple ones; in particular, it’s way easier to use than the Littlewood-Richardson rule.

**Theorem (Jacobi-Trudi)**. The Schur polynomial $s_\lambda$ satisfies

$$s_\lambda = \det \big( h_{\lambda_i - i + j} \big)_{1 \le i, j \le n},$$

that is, it is the determinant of the matrix whose $i$-th row is

$$\big( h_{\lambda_i - i + 1},\ h_{\lambda_i - i + 2},\ \ldots,\ h_{\lambda_i - i + n} \big),$$

with $h_{\lambda_i}$ down the main diagonal. (By convention, $h_0 = 1$ and $h_k = 0$ for $k < 0$.)
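For $\lambda = (2,1)$ and $n = 3$, the determinant reduces to $s_{(2,1)} = h_1 h_2 - h_3$, which we can check against the ratio of alternants (sympy again, with the $h_k$ built by brute force):

```python
import sympy as sp
from itertools import combinations_with_replacement

xs = sp.symbols("x1:4")
n = 3

def h(k):
    # complete homogeneous h_k; by convention h_0 = 1 and h_k = 0 for k < 0
    if k < 0:
        return sp.Integer(0)
    return sum((sp.Mul(*c) for c in combinations_with_replacement(xs, k)),
               sp.Integer(0))

def alternant(exponents):
    return sp.Matrix([[x**e for e in exponents] for x in xs]).det()

lam = (2, 1, 0)
delta = (2, 1, 0)

# Entry (i, j) of the Jacobi-Trudi matrix is h_{lambda_i - i + j} (1-indexed).
JT = sp.Matrix([[h(lam[i] - (i + 1) + (j + 1)) for j in range(n)]
                for i in range(n)])

ratio = sp.cancel(
    alternant(tuple(l + d for l, d in zip(lam, delta))) / alternant(delta))
assert sp.expand(JT.det() - ratio) == 0
```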

The proof of Jacobi-Trudi uses the Lindstrom-Gessel-Viennot Lemma, which expresses the determinant as a weighted sum over collections of certain nonintersecting paths through a graph. Whenever the paths intersect, we flip them around each other and they cancel out. It’s a lovely graph-theoretic argument. I won’t attempt to reproduce the proof here (since it really needs a picture), but it’s worth knowing, particularly as the same approach gives rise to other determinantal formulas as well — generalizations of Jacobi-Trudi, for example, and the Cauchy-Binet formula (which directly establishes the multiplicativity / functoriality of the determinant).

This identity gives another, circuitous proof of the fact that the homogeneous symmetric polynomials $h_k$ generate the ring of symmetric polynomials: we know independently that the Schur polynomials are an additive basis, and now the Jacobi-Trudi identity expresses the Schurs as polynomials in the $h_k$‘s. There will be some other useful consequences later on. On a side note, since $h_k = s_{(k)}$, we can also think of this fact as showing that the $s_\lambda$‘s generate the ring as well.

One final fact: although the Littlewood-Richardson rule (briefly discussed last post) gives a general algorithm for multiplying Schur polynomials, there are other rules for special cases. One of these is the Pieri rule, which is much easier to understand, and computes $s_\lambda \cdot e_k$. I’ll discuss it in detail, as it gives some of the ‘flavor’ of working with alternating polynomials.

**Pieri Rule**. We have

$$s_\lambda \cdot e_k = \sum_\mu s_\mu,$$

where the sum runs over all $\mu \supseteq \lambda$ such that $\mu / \lambda$ consists of $k$ boxes in a (collection of) **vertical strip**(s). That is, no two of the boxes we added to $\lambda$ to get $\mu$ should be in the same row. (To remember this, keep in mind that the partition $(1^k)$, with $s_{(1^k)} = e_k$, is, itself, a vertical strip).
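For instance, with $\lambda = (2,1)$ and $k = 2$ in four variables, the vertical 2-strips give $\mu \in \{(3,2),\ (3,1,1),\ (2,2,1),\ (2,1,1,1)\}$. Checking this in sympy, with the Schur polynomials computed as ratios of alternants:

```python
import sympy as sp
from itertools import combinations

xs = sp.symbols("x1:5")           # four variables
delta = (3, 2, 1, 0)

def alternant(exponents):
    return sp.Matrix([[x**e for e in exponents] for x in xs]).det()

def schur(lam):
    shifted = tuple(l + d for l, d in zip(lam, delta))
    return sp.cancel(alternant(shifted) / alternant(delta))

e2 = sum(sp.Mul(*c) for c in combinations(xs, 2))   # squarefree, degree 2

lhs = sp.expand(schur((2, 1, 0, 0)) * e2)
# mu/lambda a vertical 2-strip: no two added boxes in the same row
rhs = sp.expand(schur((3, 2, 0, 0)) + schur((3, 1, 1, 0))
                + schur((2, 2, 1, 0)) + schur((2, 1, 1, 1)))
assert lhs == rhs
```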

*Proof*. We’ll use the ratio of alternants formula somewhat indirectly: we consider $a_{\lambda + \delta} \cdot e_k$ and decompose the result into a sum of terms of the form $a_{\mu + \delta}$.

First, let’s determine the coefficients of monomials in $a_{\lambda + \delta} \cdot e_k$. This is an alternating polynomial, so the coefficient of $x_1^{c_1} \cdots x_n^{c_n}$ vanishes if two $c_i$‘s are equal. So, up to sign changes, we can assume $c_1 < c_2 < \cdots < c_n$.

Now, we know that $a_{\lambda + \delta}$ is the determinant with entries $x_i^{\lambda_j + n - j}$. And $e_k$ is the sum of all squarefree monomials of total degree $k$. If we expand the determinant and the product, and consider the monomials we get, we see that we’re trying to get exponents

$$(c_1, \ldots, c_n) = w \cdot \overline{(\lambda + \delta)} + \epsilon,$$

where $c_1 < \cdots < c_n$, the bar $\overline{(\cdot)}$ means the reversal of the sequence (since we’re counting upwards), $w$ is a permutation of the entries, and the summand $\epsilon \in \{0,1\}^n$ has $k$ ones and all other entries $0$.

The key observation is: $\overline{(\lambda + \delta)}$ is in strictly-increasing order, and adding ones here and there can, at most, weaken certain inequalities (from strict to equality). It can never reverse them. In particular, if the end result $(c_1, \ldots, c_n)$ is again in strictly-increasing order, then:

- the permutation $w$ *must* be the identity,
- the ones must only be in entries $i$ such that the result is still strictly increasing.

In other words, either $c_i = \overline{(\lambda + \delta)}_i$ or $c_i = \overline{(\lambda + \delta)}_i + 1$, and the latter case happens precisely when "there's room for it" without breaking the inequalities.

Thinking pictorially, this means $(c_1, \ldots, c_n)$ corresponds to a new partition; that is,

$$(c_1, \ldots, c_n) = \overline{(\mu + \delta)},$$

with $\mu$ obtained by adding $k$ new boxes to $\lambda$ in a “vertical strip” (since each row increments by at most 1), and the placement rule is just guaranteeing that the new boxes result in a valid Young diagram. Moreover, each $\mu$ only arises once this way, since it comes from a unique choice of the ones.

To complete the proof, we double-check that this is equal to $\sum_\mu a_{\mu + \delta}$, the sum over the relevant partitions $\mu$. This is more or less immediate (it’s alternating, restrict to monomials with strictly increasing exponents, compare coefficients).

**Corollary**. The change-of-basis from the $e_\mu$‘s to the Schur polynomials is given by

$$e_\mu = \sum_\lambda K_{\lambda' \mu}\, s_\lambda,$$

where $K_{\lambda' \mu}$ is the number of SSYTs of shape $\lambda'$ (the transpose of $\lambda$) and weight $\mu$.

*Proof*. By definition, $e_\mu = e_{\mu_1} e_{\mu_2} \cdots$. Since $1 = s_\emptyset$, we can apply the Pieri rule repeatedly. The coefficient of $s_\lambda$ is, then, the number of ways of getting $\lambda$ by putting together a sequence of vertical strips of the appropriate sizes, starting with an empty tableau.

This is the same data needed to construct a “row-strict” tableau of shape $\lambda$ and weight $\mu$ (imagine filling in the $i$-th vertical strip with all $i$‘s). In other words, (by transposing, so as to be column-strict) the coefficient is the number of SSYTs of shape $\lambda'$ and weight $\mu$, i.e. the Kostka number $K_{\lambda' \mu}$.
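A small instance, once more in sympy: $e_{(2,1)} = e_2 e_1$, and the contributing shapes are $\lambda = (2,1)$ (with $\lambda' = (2,1)$, $K_{(2,1),(2,1)} = 1$) and $\lambda = (1,1,1)$ (with $\lambda' = (3)$, $K_{(3),(2,1)} = 1$), so we expect $e_2 e_1 = s_{(2,1)} + s_{(1,1,1)}$:

```python
import sympy as sp
from itertools import combinations

xs = sp.symbols("x1:4")
delta = (2, 1, 0)

def alternant(exponents):
    return sp.Matrix([[x**e for e in exponents] for x in xs]).det()

def schur(lam):
    shifted = tuple(l + d for l, d in zip(lam, delta))
    return sp.cancel(alternant(shifted) / alternant(delta))

def e(k):
    # elementary symmetric polynomial e_k: squarefree monomials of degree k
    return sum(sp.Mul(*c) for c in combinations(xs, k))

# e_(2,1) = e_2 * e_1 = s_(2,1) + s_(1,1,1), per the corollary.
assert sp.expand(e(2) * e(1) - schur((2, 1, 0)) - schur((1, 1, 1))) == 0
```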

**Some final remarks**

By now, we’ve computed many changes of basis: $s$ to $m$, and $e$ to $h$ (somewhat indirectly, since I only wrote out a polynomial change of basis, not an additive one). Now we also know $e$ to $s$.

Ignoring the somewhat-less-important power sum symmetric polynomials, we have almost every change of basis: we’re just missing $m$ to $s$ and $h$ to $s$. (It’s also easy to get $h$ to $m$.) Stay tuned!
