Last post, I introduced the ring $\Lambda$ of symmetric polynomials in infinitely many variables $x_1, x_2, \ldots$. Its elements are infinite sums of monomials (of bounded degree), symmetric in all the variables. I also defined the five most common bases for it: the monomial, elementary, (complete) homogeneous, power and Schur polynomials ($m_\lambda$, $e_\lambda$, $h_\lambda$, $p_\lambda$ and $s_\lambda$ for short). All of them are indexed by partitions $\lambda$, and in each case the basis element is homogeneous of (total) degree $|\lambda|$.

Now, I’m going to prove that each of these is a basis for $\Lambda$.

As I noted last post, the monomial symmetric polynomials are obviously a basis – if a symmetric polynomial $f$ includes some monomial term $x^\alpha$, then it must include the entire corresponding $m_\lambda$ (where $\lambda$ is just $\alpha$ sorted into decreasing order). Thus $f$ is a linear combination of monomial symmetric polynomials; and it’s again obvious that there are no relations between the $m_\lambda$’s.

Throughout this post, we’ll use the following key idea: let $\{v_i\}_{i \in I}$ be a basis for a vector space, and $\{w_i\}_{i \in I}$ a set of vectors indexed by the same set $I$. Suppose that $I$ is partially ordered, and that we can express the $w_i$’s in terms of the $v_i$’s in the form

$$w_i = c_{ii}\, v_i + \sum_{j < i} c_{ij}\, v_j, \qquad c_{ii} \text{ a unit}.$$

Then the $w_i$’s are also a basis. In fact, this says the change-of-basis matrix is upper-triangular (or maybe lower-triangular, depending on your conventions), with units down the diagonal, so it must be invertible. This is one of the nicest ways to establish (discover?) that something is a basis, since determinants are, in general, not always easy to compute.
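
Here’s a tiny illustrative sketch in Python (the matrix below is a made-up toy example, not anything canonical): inverting an upper-triangular matrix with $1$s on the diagonal by back-substitution never requires division, which is exactly why such a change of basis is invertible over any coefficient ring.

```python
# Toy change-of-basis matrix: each w_i is v_i plus a combination of strictly
# smaller v_j's, i.e. upper-triangular with 1s on the diagonal.
C = [[1, 4, 7],
     [0, 1, 5],
     [0, 0, 1]]

def invert_unitriangular(C):
    """Invert an upper-triangular matrix with unit diagonal by back-substitution.
    No division happens, so the inverse lives over the same ring (here, the integers)."""
    n = len(C)
    inv = [[int(i == j) for j in range(n)] for i in range(n)]  # start from the identity
    for i in reversed(range(n)):              # work from the bottom row up
        for j in range(i + 1, n):
            # subtract C[i][j] times the (already finished) row j of the inverse
            for k in range(n):
                inv[i][k] -= C[i][j] * inv[j][k]
    return inv

print(invert_unitriangular(C))   # [[1, -4, 13], [0, 1, -5], [0, 0, 1]] -- integer entries
```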

**The Elementary Symmetric Polynomials**

Let’s express the elementary symmetric polynomials $e_\lambda$ in the basis of $m_\mu$’s. The result will be “upper-triangular”, as described above. To begin, let’s write out the product $e_\lambda = e_{\lambda_1} e_{\lambda_2} \cdots e_{\lambda_k}$. For a finite set of indices $S$, let $x_S = \prod_{i \in S} x_i$ denote the corresponding (squarefree) monomial, so that $e_r = \sum_{|S| = r} x_S$.

To expand this out, we need to select, for each $i$, a set $S_i$ of size $\lambda_i$ (one term $x_{S_i}$ from each factor $e_{\lambda_i}$). We can visualize the result as a horizontally-infinite matrix of zeros and ones, where the rows correspond to the $S_i$’s and the columns to the variables, and we fill in $1$’s to indicate the contents of the $S_i$’s. For instance, with $\lambda = (3,2)$ and the choices $S_1 = \{1,2,4\}$, $S_2 = \{2,3\}$, we get

$$\begin{pmatrix} 1 & 1 & 0 & 1 & 0 & \cdots \\ 0 & 1 & 1 & 0 & 0 & \cdots \end{pmatrix}$$

This matrix corresponds to the monomial $x_1 x_2^2 x_3 x_4$. (It comes from expanding out the product $e_\lambda = e_3 e_2$ and picking the terms $x_1 x_2 x_4$ and $x_2 x_3$.) We see the following:

- The row sums are the sizes of the $S_i$’s, that is, the row sums are given by $\lambda$.
- The column sums are the multiplicities of the $x_j$’s, that is, they give the exponents of the resulting monomial.

In particular, by rearranging the columns, let’s assume that the column sums are weakly decreasing, so they form a partition $\mu$, giving rise to a monomial $x^\mu = x_1^{\mu_1} x_2^{\mu_2} \cdots$. The coefficient of this monomial, and by extension the coefficient of $m_\mu$ in $e_\lambda$, is *the number of $0$-$1$ matrices with row sums $\lambda$ and column sums $\mu$.* Call this number $M_{\lambda\mu}$, so:

$$e_\lambda = \sum_\mu M_{\lambda\mu}\, m_\mu.$$

But not every $m_\mu$ occurs in this sum.
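
(Before sorting out which $\mu$’s actually occur, here’s a quick brute-force sanity check of this count – a rough Python sketch in finitely many variables; the helper functions are ad hoc, just for illustration. It expands $e_{(2,1)}$ in four variables and compares the coefficient of $x_1 x_2 x_3$ with the number of $0$-$1$ matrices with row sums $(2,1)$ and column sums $(1,1,1)$.)

```python
from itertools import combinations, product
from collections import Counter

def e(r, n):
    """Elementary symmetric polynomial e_r in x_1..x_n, as a Counter
    mapping exponent tuples to coefficients."""
    poly = Counter()
    for S in combinations(range(n), r):
        poly[tuple(int(i in S) for i in range(n))] += 1
    return poly

def mul(p, q):
    """Multiply two polynomials stored as Counters of exponent tuples."""
    out = Counter()
    for a, ca in p.items():
        for b, cb in q.items():
            out[tuple(x + y for x, y in zip(a, b))] += ca * cb
    return out

def e_lambda(lam, n):
    """e_lambda = product of the e_{lambda_i}, in n variables."""
    poly = Counter({(0,) * n: 1})
    for part in lam:
        poly = mul(poly, e(part, n))
    return poly

def count_01_matrices(row_sums, col_sums):
    """Count 0-1 matrices with the given row and column sums (brute force)."""
    rows = [list(combinations(range(len(col_sums)), r)) for r in row_sums]
    count = 0
    for choice in product(*rows):
        sums = [0] * len(col_sums)
        for S in choice:
            for j in S:
                sums[j] += 1
        if sums == list(col_sums):
            count += 1
    return count

lam, mu, n = (2, 1), (1, 1, 1), 4
coeff = e_lambda(lam, n)[(1, 1, 1, 0)]      # coefficient of x1*x2*x3 in e_(2,1)
print(coeff, count_01_matrices(lam, mu))    # both print 3
```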

**Claim.** If $m_\mu$ occurs with a nonzero coefficient, then $\mu \le \lambda'$ in the *dominance* ordering, that is, for each $k$,

$$\mu_1 + \mu_2 + \cdots + \mu_k \;\le\; \lambda'_1 + \lambda'_2 + \cdots + \lambda'_k.$$

Here $\lambda'$ is the transpose of $\lambda$, obtained by flipping its Young diagram diagonally. (Dominance is a partial ordering, but not a total ordering.)

To see this, consider a matrix as above, with row sums $\lambda$ and column sums $\mu$. Within each row, rearranging the $1$s doesn’t change the row sum. So let’s move all the $1$s as far left as possible – so that row $i$ consists of $\lambda_i$ ones, followed by zeros.

What does this do? Well, we’ve only made the first few column sums larger, so the new column sums dominate the old ones: each partial sum $\mu_1 + \cdots + \mu_k$ is at most the corresponding partial sum of the new column sums. But, by our setup, the left-justified matrix is just the Young diagram of $\lambda$, so its column sums are the transpose $\lambda'$! So this says that $\mu \le \lambda'$ in dominance.

And, we also see that if $\mu = \lambda'$, then there’s only one possible matrix to get – namely, the left-justified one we just reduced to. In other words, $M_{\lambda\lambda'} = 1$.

So, by our discussion about upper-triangular changes of basis, this shows that the $e_\lambda$’s are upper-triangular in the $m_\mu$’s: $e_\lambda = m_{\lambda'} + \sum_{\mu < \lambda'} M_{\lambda\mu}\, m_\mu$. Since $\lambda \mapsto \lambda'$ is a bijection on the partitions of each $n$, the $e_\lambda$’s are indeed a basis.
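
Here’s a brute-force check of this triangularity, again just a rough sketch in finitely many variables: it expands $e_\lambda$ for $\lambda = (2,2,1)$, reads off the coefficient of each $m_\mu$, and confirms that every $\mu$ which appears is dominated by $\lambda'$, with the coefficient of $m_{\lambda'}$ equal to $1$.

```python
from itertools import combinations
from collections import Counter

N = 6                      # number of variables (plenty for partitions of 5)
lam = (2, 2, 1)            # a small test partition

def e(r):
    poly = Counter()
    for S in combinations(range(N), r):
        poly[tuple(int(i in S) for i in range(N))] += 1
    return poly

def mul(a, b):
    out = Counter()
    for u, cu in a.items():
        for v, cv in b.items():
            out[tuple(x + y for x, y in zip(u, v))] += cu * cv
    return out

def transpose(p):
    """Conjugate partition: the column lengths of the Young diagram."""
    return tuple(sum(1 for part in p if part > k) for k in range(max(p)))

def dominates(a, b):
    """Does a dominate b? Compare partial sums, padding with zeros."""
    n = max(len(a), len(b))
    a, b = list(a) + [0] * (n - len(a)), list(b) + [0] * (n - len(b))
    return all(sum(a[:k]) >= sum(b[:k]) for k in range(1, n + 1))

# Expand e_lambda and read off the m_mu coefficients.
e_lam = Counter({(0,) * N: 1})
for part in lam:
    e_lam = mul(e_lam, e(part))

coeffs = {}
for exps, c in e_lam.items():
    mu = tuple(sorted((x for x in exps if x > 0), reverse=True))
    coeffs[mu] = c          # symmetric, so every rearrangement gives the same c

lam_t = transpose(lam)                                   # (3, 2)
assert coeffs[lam_t] == 1                                # leading coefficient is 1
assert all(dominates(lam_t, mu) for mu in coeffs)        # everything else is dominated
print(sorted(coeffs.items()))
```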

**Final Remark.** This also shows that $\Lambda$ is generated as a *ring* by the $e_r$’s, since the $e_\lambda$’s are products of the $e_r$’s. Moreover, there can be no polynomial relations among the $e_r$’s either, since these would yield linear relations among the $e_\lambda$’s!

This shows:

**Theorem**. As rings, $\Lambda \cong \mathbb{Z}[e_1, e_2, e_3, \ldots]$, where the latter is a polynomial ring on the generators $e_1, e_2, e_3, \ldots$, with no relations.

Cool.

**The Homogeneous Symmetric Polynomials**

This will be much easier!

We’ll say goodbye to the $m$’s. Instead, we’ll relate the $h$’s to the $e$’s. Consider the following formal power series (from the ring $\Lambda[[t]]$):

$$E(t) = \sum_{r \ge 0} e_r t^r = \prod_{i \ge 1} (1 + x_i t), \qquad H(t) = \sum_{r \ge 0} h_r t^r = \prod_{i \ge 1} \frac{1}{1 - x_i t}.$$

Here we’re cleverly using the variable $t$ to index the total degree of each monomial.

Since the series $H(t)$ and $E(-t)$ are inverse to each other (factor by factor, $(1 - x_i t)^{-1}$ cancels against $(1 - x_i t)$), we multiply them out to get

$$H(t)\, E(-t) = \left( \sum_{r \ge 0} h_r t^r \right) \left( \sum_{s \ge 0} (-1)^s e_s t^s \right) = 1.$$

Then we compare term-by-term in the variable $t$: the coefficient of $t^n$ must vanish for every $n \ge 1$, and we find:

$$\sum_{r=0}^{n} (-1)^r e_r\, h_{n-r} = 0 \qquad (n \ge 1).$$

This is a sort of ring-theoretic version of being upper-triangular, since the equation can be rearranged into the form

$$h_n = (-1)^{n-1} e_n \,+\, \text{lower terms},$$

where “lower terms” means products of $e_r$’s and $h_s$’s with $r, s < n$.
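
As a sanity check, the identity $\sum_{r=0}^{n} (-1)^r e_r h_{n-r} = 0$ is easy to verify by brute force in finitely many variables. Here’s a minimal Python sketch, doing the polynomial arithmetic naively with dictionaries of exponent vectors:

```python
from itertools import combinations, combinations_with_replacement
from collections import Counter

N = 4  # number of variables; a truncation of the infinite-variable ring

def e(r):
    """Elementary symmetric polynomial e_r in N variables."""
    poly = Counter()
    for S in combinations(range(N), r):
        poly[tuple(int(i in S) for i in range(N))] += 1
    return poly

def h(r):
    """Complete homogeneous symmetric polynomial h_r in N variables."""
    poly = Counter()
    for M in combinations_with_replacement(range(N), r):
        exps = [0] * N
        for i in M:
            exps[i] += 1
        poly[tuple(exps)] += 1
    return poly

def mul(a, b):
    out = Counter()
    for u, cu in a.items():
        for v, cv in b.items():
            out[tuple(x + y for x, y in zip(u, v))] += cu * cv
    return out

# Check sum_{r=0}^{n} (-1)^r e_r h_{n-r} = 0 for n = 1..N.
for n in range(1, N + 1):
    total = Counter()
    for r in range(n + 1):
        for exps, c in mul(e(r), h(n - r)).items():
            total[exps] += (-1) ** r * c
    assert all(c == 0 for c in total.values()), f"failed at n={n}"
print("identity verified for n = 1 ..", N)
```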

In particular, any polynomial expression in the $h$’s can be converted into one in the $e$’s, and vice versa. This shows:

**Theorem**. The $h_r$’s generate $\Lambda$ as a ring.

Again by dimension counting in each degree (there are exactly as many $h_\lambda$’s of degree $n$ as there are $m_\lambda$’s), we get:

**Theorem**. The $h_\lambda$’s are a basis for $\Lambda$ as a vector space. In particular, $\Lambda$ is free (i.e. a polynomial ring) on the $h_r$’s.

Done! That was easier. For fun and while it’s on my mind, I’ll remark the following:

**Proposition (The involution)**. Let $\omega : \Lambda \to \Lambda$ be the ring map defined by sending $e_r \mapsto h_r$. (This is allowed because $\Lambda$ is a polynomial ring on the $e_r$’s.) Then $\omega$ also sends $h_r \mapsto e_r$. In particular, it is an involution.

*Proof*. The relations we found above are symmetric in the $e$’s and the $h$’s: applying $\omega$ to $\sum_{r=0}^{n} (-1)^r e_r h_{n-r} = 0$ gives $\sum_{r=0}^{n} (-1)^r h_r\, \omega(h_{n-r}) = 0$, while reindexing the original relation gives $\sum_{r=0}^{n} (-1)^r h_r\, e_{n-r} = 0$. That is, $\omega(h_n)$ satisfies the same relations (relative to the $h$’s) as $e_n$ does. Hence, working inductively on $n$, we must have $\omega(h_n) = e_n$.
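
Here’s a small symbolic check of this (assuming SymPy is available; the variable names are just for the sketch). It unwinds the recursion $h_n = \sum_{r=1}^{n} (-1)^{r-1} e_r h_{n-r}$ and its mirror image, and confirms that swapping $e_r \leftrightarrow h_r$ in one expression gives the other.

```python
from sympy import symbols, expand

N = 5
e = [None] + list(symbols(f"e1:{N + 1}"))   # formal generators e_1..e_N
h = [None] + list(symbols(f"h1:{N + 1}"))   # formal generators h_1..h_N

# h_n as a polynomial in the e's, via  h_n = sum_{r=1}^{n} (-1)^(r-1) e_r h_{n-r}.
h_in_e = {0: 1}
for n in range(1, N + 1):
    h_in_e[n] = expand(sum((-1) ** (r - 1) * e[r] * h_in_e[n - r]
                           for r in range(1, n + 1)))

# e_n as a polynomial in the h's, via the mirror-image recursion.
e_in_h = {0: 1}
for n in range(1, N + 1):
    e_in_h[n] = expand(sum((-1) ** (r - 1) * h[r] * e_in_h[n - r]
                           for r in range(1, n + 1)))

# Swapping e_r <-> h_r in one expression should give the other.
swap = {e[r]: h[r] for r in range(1, N + 1)}
swap.update({h[r]: e[r] for r in range(1, N + 1)})
for n in range(1, N + 1):
    assert expand(h_in_e[n].xreplace(swap) - e_in_h[n]) == 0
print("omega(h_n) = e_n checked for n <=", N)
```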

The involution $\omega$ (as it’s called) is a fairly deep operation: we’ll see later that it sends the Schur polynomial $s_\lambda$ to $s_{\lambda'}$, indexed by the transpose partition. It will have interesting properties in the representation theory of the symmetric group (where it corresponds to tensoring with the alternating representation) and of $GL_n$ (where it’s much stranger, and corresponds to switching between exterior powers and symmetric powers… somehow it switches between ‘symmetry’ and ‘alternation’. If anyone reads this and has thoughts on it, I’d love to hear them).

Moving on…

**The Power Symmetric Polynomials**

This will be like the previous approach (via generating functions), only… powered-up.

First, let’s take

$$P(t) = \sum_{r \ge 1} (-1)^{r-1} p_r\, t^{r-1} = \sum_{i \ge 1} \sum_{r \ge 1} (-1)^{r-1} x_i^r\, t^{r-1}.$$

(The reason for the off-by-one exponents will be clear in a moment.) The summand at the end is an (alternating) geometric series, so

$$P(t) = \sum_{i \ge 1} \frac{x_i}{1 + x_i t}.$$

This is looking similar to the generating function $E(t) = \prod_i (1 + x_i t)$ for the $e$’s above, but the latter is a product, not a sum. So, let’s integrate! (The extra factor of $x_i$ lets us use the chain rule: $\frac{x_i}{1 + x_i t} = \frac{d}{dt} \log(1 + x_i t)$.) Let $L(t)$ be the antiderivative of $P(t)$ (with constant term $0$), so

$$L(t) = \sum_{r \ge 1} (-1)^{r-1} \frac{p_r}{r}\, t^r = \sum_{i \ge 1} \log(1 + x_i t) = \log E(t).$$

Well, good enough anyways – exponentiating, we’ve got the generating function for the $e$’s. In particular,

$$E(t) = \exp\!\left( \sum_{r \ge 1} (-1)^{r-1} \frac{p_r}{r}\, t^r \right).$$

Expanding out this exponential is a bit of a pain, but we don’t have to. Let’s differentiate both sides of $\log E(t) = L(t)$ with respect to $t$. We get:

$$\frac{E'(t)}{E(t)} = P(t), \qquad \text{i.e.} \qquad E'(t) = P(t)\, E(t).$$

Now, we combine the last *two* equations – writing $E(t) = \sum_{r \ge 0} e_r t^r$, so that $E'(t) = \sum_{n \ge 1} n\, e_n t^{n-1}$ – to get:

$$\sum_{n \ge 1} n\, e_n\, t^{n-1} = \left( \sum_{r \ge 1} (-1)^{r-1} p_r\, t^{r-1} \right) \left( \sum_{s \ge 0} e_s\, t^s \right).$$

Now we equate coefficients of $t^{n-1}$ on both sides, and we’re done. The punch line is:

**Claim (Newton’s Identities)**. The $p$’s and $e$’s satisfy:

$$n\, e_n = \sum_{r=1}^{n} (-1)^{r-1}\, p_r\, e_{n-r} \qquad (n \ge 1).$$

Since the coefficient $n$ shows up, the $p$’s are not quite as nice: solving for the $e$’s in terms of the $p$’s forces us to divide by integers (for instance, $e_2 = \tfrac{1}{2}(p_1^2 - p_2)$). We have:

**Theorem**. The $p_r$’s freely generate $\Lambda \otimes \mathbb{Q}$ as a polynomial ring, and likewise over any characteristic zero field (but not over $\mathbb{Z}$, nor in positive characteristic). The $p_\lambda$’s generate additively – that is, they are a vector-space basis – under the same circumstances.
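
Here’s a brute-force check of Newton’s identities in finitely many variables, in the same naive dictionary-of-exponents style as before (just a sketch, not meant to be efficient):

```python
from itertools import combinations
from collections import Counter

N = 5  # number of variables

def e(r):
    """Elementary symmetric polynomial e_r in N variables."""
    poly = Counter()
    for S in combinations(range(N), r):
        poly[tuple(int(i in S) for i in range(N))] += 1
    return poly

def p(r):
    """Power sum p_r = x_1^r + ... + x_N^r."""
    poly = Counter()
    for i in range(N):
        exps = [0] * N
        exps[i] = r
        poly[tuple(exps)] += 1
    return poly

def mul(a, b):
    out = Counter()
    for u, cu in a.items():
        for v, cv in b.items():
            out[tuple(x + y for x, y in zip(u, v))] += cu * cv
    return out

# Check  n*e_n = sum_{r=1}^{n} (-1)^(r-1) p_r e_{n-r}  for n = 1..N.
for n in range(1, N + 1):
    lhs = Counter({k: n * c for k, c in e(n).items()})
    rhs = Counter()
    for r in range(1, n + 1):
        for k, c in mul(p(r), e(n - r)).items():
            rhs[k] += (-1) ** (r - 1) * c
    assert all(lhs[k] == rhs[k] for k in set(lhs) | set(rhs)), f"failed at n={n}"
print("Newton's identities verified for n = 1 ..", N)
```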

As a bonus, we can understand their properties under the involution:

**Theorem**. The involution $\omega$ sends $p_r \mapsto (-1)^{r-1} p_r$.

This follows from the symmetry between the $e$’s and $h$’s, and Newton’s identity above: the same derivation with $H(t)$ in place of $E(t)$ gives $n\, h_n = \sum_{r=1}^{n} p_r\, h_{n-r}$ (no signs this time), and applying $\omega$ to one identity and comparing with the other pins down $\omega(p_r) = (-1)^{r-1} p_r$, inductively.
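
And here’s a symbolic check of that sign (again assuming SymPy; the notation is just for the sketch): express $p_n$ in terms of the $e$’s and in terms of the $h$’s via the two forms of Newton’s identity, and confirm that swapping $e_r \leftrightarrow h_r$ multiplies the expression by $(-1)^{n-1}$.

```python
from sympy import symbols, expand

N = 5
e = [None] + list(symbols(f"e1:{N + 1}"))   # formal generators e_1..e_N
h = [None] + list(symbols(f"h1:{N + 1}"))   # formal generators h_1..h_N

# p_n in terms of the e's:  p_n = (-1)^(n-1) * ( n*e_n - sum_{r<n} (-1)^(r-1) p_r e_{n-r} ).
p_in_e = {}
for n in range(1, N + 1):
    p_in_e[n] = expand((-1) ** (n - 1) * (n * e[n]
                       - sum((-1) ** (r - 1) * p_in_e[r] * e[n - r] for r in range(1, n))))

# p_n in terms of the h's:  p_n = n*h_n - sum_{r<n} p_r h_{n-r}.
p_in_h = {}
for n in range(1, N + 1):
    p_in_h[n] = expand(n * h[n] - sum(p_in_h[r] * h[n - r] for r in range(1, n)))

# Swapping e_r <-> h_r should send p_n to (-1)^(n-1) * p_n.
swap = {e[r]: h[r] for r in range(1, N + 1)}
swap.update({h[r]: e[r] for r in range(1, N + 1)})
for n in range(1, N + 1):
    assert expand(p_in_e[n].xreplace(swap) - (-1) ** (n - 1) * p_in_h[n]) == 0
print("omega(p_n) = (-1)^(n-1) p_n checked for n <=", N)
```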

**Conclusions**

I still have to deal with the Schur polynomials, but they deserve their own post – showing that they are symmetric, that they are a basis (not too hard actually – but it’s similar to the matrix-based argument for the first proof above), and that they have other neat properties. So, I’ll stop here for now.
