A symmetric polynomial is a polynomial which doesn’t change under permutations of the variables. So $f(x, y) = f(y, x)$, $f(x, y, z) = f(z, y, x)$, and so on. Because of the simplicity of the definition (and the ubiquity of polynomials in math), symmetric polynomials show up in a variety of fields: algebraic geometry and combinatorics, Galois theory, birational geometry and intersection theory, linear algebra, and representation theory — to name a few.

Perhaps the most well-known are the **elementary symmetric polynomials**:

$$e_k = \sum_{i_1 < i_2 < \cdots < i_k} x_{i_1} x_{i_2} \cdots x_{i_k}.$$

So, if our variables are $x, y, z$, then:

$$e_1 = x + y + z, \qquad e_2 = xy + xz + yz, \qquad e_3 = xyz.$$

(With only three variables, $e_k = 0$ for $k > 3$.)

The elementary symmetric polynomials are well-known because they relate the *coefficients* of (single-variable) polynomials to their *roots*. Namely, if $f$ is a polynomial of degree $n$ with roots $r_1, \ldots, r_n$, and leading coefficient $c$, then

$$f(x) = c \prod_{i=1}^n (x - r_i) = c \sum_{k=0}^n (-1)^k e_k \, x^{n-k}.$$

It’s easy to see this – just expand the product and collect terms. Here, of course, $e_k$ is shorthand for $e_k(r_1, \ldots, r_n)$. There are so many cases where it’s useful to relate the roots and coefficients of polynomials that it’s hard to enumerate them! Think eigenvalues, algebraic numbers, interpolation problems, not to mention things like Chern classes and Schubert calculus (the latter of which I’ll get into on this blog soon enough).
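As a quick sanity check, here’s a small Python sketch (the helper names are mine, purely illustrative) that builds a polynomial’s coefficients from its roots via this formula, and compares against a hand-expanded example:

```python
from itertools import combinations
from math import prod

def elem_sym(roots, k):
    """e_k(roots): sum of products over all k-element subsets of the roots."""
    return sum(prod(c) for c in combinations(roots, k))

def coeffs_from_roots(roots, lead=1):
    """Coefficients of lead * (x - r_1) * ... * (x - r_n), highest degree first:
    the x^(n-k) coefficient is lead * (-1)^k * e_k(roots)."""
    n = len(roots)
    return [lead * (-1) ** k * elem_sym(roots, k) for k in range(n + 1)]

# (x - 1)(x - 2)(x - 3) = x^3 - 6x^2 + 11x - 6:
print(coeffs_from_roots([1, 2, 3]))  # [1, -6, 11, -6]
```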

In this post, I want to sketch out the basic theory of symmetric polynomials, from an algebraic and combinatorial viewpoint.

**The Ring of Symmetric Polynomials**

As we saw above, some of the $e_k$’s are zero if we don’t have enough variables, even though they are nonzero when we have more. There’s a neat approach that lets us avoid, or at least postpone, this sort of technicality. Instead, we’ll be able to deal with “all” symmetric polynomials, in infinitely-many (or at least arbitrarily-many) variables all at once.

To do this, we’ll define $\Lambda$, the ring of symmetric polynomials in infinitely-many variables $x_1, x_2, \ldots$. Note that $\Lambda$ does not consist of polynomials in the usual sense – its elements are things like

$$e_2 = \sum_{i < j} x_i x_j = x_1 x_2 + x_1 x_3 + x_2 x_3 + \cdots$$

or

$$p_3 = \sum_i x_i^3 = x_1^3 + x_2^3 + x_3^3 + \cdots$$

(The latter is the third **power symmetric polynomial**.)

In particular, $\Lambda$ is not contained in the ring $k[x_1, x_2, \ldots]$, since its elements are infinite sums of monomials. On the other hand, it’s not totally crazy — all the elements will still have finite degree, so there are no infinite monomials and no power series.

Still, we have to find another way to construct $\Lambda$ than as a subring of an ordinary ring of polynomials. So, consider the rings of symmetric polynomials in finitely-many variables,

$$\Lambda_n = k[x_1, \ldots, x_n]^{S_n},$$

with transition maps $\Lambda_{n+1} \to \Lambda_n$ given by setting $x_{n+1} = 0$. The inverse limit, $\varprojlim_n \Lambda_n$, is too big, since it actually does include the power series and infinite monomials we wanted to avoid. For example, it includes crazy stuff like

$$e_1 + e_2 + e_3 + \cdots = \sum_k e_k.$$

So, we’ll take $\Lambda \subseteq \varprojlim_n \Lambda_n$ to be the subring consisting of (inverse limits of) polynomials of bounded degree. (That is, for each $f \in \Lambda$, there should be a single upper bound for the degree of every monomial in $f$.)

Of course, in practice we can specialize to $\Lambda_n$ for $n$ sufficiently large whenever it’s convenient. (Generally, we’ll do this when it’s convenient to think explicitly in terms of the variables $x_i$, rather than things like $e_k$ and $p_k$.)

**Additive and Ring Bases for $\Lambda$**

Despite being infinitely-generated, $\Lambda$ has a lot of structure that keeps it pretty manageable. For one thing, it’s graded by degree, and in each degree it’s a finite-dimensional $k$-vector space, since there are only finitely-many “types” of monomial of a given total degree. (We’ll formalize this in a moment.) For example, there’s only one linear symmetric polynomial (up to scaling): $e_1 = x_1 + x_2 + x_3 + \cdots$.
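To make the “finitely-many types of monomial” point concrete: the types of monomial of total degree $n$ are exactly the partitions of $n$, which a short recursion can count (the function name is mine, for illustration):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def partitions(n, max_part=None):
    """Number of partitions of n into parts of size at most max_part."""
    if max_part is None:
        max_part = n
    if n == 0:
        return 1
    # Classify by the largest part k, then partition the rest into parts <= k.
    return sum(partitions(n - k, k) for k in range(1, min(n, max_part) + 1))

print([partitions(n) for n in range(8)])  # [1, 1, 2, 3, 5, 7, 11, 15]
```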

What I want to do now is give several very natural (and one slightly less natural) bases for $\Lambda$, both additively and as a ring. Note: for simplicity, I’m working with coefficients from a field. Working over $\mathbb{Z}$ is fine for all the bases except one, the power symmetric basis. Likewise, working in positive characteristic causes problems with that basis. Oh well – let’s just stick to (characteristic zero) fields, or gloss over the choice of coefficients.

Throughout, let $\lambda = (\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_k)$ be a partition.

Here are the bases, called $m_\lambda, e_\lambda, h_\lambda, p_\lambda$ and $s_\lambda$.

- The **monomial symmetric polynomial** $m_\lambda$ is the sum of all monomials of the form $x_{i_1}^{\lambda_1} x_{i_2}^{\lambda_2} \cdots x_{i_k}^{\lambda_k}$, with $i_1, \ldots, i_k$ distinct.
- The **elementary symmetric polynomial** $e_\lambda = e_{\lambda_1} e_{\lambda_2} \cdots e_{\lambda_k}$, where, as defined earlier, $e_d = \sum_{i_1 < \cdots < i_d} x_{i_1} \cdots x_{i_d}$, the sum of all squarefree monomials of degree $d$.
- The **homogeneous symmetric polynomial** $h_\lambda = h_{\lambda_1} h_{\lambda_2} \cdots h_{\lambda_k}$, where $h_d$ is the sum of *all* monomials of total degree $d$.
- The **power symmetric polynomial** $p_\lambda = p_{\lambda_1} p_{\lambda_2} \cdots p_{\lambda_k}$, where, as defined earlier, $p_d = \sum_i x_i^d$, the sum of all the $d$-th powers.
- The **Schur polynomial** $s_\lambda = \sum_T x^T$. Here $T$ ranges over semistandard Young tableaux of shape $\lambda$, and $x^T$ is the monomial with exponents given by the weight of $T$. For example, if $T$ has three 1s and two 5s, then $x^T = x_1^3 x_5^2$.

Examples: if $\lambda = (2, 1)$, then (omitting the Schur polynomial):

$$m_{(2,1)} = \sum_{i \neq j} x_i^2 x_j, \qquad e_{(2,1)} = e_2 e_1 = \Big(\sum_{i < j} x_i x_j\Big)\Big(\sum_i x_i\Big),$$

$$h_{(2,1)} = h_2 h_1 = \Big(\sum_{i \leq j} x_i x_j\Big)\Big(\sum_i x_i\Big), \qquad p_{(2,1)} = p_2 p_1 = \Big(\sum_i x_i^2\Big)\Big(\sum_i x_i\Big).$$
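These examples can be checked by brute force. The sketch below (helper names `mono`, `mul`, `lin` are mine, not standard) builds the bases in three variables — enough for degree-3 identities, since monomial expansions are stable once the number of variables is at least the degree — and verifies $e_{(2,1)} = m_{(2,1)} + 3m_{(1,1,1)}$, $h_{(2,1)} = m_{(3)} + 2m_{(2,1)} + 3m_{(1,1,1)}$, and $p_{(2,1)} = m_{(3)} + m_{(2,1)}$:

```python
from itertools import combinations, combinations_with_replacement, permutations
from collections import Counter

N = 3  # three variables suffice for identities in degree 3

def mono(lam):
    """Monomial symmetric polynomial m_lam: one term for each distinct
    rearrangement of the exponent vector (padded with zeros)."""
    exps = tuple(lam) + (0,) * (N - len(lam))
    return Counter({e: 1 for e in set(permutations(exps))})

def e(k):
    """Elementary e_k: sum of all squarefree monomials of degree k."""
    return Counter({tuple(int(i in s) for i in range(N)): 1
                    for s in combinations(range(N), k)})

def h(k):
    """Homogeneous h_k: sum of ALL monomials of degree k."""
    out = Counter()
    for c in combinations_with_replacement(range(N), k):
        exp = [0] * N
        for i in c:
            exp[i] += 1
        out[tuple(exp)] += 1
    return out

def p(k):
    """Power sum p_k = x_1^k + ... + x_N^k."""
    return Counter({tuple(k * (j == i) for j in range(N)): 1 for i in range(N)})

def mul(f, g):
    """Multiply polynomials stored as {exponent tuple: coefficient}."""
    out = Counter()
    for a, ca in f.items():
        for b, cb in g.items():
            out[tuple(s + t for s, t in zip(a, b))] += ca * cb
    return out

def lin(*terms):
    """Linear combination of (coefficient, polynomial) pairs."""
    out = Counter()
    for c, f in terms:
        for exp, v in f.items():
            out[exp] += c * v
    return out

# Expansions of e_(2,1), h_(2,1), p_(2,1) in the monomial basis:
assert mul(e(2), e(1)) == lin((1, mono((2, 1))), (3, mono((1, 1, 1))))
assert mul(h(2), h(1)) == lin((1, mono((3,))), (2, mono((2, 1))), (3, mono((1, 1, 1))))
assert mul(p(2), p(1)) == lin((1, mono((3,))), (1, mono((2, 1))))
```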

A few remarks are in order:

- The monomial symmetric polynomials are obviously an additive basis for $\Lambda$. That is, it’s obvious both that they’re linearly independent, and that they span. (If a symmetric polynomial $f$ includes some monomial, then by symmetry it must include the whole corresponding $m_\lambda$.) In particular, observe that the $n$-th graded component of $\Lambda$ has vector space dimension $p(n)$, the number of partitions of $n$.
- None of the other claimed bases are obviously independent or spanning. Note, however, that all of them are indexed by the same set – the partitions $\lambda$ – and that, for each one, $|\lambda| = \lambda_1 + \cdots + \lambda_k$ is the degree. So, since $\Lambda$ is finite-dimensional in each degree, it’s actually enough to show just linear independence, or just that they span. We’ll mostly show both, but only because it takes no extra work to do so.
- It’s easy to multiply the $e$, $h$ and $p$ bases: by definition, multiplying two basis elements just results in stringing together the corresponding partitions. It’s not as easy to multiply in the $m$ and $s$ bases – there are some structure constants to figure out. The structure constants for multiplying the Schur polynomials, $s_\lambda s_\mu = \sum_\nu c_{\lambda\mu}^\nu s_\nu$, are called the Littlewood-Richardson numbers, and are pretty deep. We’ll return to them later. The (nameless) constants for multiplying the monomial symmetric polynomials are much easier.

- For the Schur polynomials, it’s not obvious that they are even symmetric polynomials to begin with! Luckily, I indirectly ‘proved’ this in my last post, as a consequence of the RSK correspondence (though I didn’t actually prove RSK). Namely, RSK implies that the Kostka number $K_{\lambda\mu}$, the number of tableaux of shape $\lambda$ and weight $\mu$, doesn’t depend on the ordering of $\mu$. Consequently, we have the expansion

  $$s_\lambda = \sum_\mu K_{\lambda\mu} m_\mu,$$

  where $\mu$ ranges over partitions. Later, I hope to give a more direct proof of the symmetry of the Schur polynomials, via the very cool Bender-Knuth involution.
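For a small shape, both the symmetry of $s_\lambda$ and its Kostka expansion can be verified by brute force. Here’s a sketch (function names are mine) that enumerates the semistandard Young tableaux of shape $(2,1)$ with entries at most 3, checks that the resulting polynomial is symmetric, and matches the expansion $s_{(2,1)} = m_{(2,1)} + 2m_{(1,1,1)}$:

```python
from itertools import product, permutations
from collections import Counter

SHAPE = (2, 1)  # the partition lambda
N = 3           # tableau entries (and variables) 1..N

def ssyt(shape, n):
    """Brute-force all semistandard Young tableaux of `shape` with entries
    in 1..n: rows weakly increase, columns strictly increase."""
    cells = [(r, c) for r, length in enumerate(shape) for c in range(length)]
    for values in product(range(1, n + 1), repeat=len(cells)):
        T = dict(zip(cells, values))
        rows_ok = all(T[r, c] <= T[r, c + 1] for r, c in cells if (r, c + 1) in T)
        cols_ok = all(T[r, c] < T[r + 1, c] for r, c in cells if (r + 1, c) in T)
        if rows_ok and cols_ok:
            yield T

def schur(shape, n):
    """s_shape in n variables, as {exponent tuple: coefficient}: each tableau
    T contributes the monomial x^T recording how often each entry appears."""
    poly = Counter()
    for T in ssyt(shape, n):
        weight = Counter(T.values())
        poly[tuple(weight[i] for i in range(1, n + 1))] += 1
    return poly

s21 = schur(SHAPE, N)

# s_(2,1) really is symmetric: permuting the variables fixes it.
for sigma in permutations(range(N)):
    permuted = Counter({tuple(exp[sigma[i]] for i in range(N)): c
                        for exp, c in s21.items()})
    assert permuted == s21

# Kostka expansion s_(2,1) = m_(2,1) + 2 m_(1,1,1), in three variables:
assert s21[(2, 1, 0)] == 1 and s21[(1, 1, 1)] == 2
```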

I’ll stop here for now. Next post, I’ll prove that the various bases work properly.
