
# Monthly Archives: December 2013

## Schur functors

I’m going to describe the basic ideas of the Schur functors, $\mathbb{S}^\lambda(V)$, where $\lambda$ is a partition and $V$ is a vector space. These will turn out to be the complete set of irreducible polynomial representations of $GL_n$ (for all $n$). The main facts to strive for are:

• Every irreducible polynomial representation of $GL_n$ arises from a unique Schur functor. Conversely, every Schur functor is an irreducible representation.
• The character of $\mathbb{S}^\lambda(V)$ is the Schur polynomial $s_\lambda$.
• The dimension of $\mathbb{S}^\lambda(V)$ is the number of SSYTs (semistandard Young tableaux) of shape $\lambda$ with entries from $1, \ldots, n$ (where $n = \dim(V)$). This fact will be explicit: there will be a “tableau basis” for the representation.
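Since the third fact is combinatorially explicit, it is easy to test by brute force. Here is a short sketch (in Python; the helper `ssyt_count` is my own, not from the post) that counts SSYTs of a given shape with entries in $1, \ldots, n$. For $\lambda = (2,1)$ and $n = 3$ it returns $8$, which will be the dimension of $\mathbb{S}^{(2,1)}(\mathbb{C}^3)$.

```python
from itertools import product

def ssyt_count(shape, n):
    """Count SSYTs of the given shape with entries in 1..n:
    rows weakly increase left-to-right, columns strictly increase downward."""
    cells = [(r, c) for r, row_len in enumerate(shape) for c in range(row_len)]
    count = 0
    for values in product(range(1, n + 1), repeat=len(cells)):
        t = dict(zip(cells, values))
        rows_ok = all(t[(r, c)] <= t[(r, c + 1)]
                      for (r, c) in cells if (r, c + 1) in t)
        cols_ok = all(t[(r, c)] < t[(r + 1, c)]
                      for (r, c) in cells if (r + 1, c) in t)
        count += rows_ok and cols_ok
    return count

print(ssyt_count((2, 1), 3))  # 8
print(ssyt_count((1, 1), 3))  # 3, the dimension of the exterior square of C^3
```

Note that single-row and single-column shapes recover the familiar dimensions of symmetric and exterior powers.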

As a corollary, we get an improved understanding of the Littlewood-Richardson coefficients and the isomorphism $\mathrm{Rep}(\mathrm{GL}) \to \Lambda$ between the representation ring and the ring of symmetric functions.

## GL-representations, symmetric polynomials, and geometry

One of the many applications of symmetric polynomials is to representation theory, and in this post I want to begin sketching out how.

Symmetric polynomials and the ring $\Lambda$ are involved in the representation theory of the symmetric group $S_n$, and the general linear group $GL_n$, in related ways. The precise relationship between the representation theory of these two groups is spelled out in the Schur-Weyl Duality theorem, as well as in explicit constructions of representations of both groups.

I’m mainly interested in the Schur functors, which are representations of $GL_n$, so I’ll be focusing on those.

## Everything you wanted to know about symmetric polynomials, part V

This will be the last post on symmetric polynomials, at least for now. (They’ll continue to come up when I get to representation theory and my true love, algebraic geometry, but only as part of other theories.)

I want to discuss the Hall inner product on the symmetric function ring $\Lambda$ and its interaction with the $\omega$-involution. As a side benefit, we’ll get the “dual” Jacobi-Trudi and Pieri rules, with $e$ and $h$ swapped.

## Everything you wanted to know about symmetric polynomials, part IV

### Alternating and Symmetric Polynomials

Consider the following recipe for building symmetric polynomials, using alternating polynomials. Consider the Vandermonde determinant

$\displaystyle{ \Delta(x_1, \ldots, x_n) = \left|\begin{matrix} 1 & x_1 & x_1^2 & \cdots & x_1^{n-1} \\ 1 & x_2 & x_2^2 & \cdots & x_2^{n-1} \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & x_n & x_n^2 & \cdots & x_n^{n-1} \end{matrix}\right| = \prod_{i < j} (x_j - x_i). }$

To see the last equality, note that the determinant vanishes whenever $x_i = x_j$ for $i \ne j$, so it is divisible by each factor $(x_j - x_i)$; since these factors are pairwise coprime, the determinant is divisible by their product. By degree-counting (the determinant is evidently homogeneous of degree $\binom{n}{2}$, the same as the product), the two agree up to a scalar. Finally, to get the scalar, we do some computation (e.g. plugging in convenient values like $x_i = i$), which shows it is $1$.
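As a quick sanity check (a sketch of my own, not part of the argument), the product formula can be verified at integer points, where all the arithmetic is exact:

```python
from itertools import combinations

def det(m):
    """Determinant by cofactor expansion along the first row (exact on ints)."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def vandermonde_det(xs):
    """Determinant of the matrix with rows (1, x_i, x_i^2, ..., x_i^{n-1})."""
    n = len(xs)
    return det([[x ** k for k in range(n)] for x in xs])

xs = [1, 2, 3, 4]
prod = 1
for i, j in combinations(range(len(xs)), 2):
    prod *= xs[j] - xs[i]
print(vandermonde_det(xs), prod)  # 12 12
```

Plugging in a repeated value, e.g. `xs = [2, 2, 3]`, gives determinant $0$, consistent with the divisibility argument above.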

If, instead of the row $(1 \ x_i\ x_i^2\ \cdots\ x_i^{n-1})$, we used some other sequence of polynomials, say $(f_1(x_i)\ f_2(x_i)\ \cdots \ f_n(x_i))$, the result would still be alternating in the $x_i$‘s, so it would still be divisible by the product above. However, the degree-counting argument no longer applies. (For example, if we use $(x_i\ x_i^2\ \cdots\ x_i^n)$, then the result is the original Vandermonde determinant times $x_1 \cdots x_n$.)

Still, we can divide out the Vandermonde determinant, and (surprise!) the result will be a symmetric polynomial: swapping two variables changes the numerator and the denominator by the same sign, so the quotient is unchanged.
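To see this in action, here is a sketch using sympy (the variable names and the choice of exponents $(0, 2, 4)$ in the numerator are my own): dividing the alternating determinant with rows $(1\ x_i^2\ x_i^4)$ by the Vandermonde determinant yields an honest polynomial, and swapping two variables leaves it unchanged.

```python
from sympy import symbols, Matrix, cancel

x1, x2, x3 = symbols("x1 x2 x3")
xs = [x1, x2, x3]

# Alternating determinant with rows (1, x_i^2, x_i^4) instead of (1, x_i, x_i^2).
numerator = Matrix([[1, x ** 2, x ** 4] for x in xs]).det()
# The Vandermonde determinant itself.
denominator = Matrix([[1, x, x ** 2] for x in xs]).det()

# The quotient simplifies to a polynomial.
quotient = cancel(numerator / denominator)
print(quotient)

# Swapping x1 and x2 flips the sign of both determinants,
# so the quotient is unchanged: it is symmetric.
swapped = quotient.subs({x1: x2, x2: x1}, simultaneous=True)
print(cancel(quotient - swapped))  # 0
```

This particular quotient is in fact the Schur polynomial $s_{(2,1)}(x_1, x_2, x_3)$; evaluating it at $x_1 = x_2 = x_3 = 1$ gives $8$, matching the SSYT count from the Schur functor post above.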