**Goals for this post:**

- coordinate systems
- line and vector bundles and divisors
- the functor of points perspective
- the homogeneous coordinate ring is a UFD

**Coordinates**

I’ll mostly think of the Grassmannian from the linear algebra perspective, that is, as the set

$$\mathrm{Gr}(k,n) = \{\, \Lambda \subseteq V \text{ a subspace} : \dim \Lambda = k \,\},$$

where $V$ is a fixed $n$-dimensional vector space over a field $K$.

So, given a subspace $\Lambda \in \mathrm{Gr}(k,n)$, the basic approach is to represent $\Lambda$ by a $k \times n$ matrix of row vectors that span $\Lambda$.

For every maximal minor of our matrix, corresponding to a choice of columns $I = \{i_1 < \cdots < i_k\}$, we have the $I$-th Plücker coordinate $p_I$. These numbers are not well-defined, since we can multiply on the left by any invertible $k \times k$ matrix $g$ (which doesn’t change the row span), and this will rescale all the $p_I$ simultaneously by $\det g$. On the other hand, this simultaneous rescaling means that the $p_I$ **do** make sense as homogeneous coordinates, that is, the map

$$\mathrm{Gr}(k,n) \to \mathbb{P}^{\binom{n}{k} - 1},$$

listing out the Plücker coordinates, is well-defined. (Note that some minor must be nonzero, since the matrix is full-rank.) In coordinate-free language, this is the map $\mathrm{Gr}(k,n) \to \mathbb{P}(\wedge^k V)$ that sends a subspace $\Lambda = \operatorname{span}(v_1, \ldots, v_k)$ to $[v_1 \wedge \cdots \wedge v_k]$.
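Since nothing so far depends on the field, here is a quick computational sketch over the rationals (my own illustration, not from the post; the helper names are made up), showing the Plücker coordinates as maximal minors and the simultaneous rescaling by $\det g$:

```python
from itertools import combinations

def plucker(M):
    """All maximal minors of a 2 x n matrix M, indexed by column pairs."""
    return {(i, j): M[0][i] * M[1][j] - M[0][j] * M[1][i]
            for i, j in combinations(range(len(M[0])), 2)}

# A 2-plane in K^4, spanned by the rows of M
M = [[1, 0, 2, 3],
     [0, 1, 4, 5]]

# Change the spanning rows by an invertible 2x2 matrix g acting on the left
g = [[1, 1],
     [0, 2]]
gM = [[sum(g[r][s] * M[s][c] for s in range(2)) for c in range(4)]
      for r in range(2)]

p, q = plucker(M), plucker(gM)
det_g = g[0][0] * g[1][1] - g[0][1] * g[1][0]  # = 2

# Every Plücker coordinate rescales by the same factor det(g),
# so the point of projective space is unchanged.
assert all(q[I] == det_g * p[I] for I in p)
print(p[(2, 3)])  # -2
```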

This is not the only useful coordinate system on $\mathrm{Gr}(k,n)$: there are also standard affine charts. In particular, consider the open set $U_I$ of matrices whose $I$-th Plücker coordinate is nonzero (fixing the choice of $I = \{i_1 < \cdots < i_k\}$). The columns $i_1, \ldots, i_k$ therefore form an **invertible** matrix in $\mathrm{GL}_k$. Let us multiply on the left by the inverse of this matrix, to get something of the form

$$\begin{pmatrix} 1 & 0 & * & * \\ 0 & 1 & * & * \end{pmatrix},$$

where the columns $i_1, \ldots, i_k$ are now an identity matrix. (The above corresponds to $I = \{1, 2\}$ in $\mathrm{Gr}(2,4)$.)

The key observation is this: our new choice of representative is the **unique** representative having an identity matrix in the selected columns (since acting nontrivially by $\mathrm{GL}_k$ on the left would change this identity matrix). In particular, this means that our set is an affine space: $U_I \cong \mathbb{A}^{k(n-k)}$, with coordinates the entries of the remaining $n-k$ columns. This gives a convenient affine chart for each choice of $I$.
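As a concrete sketch of the chart map (my own code, in exact rational arithmetic; the Gauss–Jordan helper is a hypothetical convenience, not from the post), here is the unique representative with an identity in the chosen columns:

```python
from fractions import Fraction

def chart_representative(M, I):
    """Left-multiply the k x n matrix M by the inverse of its I-columns,
    so that columns I become the identity (assumes that minor is invertible)."""
    k = len(M)
    # Augment the k x k block of columns I with all of M, then run Gauss-Jordan
    aug = [[Fraction(M[r][c]) for c in I] + [Fraction(x) for x in M[r]]
           for r in range(k)]
    for col in range(k):
        piv = next(r for r in range(col, k) if aug[r][col] != 0)
        aug[col], aug[piv] = aug[piv], aug[col]
        aug[col] = [x / aug[col][col] for x in aug[col]]
        for r in range(k):
            if r != col and aug[r][col] != 0:
                aug[r] = [x - aug[r][col] * y for x, y in zip(aug[r], aug[col])]
    return [row[k:] for row in aug]

M = [[2, 1, 3, 4],
     [6, 5, 7, 8]]
N = chart_representative(M, I=(0, 1))
# Columns 0 and 1 are now the identity; the remaining k(n-k) = 4 entries
# are the affine coordinates of this point in the chart U_I.
print(N)
```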

**Corollary**. The Grassmannian $\mathrm{Gr}(k,n)$ is locally isomorphic to affine space $\mathbb{A}^{k(n-k)}$. In particular, its dimension is $k(n-k)$, and it is irreducible and smooth.

**Line Bundles, Vector Bundles and Divisors**

Divisors on $\mathrm{Gr}(k,n)$ are pretty easy to understand. We have the following:

**Theorem**. The Picard group of the Grassmannian is isomorphic to $\mathbb{Z}$, generated by the divisor class $\sigma_1$ of a hyperplane section in the Plücker embedding, such as the vanishing of a single Plücker coordinate.

*Proof*: We use the well-known right-exact sequence of Picard groups

$$\mathbb{Z} \to \operatorname{Pic}(X) \to \operatorname{Pic}(X \setminus D) \to 0,$$

where $X \setminus D$ is the complement of an integral Weil divisor $D$, and the first arrow maps $1 \mapsto [D]$. We take $D$ to be the vanishing locus of a Plücker coordinate (we will see next post that this is irreducible – in fact, birational to a smaller affine space) and $X \setminus D = U_I \cong \mathbb{A}^{k(n-k)}$, so $\operatorname{Pic}(X \setminus D) = 0$. Finally, note that the first arrow is actually injective, since $D$ is an effective divisor on a projective variety, hence $[D]$ can’t be a torsion element in the Picard group.

Wonderful! Now we understand how divisors and line bundles work: up to isomorphism, every line bundle is a tensor power of the Plücker line bundle, which we often refer to as $\mathcal{O}(1)$.

The Grassmannian has several important vector bundles as well. Most importantly, it has a “tautological short exact sequence” of bundles,

$$0 \to S \to \mathcal{O}^{\oplus n} \to Q \to 0,$$

where $\mathcal{O}^{\oplus n} = V \otimes \mathcal{O}$ is the trivial vector bundle whose fiber at every point is the vector space $V$ where we started; $S$ is the *tautological subbundle*, whose fiber at the point $\Lambda \in \mathrm{Gr}(k,n)$ is the subspace $\Lambda \subseteq V$ itself; and lastly $Q$ is the *tautological quotient bundle*, whose fiber at $\Lambda$ is the quotient vector space $V/\Lambda$.

Note that $Q$ is a globally-generated vector bundle of rank $n-k$, while $S$ has rank $k$ and has no global sections at all.

Later on we’ll know the total Chern classes of these bundles, but for now it’s easy enough to compute their first Chern classes. Recall that for a vector bundle $E$ of rank $r$, the first Chern class $c_1(E) = c_1(\det E)$ is the class of the locus where $r$ generic sections fail to span, or equivalently, where the wedge product of the sections (a section of $\det E = \wedge^r E$) is zero. Unwinding this definition a little, observe that the operation “take the determinant of columns $i_1, \ldots, i_k$” is precisely the kind of wedge product necessary to compute $c_1(S^\vee)$: the Plücker coordinate $p_I$ is the section $(e_{i_1}^* \wedge \cdots \wedge e_{i_k}^*)|_S$ of $\det S^\vee$. In other words, the Plücker coordinates are (all the) global sections of $\det S^\vee \cong \mathcal{O}(1)$! And from the identity $c_1(S) = -c_1(S^\vee)$ it follows that $c_1(S) = -\sigma_1$, the negative of the Plücker (hyperplane) divisor class.

And, by the tautological short exact sequence above, $c_1(S) + c_1(Q) = c_1(\mathcal{O}^{\oplus n}) = 0$, so it also follows that $c_1(Q) = \sigma_1$, that is, $\det Q \cong \mathcal{O}(1)$ as well.

There’s one last relevant vector bundle: the *tangent bundle* $T$, which has rank $k(n-k)$. The somewhat tricky but important observation here is the following: every way to deform or “nudge” a point $\Lambda \in \mathrm{Gr}(k,n)$ can be encoded as a linear map $\varphi : \Lambda \to V$. If $\Lambda$ has basis $v_1, \ldots, v_k$, we consider the new space with basis $v_1 + \varepsilon \varphi(v_1), \ldots, v_k + \varepsilon \varphi(v_k)$. (It’s possible to formalize this in terms of wedge products.) However, two different $\varphi$’s will give the same tangent vector if their difference has image lying in $\Lambda$ itself, so really the tangent space is $\operatorname{Hom}(\Lambda, V/\Lambda)$. And so, the tangent bundle itself is

$$T \cong \mathcal{H}om(S, Q) \cong S^\vee \otimes Q.$$
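To make the “nudge” concrete, here is a hypothetical numerical check (my own, not from the post): a $\varphi$ with image inside $\Lambda$ doesn’t move the point of $\mathrm{Gr}(2,4)$ at all, while a $\varphi$ whose image leaves $\Lambda$ does, matching the description $T_\Lambda = \operatorname{Hom}(\Lambda, V/\Lambda)$.

```python
from fractions import Fraction
from itertools import combinations

def plucker(r1, r2):
    return [r1[i] * r2[j] - r1[j] * r2[i] for i, j in combinations(range(4), 2)]

def proportional(p, q):
    # same point of projective space <=> all 2x2 cross-terms agree
    return all(p[i] * q[j] == p[j] * q[i]
               for i in range(len(p)) for j in range(len(p)))

v1 = [Fraction(x) for x in (1, 0, 2, 3)]
v2 = [Fraction(x) for x in (0, 1, 4, 5)]
eps = Fraction(1, 7)

# phi with image inside Lambda: phi(v1) = v2, phi(v2) = v1.
# The nudged plane is the same point of Gr(2,4).
w1 = [a + eps * b for a, b in zip(v1, v2)]
w2 = [a + eps * b for a, b in zip(v2, v1)]
assert proportional(plucker(v1, v2), plucker(w1, w2))

# phi with image NOT inside Lambda: phi(v1) = e3, phi(v2) = 0.
# Now the point genuinely moves.
e3 = [Fraction(x) for x in (0, 0, 1, 0)]
u1 = [a + eps * b for a, b in zip(v1, e3)]
assert not proportional(plucker(v1, v2), plucker(u1, v2))
print("ok")
```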

Now, we use the identity $c_1(E \otimes F) = \operatorname{rank}(F)\, c_1(E) + \operatorname{rank}(E)\, c_1(F)$ to conclude that the anticanonical divisor of the Grassmannian is

$$-K = c_1(T) = c_1(S^\vee \otimes Q) = (n-k)\, c_1(S^\vee) + k\, c_1(Q) = n\, \sigma_1.$$

**The Functor of Points Perspective**

The “functor of points perspective” of scheme theory is the idea of understanding a scheme or variety by understanding maps to it, that is, given a scheme $X$, we wish to understand the functor $h_X = \operatorname{Hom}(-, X)$ from schemes to sets. (A functor from schemes to sets that arises this way is called **representable**.)

For example, a map to $\mathbb{A}^1$ is the same as a globally-defined regular function: $\operatorname{Hom}(X, \mathbb{A}^1) = \mathcal{O}_X(X)$.

More relevant to our current discussion is the functor of points of projective space: a map $X \to \mathbb{P}^n$ is the same as an $(n+1)$-tuple of global sections $s_0, \ldots, s_n$ of a line bundle $L$ that globally generate $L$. Given the global sections $s_i$, we can construct the map explicitly as $x \mapsto [s_0(x) : \cdots : s_n(x)]$, that is, by building a projective space with hyperplane sections that restrict to the $s_i$; conversely, given the map $f$, we obtain the line bundle by pulling back: $L = f^* \mathcal{O}(1)$.
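A standard small example of this (my own sketch, not from the post): the three sections $t^2, tu, u^2$ of $\mathcal{O}(2)$ on $\mathbb{P}^1$ globally generate it, and so give a map $\mathbb{P}^1 \to \mathbb{P}^2$, $[t:u] \mapsto [t^2 : tu : u^2]$, whose image lies on the conic $xz = y^2$.

```python
# Representatives [t : u] of a few points of P^1
points = [(1, 0), (0, 1), (1, 1), (2, 3), (5, -7)]

for t, u in points:
    x, y, z = t * t, t * u, u * u  # the three sections of O(2)
    # They never vanish simultaneously, so the map is defined everywhere...
    assert (x, y, z) != (0, 0, 0)
    # ...and the image satisfies the conic equation x*z = y^2.
    assert x * z == y * y
print("ok")
```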

Slightly more is going on here: in fact we have a whole short exact sequence of bundles to pull back:

$$0 \to \mathcal{O}(-1) \to \mathcal{O}^{\oplus n+1} \to Q \to 0,$$

consisting of the tautological line bundle, trivial vector bundle, and tautological quotient bundle of projective space. Note that $L$ is pulled back from the *dual* of the tautological bundle: $L = f^*(\mathcal{O}(-1)^\vee) = f^* \mathcal{O}(1)$.

Maps to the Grassmannian directly generalize this setup. A map from a scheme or variety $X$ to the Grassmannian $\mathrm{Gr}(k,n)$ is given by the choice of a rank-$k$ vector bundle $E$, together with a choice of $n$ sections that globally generate it. (Equivalently, a locally-free rank-$k$ quotient sheaf of a trivial rank-$n$ sheaf.) Explicitly, at each point $x \in X$, the sections generate the $k$-dimensional vector space $E_x$ (since $E$ has rank $k$), which is a quotient of our $n$-dimensional space of sections. The dual $E_x^\vee$ is then a $k$-dimensional subspace of the dual $n$-dimensional space, i.e. a point of $\mathrm{Gr}(k,n)$ (this is also the reason for the dualizing step in the last paragraph).

In terms of vector bundles, this means that $E$ is obtained by pulling back $S^\vee$, the dual of the tautological sub-bundle on $\mathrm{Gr}(k,n)$.
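A worked example may help (my own, not from the post): the family of tangent lines to the twisted cubic $(1, t, t^2, t^3)$ gives a map $\mathbb{A}^1 \to \mathrm{Gr}(2,4)$, sending $t$ to the row span of the $2 \times 4$ matrix below; the four columns are four sections of the trivial rank-2 bundle on $\mathbb{A}^1$ that generate it everywhere (the first minor is identically 1). As a sanity check, the resulting Plücker coordinates satisfy, for every $t$, the quadratic Plücker relation $p_{01}p_{23} - p_{02}p_{13} + p_{03}p_{12} = 0$ cutting out $\mathrm{Gr}(2,4) \subseteq \mathbb{P}^5$ (a standard fact not proved in this post).

```python
from fractions import Fraction
from itertools import combinations

def plucker(M):
    return {(i, j): M[0][i] * M[1][j] - M[0][j] * M[1][i]
            for i, j in combinations(range(4), 2)}

for t in [Fraction(n, 3) for n in range(-6, 7)]:
    M = [[1, t, t**2, t**3],      # point of the twisted cubic
         [0, 1, 2 * t, 3 * t**2]]  # its derivative: together, the tangent line
    p = plucker(M)
    assert p[(0, 1)] == 1  # the columns generate at every t
    # the single Plücker relation for Gr(2,4):
    assert (p[(0, 1)] * p[(2, 3)] - p[(0, 2)] * p[(1, 3)]
            + p[(0, 3)] * p[(1, 2)]) == 0
print("ok")
```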

Note: sometimes the functor of points of $\mathrm{Gr}(k,n)$ is instead described as *the set of rank-$k$ locally free subsheaves of a rank-$n$ trivial sheaf, whose cokernel is also locally-free.*

This is a nice ‘physical’ description: maps to the Grassmannian are given by assigning to each point $x \in X$ an actual $k$-dimensional subspace of a fixed $n$-dimensional vector space. That said, I like my above description a bit better for its explicitness and similarity to the case of $\mathbb{P}^n$. (Also, the weird-sounding ‘quotient is also locally free’ bit doesn’t come up when you phrase things the other way: if you start with a locally free quotient $E$, the kernel is automatically locally free, by Nakayama’s lemma.)

**The Homogeneous Coordinate Ring is a UFD**

I will probably not have time for this in the mini-course, but if I did, I would prove the following:

**Theorem**. The homogeneous coordinate ring of the Grassmannian in its Plücker embedding is a unique factorization domain.

*Proof sketch*: This is in three steps. First, we show that the coordinate ring is the same as the ring of invariants $K[x_{ij}]^{\mathrm{SL}_k}$ (thinking of the $x_{ij}$ as the entries of our $k \times n$ matrix). It’s clear that the Plücker coordinates themselves are in this ring; to show that there’s nothing else, we use a combinatorial proof, basically using a monomial ordering on the $x_{ij}$.
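The easy direction of step 1 can be seen by direct computation (my own sketch, not from the post): left multiplication by a determinant-1 matrix leaves every maximal minor, as a polynomial in the entries, literally unchanged.

```python
from itertools import combinations

def plucker(M):
    return [M[0][i] * M[1][j] - M[0][j] * M[1][i]
            for i, j in combinations(range(len(M[0])), 2)]

def left_mult(g, M):
    return [[sum(g[r][s] * M[s][c] for s in range(2)) for c in range(len(M[0]))]
            for r in range(2)]

M = [[3, 1, 4, 1],
     [5, 9, 2, 6]]
g = [[2, 3],
     [3, 5]]  # det = 2*5 - 3*3 = 1, so g is in SL_2

# Plücker coordinates are SL_2-invariant functions of the matrix entries
assert plucker(left_mult(g, M)) == plucker(M)
print("ok")
```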

Second, we use the representation theory of $\mathrm{SL}_k$: namely, all $\mathrm{SL}_k$-representations are restrictions of representations of $\mathrm{GL}_k$, and all the 1-dimensional representations of the latter are powers of the determinant. In particular, they are trivial on $\mathrm{SL}_k$.

Third, we prove the theorem. Fix $f$ in the ring of invariants, and factor $f = g_1 \cdots g_r$ into irreducibles in the “upstairs” ring $K[x_{ij}]$ (a polynomial ring, hence a UFD). We will show that this factorization already consists of $\mathrm{SL}_k$-invariant factors. Observe first that $\mathrm{SL}_k$ must act by permuting the factors (and possibly introducing scalars). In particular, the (closed) subgroup of $\mathrm{SL}_k$ that fixes the factors up to scalar is a finite-index subgroup. But $\mathrm{SL}_k$ is connected, so this subgroup must be all of $\mathrm{SL}_k$. So all the factors are fixed up to scalar. But then, for each factor $g_i$, the only thing that can change is a scalar, and the map $\mathrm{SL}_k \to \mathbb{G}_m$ given by $g \mapsto \lambda_i(g)$, where $g \cdot g_i = \lambda_i(g)\, g_i$, is a homomorphism. By step 2, it must be trivial, so the scalar is 1, and so the factorization descends to the invariant ring.

That completes the (very cool, in my opinion) proof, and incidentally provides a second proof that the Picard group is isomorphic to $\mathbb{Z}$, since this is true any time the homogeneous coordinate ring is a UFD.