Let us fix a filling $T$ of shape $\lambda \vdash n$ and consider the surjective homomorphism of $\mathbb{C}[S_n]$-modules

$$\pi : \mathbb{C}[S_n]\, b_T \longrightarrow \mathbb{C}[S_n]\, b_T a_T$$

given by right-multiplying by $a_T$, where as before $a_T = \sum_{\sigma \in R(T)} \sigma$ and $b_T = \sum_{\sigma \in C(T)} \mathrm{sgn}(\sigma)\,\sigma$ for the row group $R(T)$ and column group $C(T)$ of $T$. Specifically, we will describe its kernel, which will have interesting consequences when we examine representations later.

## Row and Column Tabloids

By the lemma here, $\mathbb{C}[S_n]\, a_T$ and $\mathbb{C}[S_n]\, b_T$ can be described as follows.

For the former, take a basis comprising “**row tabloids**”: for each filling $T'$, take each row as a set, thus giving a partitioning of $\{1, \ldots, n\}$. Two fillings with the same rows-as-sets give the same row tabloid. E.g. for shape $(2,2)$, the fillings with rows $(1\;3), (2\;4)$ and $(3\;1), (2\;4)$ give rise to the same row tabloid $\{\{1,3\}, \{2,4\}\}$, since they give the same partitioning.

For $\mathbb{C}[S_n]\, b_T$, take a basis of “**column tabloids**”, by taking a filling $T'$ and taking its columns as sets, with the condition that swapping two elements in the same column flips the sign. E.g. for shape $(2,2)$, the filling with rows $(1\;3), (2\;4)$ has columns $\{1,2\}$ and $\{3,4\}$; exchanging $1$ and $2$ takes the resulting column tabloid to its negative.

**Note**

- Given a filling $T'$, the corresponding row tabloid is denoted by $\{T'\}$ and the column tabloid by $[T']$.

- If $w \in S_n$, we have $w\{T'\} = \{wT'\}$ and $w[T'] = [wT']$. E.g. $(1,2)$ takes the above column tabloid to its negative while $(1,2)(3,4)$ leaves it invariant.

- Let us describe explicitly the correspondence between a column tabloid and an element of $\mathbb{C}[S_n]\, b_T$. For any column tabloid $U = [T']$, where $T'$ is a filling, write $T' = wT$ for a unique $w \in S_n$. The corresponding element is $w\, b_T$. The reader may worry that we left out the sign twist, but you may check that this is consistent when we swap two elements in the same column.

- For a row tabloid, we take $\{wT\}$ to the element $w\, a_T$.
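As a sanity check, the two notions of tabloid can be modelled in a few lines of Python. The sketch below (helper names are ours, with the shape $(2,2)$ hard-coded) forgets the order within each row for a row tabloid, and sorts each column while tracking a sign for a column tabloid:

```python
# Toy model of tabloids for the single shape (2,2); a filling is a tuple of
# rows, e.g. ((1, 3), (2, 4)) has columns {1,2} and {3,4}.

def row_tabloid(filling):
    """Row tabloid {T}: each row matters only as a set."""
    return tuple(frozenset(row) for row in filling)

def inversions(seq):
    """Number of inversions; the sorting permutation has sign (-1)**inversions."""
    return sum(1 for i in range(len(seq))
                 for k in range(i + 1, len(seq)) if seq[i] > seq[k])

def column_tabloid(filling):
    """Column tabloid [T]: sort each column, flipping the sign once per swap.
    Returns (canonical filling, sign)."""
    cols = [tuple(row[j] for row in filling) for j in range(len(filling[0]))]
    sign = 1
    for c in cols:
        sign *= (-1) ** inversions(c)
    sorted_cols = [tuple(sorted(c)) for c in cols]
    rows = tuple(tuple(c[i] for c in sorted_cols) for i in range(len(filling)))
    return rows, sign

def act(w, filling):
    """Apply a permutation w (as a dict) to every entry: T -> wT."""
    return tuple(tuple(w.get(x, x) for x in row) for row in filling)

T = ((1, 3), (2, 4))
# Reordering within a row does not change the row tabloid:
assert row_tabloid(T) == row_tabloid(((3, 1), (2, 4)))
# (1,2) swaps two entries of the same column, so it negates [T] ...
assert column_tabloid(act({1: 2, 2: 1}, T)) == (T, -1)
# ... while (1,2)(3,4) swaps within both columns and leaves [T] invariant:
assert column_tabloid(act({1: 2, 2: 1, 3: 4, 4: 3}, T)) == (T, 1)
print("ok")
```

The sign bookkeeping in `column_tabloid` is exactly the "swapping two elements in the same column flips the sign" rule, applied once per transposition needed to sort each column.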

## Column Swaps

Let us define some operations on column tabloids. Consider a column tabloid *U* := [*T'*], where *T'* is any filling.

Suppose, in *U*, we have a (possibly empty) set *B* of squares in column *j’* and a set *A* of squares in column *j* such that |*A*| = |*B*|, with *j’* > *j*. Let $S_{A,B}(U)$ denote the resulting column tabloid obtained by swapping the entries of squares *A* and *B* in order (pairing the squares of *A* and *B* in increasing order of row).

Now given a set *B* of squares in column *j’* of *U*, define

$$s_B(U) := \sum_A S_{A,B}(U),$$

where *A* runs through all sets of squares in column *j* of *U* such that |*A*| = |*B*|. Next, given a set *Y* of squares in column *j’* of *U*, let:

$$\tilde{s}_Y(U) := \sum_{B \subseteq Y} (-1)^{|B|}\, s_B(U).$$

Clearly $\tilde{s}_Y(U)$ is an integer linear combination of the $s_B(U)$; the following shows that the converse is true.

Lemma 1. For each set $Y'$ of squares in column $j'$, we have:

$$\sum_{Y \subseteq Y'} (-1)^{|Y|}\, \tilde{s}_Y(U) = s_{Y'}(U).$$

**Proof**

Fix *A*, *B* satisfying $B \subseteq Y'$ and |*A*| = |*B*|; we keep track of the coefficient of $S_{A,B}(U)$ in the expansion of the LHS. The term is included in $(-1)^{|Y|}\,\tilde{s}_Y(U)$ for all *Y* satisfying $B \subseteq Y \subseteq Y'$. Its coefficient is thus:

$$\sum_{B \subseteq Y \subseteq Y'} (-1)^{|Y|}(-1)^{|B|} = \sum_{B \subseteq Y \subseteq Y'} (-1)^{|Y \setminus B|}.$$

If *B* = *Y’*, this is 1; otherwise, fixing some $y \in Y' \setminus B$, there is a 1-1 correspondence between *Y* containing *y* and *Y* not containing *y*; these cancel each other out so the sum is 0. Hence the overall sum is:

$$\sum_{A : |A| = |Y'|} S_{A,Y'}(U) = s_{Y'}(U),$$

as desired. ♦
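Lemma 1 can also be verified mechanically on a small example. The sketch below (our own helper names, shape $(2,2)$ only, each column written top to bottom) represents a linear combination of column tabloids as a dictionary from a sign-canonicalized filling to its coefficient, and checks the identity for every set $Y'$ in the second column:

```python
from itertools import combinations

def inversions(seq):
    return sum(1 for i in range(len(seq))
                 for k in range(i + 1, len(seq)) if seq[i] > seq[k])

def canon(cols):
    """Sign-canonical form of a column tabloid given as (column0, column1)."""
    sign = 1
    for c in cols:
        sign *= (-1) ** inversions(c)
    return tuple(tuple(sorted(c)) for c in cols), sign

def add(lc, cols, coeff):
    """Add coeff * [cols] into the linear combination lc."""
    key, sign = canon(cols)
    new = lc.get(key, 0) + sign * coeff
    if new:
        lc[key] = new
    else:
        lc.pop(key, None)

def S(cols, A, B):
    """S_{A,B}: swap the entries in squares A (rows of column 0) and B
    (rows of column 1) in increasing order of row."""
    c0, c1 = list(cols[0]), list(cols[1])
    for a, b in zip(sorted(A), sorted(B)):
        c0[a], c1[b] = c1[b], c0[a]
    return tuple(c0), tuple(c1)

def s(cols, B, coeff, lc):
    """Accumulate coeff * s_B(U): sum over all A in column 0 with |A| = |B|."""
    for A in combinations(range(len(cols[0])), len(B)):
        add(lc, S(cols, A, B), coeff)

def s_tilde(cols, Y, coeff, lc):
    """Accumulate coeff * s~_Y(U) = sum over B ⊆ Y of (-1)^|B| s_B(U)."""
    for k in range(len(Y) + 1):
        for B in combinations(Y, k):
            s(cols, B, (-1) ** k * coeff, lc)

U = ((1, 2), (3, 4))               # columns {1,2} and {3,4}
for Yp in [(0,), (1,), (0, 1)]:    # each set Y' of squares in column 2
    lhs, rhs = {}, {}
    for k in range(len(Yp) + 1):   # LHS: sum over Y ⊆ Y' of (-1)^|Y| s~_Y(U)
        for Y in combinations(Yp, k):
            s_tilde(U, Y, (-1) ** k, lhs)
    s(U, Yp, 1, rhs)               # RHS: s_{Y'}(U)
    assert lhs == rhs
print("Lemma 1 verified")
```

Squares are referred to by row index within their column; the dictionary arithmetic in `add` performs exactly the cancellations counted in the proof.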

Lemma 2. We have $\tilde{s}_Y(U) \in \ker(\pi)$ if *Y* is a non-empty set of squares in the *j’*-th column of *U*.

**Proof**

Let $K \le H \le S_n$ be subgroups defined as follows:

- *H* is the set of $\sigma \in S_n$ acting as the identity outside *Y* or the *j*-th column of *U* (i.e. outside the entries occupying those squares);
- *K* is the set of $\sigma \in H$ taking *Y* and the *j*-th column of *U* back to themselves.

For each $\sigma \in K$ we have $\sigma \cdot U = \mathrm{sgn}(\sigma)\, U$, so each coset $\tau K$ contributes $\sum_{\sigma \in \tau K} \mathrm{sgn}(\sigma)\, \sigma \cdot U = |K|\, \mathrm{sgn}(\tau)\, \tau \cdot U$.

Let $\tau$ run through all swaps of a subset *A* of the *j*-th column with a subset $B \subseteq Y$, preserving the order; this gives a complete set of coset representatives for $H/K$. On the other hand, the sum of $\mathrm{sgn}(\tau)\,\tau \cdot U$ over these $\tau$ gives us $\tilde{s}_Y(U)$, since $\mathrm{sgn}(\tau) = (-1)^{|B|}$ and $\tau \cdot U = S_{A,B}(U)$; so:

$$\sum_{\sigma \in H} \mathrm{sgn}(\sigma)\, \sigma \cdot U = |K| \cdot \tilde{s}_Y(U).$$

It remains to show that the LHS is in ker(π).

Fix a filling *T’* such that *U* = [*T’*]. Unwinding the definition of the map π: writing $T' = wT$, the tabloid *U* corresponds to $w\, b_T$, which π takes to $w\, b_T a_T = \sum_{q \in C(T)} \mathrm{sgn}(q)\, wq\, a_T$, and each term $wq\, a_T$ corresponds to the row tabloid $\{wqT\}$. Now $C(T') = w\, C(T)\, w^{-1}$, so

$$\pi(U) = \sum_{q \in C(T')} \mathrm{sgn}(q)\, \{qT'\}.$$

Since π is $\mathbb{C}[S_n]$-linear, we now need to show:

$$\sum_{\sigma \in H} \sum_{q \in C(T')} \mathrm{sgn}(\sigma)\,\mathrm{sgn}(q)\, \{\sigma q T'\} = 0. \qquad (*)$$

Since $Y \ne \emptyset$, pick a square in it and the square to its left in the *j*-th column (which exists since *j* < *j’*). Fix the element $q \in C(T')$ and let $\tau$ be the transposition that swaps the entries in those two squares of $qT'$, so $\{\tau q T'\} = \{qT'\}$; since $\tau qT'$ and $qT'$ have the same rows, also $\{\sigma\tau q T'\} = \{\sigma q T'\}$ for any $\sigma$. Taking only $\sigma \in \{\rho, \rho\tau\}$ in the outer sum of (*) gives:

$$\mathrm{sgn}(\rho)\,\mathrm{sgn}(q)\left( \{\rho q T'\} - \{\rho \tau q T'\} \right) = 0.$$

Summing over all cosets of $\langle \tau \rangle$ in $H$ and over all $q \in C(T')$, we see that (*) sums to zero. ♦
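Lemma 2 can likewise be checked by brute force. The toy model below (shape $(2,2)$, helper names ours) applies π to a combination of column tabloids by summing $\mathrm{sgn}(q)\{qT'\}$ over the column group, and confirms that $\tilde{s}_Y(U)$ maps to zero for every non-empty $Y$ in the second column:

```python
from itertools import combinations, permutations

def inversions(seq):
    return sum(1 for i in range(len(seq))
                 for k in range(i + 1, len(seq)) if seq[i] > seq[k])

def canon(cols):
    """Sign-canonical form of a column tabloid given as (column0, column1)."""
    sign = 1
    for c in cols:
        sign *= (-1) ** inversions(c)
    return tuple(tuple(sorted(c)) for c in cols), sign

def add(lc, cols, coeff):
    key, sign = canon(cols)
    new = lc.get(key, 0) + sign * coeff
    if new:
        lc[key] = new
    else:
        lc.pop(key, None)

def S(cols, A, B):
    """Swap entries in squares A (rows of column 0) and B (rows of column 1)."""
    c0, c1 = list(cols[0]), list(cols[1])
    for a, b in zip(sorted(A), sorted(B)):
        c0[a], c1[b] = c1[b], c0[a]
    return tuple(c0), tuple(c1)

def s(cols, B, coeff, lc):
    for A in combinations(range(len(cols[0])), len(B)):
        add(lc, S(cols, A, B), coeff)

def s_tilde(cols, Y, coeff, lc):
    for k in range(len(Y) + 1):
        for B in combinations(Y, k):
            s(cols, B, (-1) ** k * coeff, lc)

def pi_of(lc):
    """pi of a combination of column tabloids: sum sgn(q){qT'} over the
    column group, expressed as a combination of row tabloids."""
    out = {}
    for cols, coeff in lc.items():
        for p0 in permutations(range(len(cols[0]))):
            for p1 in permutations(range(len(cols[1]))):
                sgn = (-1) ** (inversions(p0) + inversions(p1))
                q0 = [cols[0][i] for i in p0]
                q1 = [cols[1][i] for i in p1]
                rows = tuple(frozenset({q0[r], q1[r]}) for r in range(len(q0)))
                out[rows] = out.get(rows, 0) + sgn * coeff
    return {k: v for k, v in out.items() if v}

U = ((1, 2), (3, 4))
for Y in [(0,), (1,), (0, 1)]:     # every non-empty Y in the second column
    lc = {}
    s_tilde(U, Y, 1, lc)
    assert pi_of(lc) == {}         # s~_Y(U) lies in ker(pi)
print("Lemma 2 verified")
```

Here a row tabloid is a tuple of frozensets, so the signed sums of $\{qT'\}$ cancel exactly as in the proof above.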

From Lemmas 1 and 2, we have:

Corollary. For any non-empty set *B* of squares in column *j’* of *U* (with *j* < *j’*), we have:

$$U \equiv s_B(U) = \sum_A S_{A,B}(U) \pmod{\ker \pi},$$

since by Lemma 1, $s_B(U) - U = \sum_{\emptyset \ne Y \subseteq B} (-1)^{|Y|}\, \tilde{s}_Y(U)$, which lies in $\ker(\pi)$ by Lemma 2.

E.g. modulo ker(π), taking *B* to be the single square containing 3, we have:

$$\begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix} \equiv \begin{bmatrix} 3 & 1 \\ 2 & 4 \end{bmatrix} + \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix},$$

where each bracket denotes the column tabloid of the displayed filling.
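We can confirm a relation of this kind directly by applying π to both sides and comparing the resulting combinations of row tabloids. The brute-force sketch below (names ours) checks that $U - S_{A_1,B}(U) - S_{A_2,B}(U) \in \ker(\pi)$ for the fillings of shape $(2,2)$ with columns $\{1,2\},\{3,4\}$ and $B$ the square containing 3:

```python
from itertools import permutations

def inversions(seq):
    return sum(1 for i in range(len(seq))
                 for k in range(i + 1, len(seq)) if seq[i] > seq[k])

def pi_of(cols):
    """pi([T']) = sum of sgn(q){qT'} over q in the column group C(T'),
    for a filling given as (column0, column1)."""
    out = {}
    for p0 in permutations(range(len(cols[0]))):
        for p1 in permutations(range(len(cols[1]))):
            sgn = (-1) ** (inversions(p0) + inversions(p1))
            q0 = [cols[0][i] for i in p0]
            q1 = [cols[1][i] for i in p1]
            rows = tuple(frozenset({q0[r], q1[r]}) for r in range(len(q0)))
            out[rows] = out.get(rows, 0) + sgn
    return {k: v for k, v in out.items() if v}

def add_dicts(d1, d2):
    out = dict(d1)
    for k, v in d2.items():
        out[k] = out.get(k, 0) + v
    return {k: v for k, v in out.items() if v}

# Fillings written as (column0, column1):
U  = ((1, 2), (3, 4))   # rows 1 3 / 2 4
V1 = ((3, 2), (1, 4))   # rows 3 1 / 2 4  (3 swapped with 1)
V2 = ((1, 3), (2, 4))   # rows 1 2 / 3 4  (3 swapped with 2)

# U ≡ V1 + V2 modulo ker(pi):
assert pi_of(U) == add_dicts(pi_of(V1), pi_of(V2))
print("corollary example verified")
```

Since `pi_of` applies the formula $\pi([T']) = \sum_{q \in C(T')} \mathrm{sgn}(q)\{qT'\}$ to a raw filling, no sign canonicalization is needed: replacing a filling by a column-permuted one rescales both sides by the same sign.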