Polynomials and Representations XII

Lindström–Gessel–Viennot Lemma

Let us switch gears and describe a beautiful combinatorial result. Suppose G is a directed acyclic graph with only finitely many paths from any one vertex to another. Given two sets of n vertices:

A =\{a_1, \ldots, a_n\}, \qquad B = \{b_1, \ldots, b_n\}

the lemma computes the “weighted sum” of non-intersecting paths from A to B.

To be specific, suppose each edge e of the graph has a weight w_e taking values in a fixed commutative ring. Given a directed path P : a\to b from a to b, let w(P) be the product of the weights of the edges along P. Now write e(a, b) for the sum of w(P) over all paths P:a\to b.
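Since the graph is acyclic with finitely many paths between any two vertices, e(a, b) can be computed by a simple recursion over vertices. Here is a minimal sketch (all names hypothetical), using the east/south lattice-path graphs that appear later in this post, with sample integer values standing in for the weights x_i:

```python
from functools import lru_cache

# Sample values for the weights x_1, x_2, x_3 (an assumption for illustration).
x = {1: 2, 2: 3, 3: 5}

def e(a, b):
    """Sum of w(P) over all east/south lattice paths P : a -> b."""
    @lru_cache(maxsize=None)
    def go(p):
        if p == b:
            return 1
        (px, py), total = p, 0
        if px < b[0]:                     # eastward step on row py, weight x_{py}
            total += x[py] * go((px + 1, py))
        if py > b[1]:                     # southward step, weight 1
            total += go((px, py - 1))
        return total
    return go(a)
```

For instance, `e((0, 2), (1, 1))` sums over the two paths taking one east step on row 2 or row 1, giving x_2 + x_1 = 5 at the sample values.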

[Figure: paths from a to b contributing to the sum e(a, b).]

Next given a_1, \ldots, a_n, b_1, \ldots, b_n above, we compute the determinant of:

M = \begin{pmatrix} e(a_1, b_1) & e(a_1, b_2) & \cdots & e(a_1, b_n) \\e(a_2, b_1) & e(a_2, b_2) & \cdots & e(a_2, b_n) \\\vdots & \vdots & \ddots & \vdots \\e(a_n, b_1) & e(a_n, b_2) & \cdots & e(a_n, b_n)\end{pmatrix}

On the other hand, consider a collection of paths P = (P_1, \ldots, P_n) where each P_i : a_i \to b_{\sigma(i)} for some permutation \sigma = \sigma(P) \in S_n. Furthermore, we require the paths to be pairwise non-intersecting, i.e. no two paths share a common vertex (not even at the endpoints). Now we can state the desired result.

Lindström–Gessel–Viennot (LGV) Lemma. We have:

\displaystyle\det(M) = \sum_{P = (P_1, \ldots, P_n)} \text{sgn}(\sigma(P)) \prod_{i=1}^n w(P_i)

over all tuples of paths (P_1, \ldots, P_n) described above.
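As a sanity check of the lemma, one can verify a tiny instance by brute force: compute \det(M) from the path sums, then compare with the signed sum over vertex-disjoint path tuples. The sketch below (hypothetical names; the grid graph and sample weights are assumptions chosen for illustration) does this for n = 2:

```python
from itertools import product

x = {1: 2, 2: 3}                 # sample values for the weights x_1, x_2

def paths(a, b):
    """All east/south lattice paths a -> b, as (vertex list, weight) pairs."""
    if a == b:
        return [([a], 1)]
    (ax, ay), out = a, []
    if ax < b[0]:                # eastward step on row ay, weight x_{ay}
        out += [([a] + p, x[ay] * w) for p, w in paths((ax + 1, ay), b)]
    if ay > b[1]:                # southward step, weight 1
        out += [([a] + p, w) for p, w in paths((ax, ay - 1), b)]
    return out

A = [(1, 2), (0, 2)]             # a_1, a_2
B = [(2, 1), (1, 1)]             # b_1, b_2

# LHS: determinant of the matrix of path sums e(a_i, b_j).
e = [[sum(w for _, w in paths(a, b)) for b in B] for a in A]
det = e[0][0] * e[1][1] - e[0][1] * e[1][0]

# RHS: signed sum over vertex-disjoint tuples, over both permutations in S_2.
rhs = 0
for sigma, sign in [((0, 1), 1), ((1, 0), -1)]:
    for (p1, w1), (p2, w2) in product(paths(A[0], paths.__defaults__ or B[sigma[0]]) if False else paths(A[0], B[sigma[0]]),
                                      paths(A[1], B[sigma[1]])):
        if not set(p1) & set(p2):
            rhs += sign * w1 * w2

assert det == rhs
```

Both sides come out equal (to 6 at the chosen sample weights), as the lemma predicts.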

[Figure: a tuple of non-intersecting paths from A to B.]


Proof of the LGV Lemma

First expand the determinant of M; each term is of the form:

\text{sgn}(\sigma) \times e(a_1, b_{\sigma(1)}) \times e(a_2, b_{\sigma(2)}) \times \ldots \times e(a_n, b_{\sigma(n)}), \qquad \sigma \in S_n.

Expanding each e(a_i, b_{\sigma(i)}) = \sum_{P_i : a_i \to b_{\sigma(i)}} w(P_i), the above term becomes

\displaystyle\sum_{P_1, \ldots, P_n}\text{sgn}(\sigma) w(P_1) w(P_2) \ldots w(P_n),

summed over all n-tuples (P_i : a_i \to b_{\sigma(i)})_{i=1}^n. This is close to the desired result, but here the P_i may intersect. Hence, we wish to show that the sum over all tuples (P_1, \ldots, P_n) for which some P_i and P_j intersect vanishes.

Let \Sigma be the collection of all such intersecting tuples P = (P_1, \ldots, P_n), and fix a total ordering on the set of vertices. For each P\in \Sigma, we:

  • let v be the smallest vertex (in the chosen ordering) occurring as an intersection of two paths P_i and P_j (i\ne j);
  • let i be the smallest index such that P_i contains v;
  • let j be the largest index such that P_j contains v (note that j\ne i);

and get a triplet (i, v, j). Now let P' be the same tuple of paths but with the tails of P_i and P_j switched after v.

[Figure: swapping the tails of P_i and P_j after the vertex v.]

This gives a map f : \Sigma \to \Sigma, which is an involution since the corresponding triplet for f(P) is still (i, v, j). However, \sigma(f(P)) and \sigma(P) differ by a transposition and hence have opposite signs. On the other hand, f merely redistributes the same multiset of edges between the two paths, so denoting w(P_1) \cdots w(P_n) by T(P) we have T(f(P)) = T(P). Therefore:

\displaystyle\sum_{P\in \Sigma} \text{sgn}(\sigma(P)) T(P) = -\sum_{P\in \Sigma} \text{sgn}(\sigma(f(P))) T(f(P)) = -\sum_{P'\in \Sigma} \text{sgn}(\sigma(P')) T(P').

So the sum is zero. ♦
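The tail-swapping involution can be made concrete. The sketch below (hypothetical names; sample weights; lexicographic order as the chosen total ordering on vertices) checks on a small grid that f is an involution and preserves the product of path weights:

```python
from itertools import product

x = {1: 2, 2: 3}                                  # sample values for x_1, x_2

def paths(a, b):
    """All east/south lattice paths a -> b, as lists of vertices."""
    if a == b:
        return [[a]]
    (ax, ay), out = a, []
    if ax < b[0]:                                 # eastward step
        out += [[a] + p for p in paths((ax + 1, ay), b)]
    if ay > b[1]:                                 # southward step
        out += [[a] + p for p in paths((ax, ay - 1), b)]
    return out

def weight(p):
    """w(P): product of x_{row} over the eastward edges of p."""
    w = 1
    for (ux, uy), (vx, _) in zip(p, p[1:]):
        if vx == ux + 1:
            w *= x[uy]
    return w

def tail_swap(P):
    """The involution f: pick the smallest vertex v (lexicographically)
    common to both paths, and swap the tails after v."""
    v = min(set(P[0]) & set(P[1]))
    i, j = P[0].index(v), P[1].index(v)
    return (P[0][:i] + P[1][j:], P[1][:j] + P[0][i:])

# Over every intersecting pair (a_1 -> b_1, a_2 -> b_2) in a small grid,
# f is an involution and preserves the total weight.
for p1, p2 in product(paths((1, 2), (2, 1)), paths((0, 2), (1, 1))):
    if set(p1) & set(p2):
        q = tail_swap((p1, p2))
        assert tail_swap(q) == (p1, p2)
        assert weight(q[0]) * weight(q[1]) == weight(p1) * weight(p2)
```

Note that the swapped paths need no longer run a_i \to b_i; they connect a_1 \to b_2 and a_2 \to b_1, which is exactly the sign flip in the proof.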


Jacobi-Trudi Identities

Here is our first application of the LGV lemma; all computations take place in the ring of symmetric functions \Lambda.

Theorem. Let \mu = \overline \lambda be the transpose of \lambda. Then s_\lambda is the determinant of either of these matrices.

\left( h_{\lambda_i + j - i} \right)_{1 \le i, j\le d} =\begin{pmatrix} h_{\lambda_1} & h_{\lambda_1 + 1} & \ldots & h_{\lambda_1 + d - 1} \\ h_{\lambda_2 - 1} & h_{\lambda_2} & \ldots & h_{\lambda_2 + d - 2} \\ \vdots & \vdots & \ddots & \vdots \\ h_{\lambda_d - d + 1} & h_{\lambda_d - d + 2} & \ldots & h_{\lambda_d}\end{pmatrix}

\left( e_{\mu_i + j - i} \right)_{1 \le i, j\le d} =\begin{pmatrix} e_{\mu_1} & e_{\mu_1 + 1} & \ldots & e_{\mu_1 + d - 1} \\ e_{\mu_2 - 1} & e_{\mu_2} & \ldots & e_{\mu_2 + d - 2} \\ \vdots & \vdots & \ddots & \vdots \\ e_{\mu_d - d + 1} & e_{\mu_d - d + 2} & \ldots & e_{\mu_d}\end{pmatrix}

where d is chosen to be \ge l(\lambda) and \ge l(\mu) in the respective cases.

Note

One remembers the structure of the matrices by writing \lambda as the subscripts for the diagonal entries, then incrementing to the right. We take h_i = e_i = 0 for all i <0. Observe that since h_0 = e_0 = 1, there is no harm in increasing d by appending zeros to the partition.
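One can test the first identity numerically by evaluating both sides at sample points: the determinant of the h-matrix against the classical bialternant formula s_\lambda = \det(x_i^{\lambda_j + n - j}) / \det(x_i^{n - j}). A sketch (hypothetical names; the sample values are arbitrary assumptions), for \lambda = (2, 1) in three variables:

```python
from fractions import Fraction
from itertools import combinations_with_replacement
from math import prod

xs = [2, 3, 5]                       # sample values for x_1, x_2, x_3
lam = (2, 1, 0)                      # the partition, padded with a zero to d = 3

def h(k):
    """h_k evaluated at xs (with h_k = 0 for k < 0 and h_0 = 1)."""
    if k < 0:
        return 0
    return sum(prod(c) for c in combinations_with_replacement(xs, k))

def det(m):
    """Determinant by Laplace expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([r[:j] + r[j + 1:] for r in m[1:]])
               for j in range(len(m)))

n = len(xs)
jt = det([[h(lam[i] + j - i) for j in range(n)] for i in range(n)])

# Bialternant formula: s_lambda = det(x_i^{lambda_j + n - j}) / det(x_i^{n - j}).
num = [[xi ** (lam[j] + n - 1 - j) for j in range(n)] for xi in xs]
den = [[xi ** (n - 1 - j) for j in range(n)] for xi in xs]
schur = Fraction(det(num), det(den))

assert jt == schur                   # both sides equal s_{(2,1)}(2, 3, 5)
```

Here both sides evaluate to 280 = s_{(2,1)}(2, 3, 5), matching the expansion s_{(2,1)} = h_1 h_2 - h_3.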

Proof

We can obtain the second identity from the first by applying the involution \omega (which takes h_k \mapsto e_k and s_\lambda \mapsto s_\mu), so let us prove the first. Before we begin, observe that the Schur polynomial can be written as:

\displaystyle s_\lambda = \sum_{\text{SSYT } T} x^T,

where x^T = \prod_{i\ge 1} x_i^{\text{\# of } i \text{ in } T}. Indeed, this follows from a proposition in an earlier post.

We will use the LGV lemma: let N be large; we take the following graph.

  • The vertices are lattice points in \mathbb{Z}^2.
  • The starting and ending vertices are given by:

\begin{aligned}a_d &= (0, N), a_{d-1} = (1, N), \ldots, a_1 = (d-1, N), \\ b_d &= (\lambda_d, 1), b_{d-1} = (\lambda_{d-1} + 1, 1), \ldots, b_1 = (\lambda_1 + d-1, 1).\end{aligned}

  • The edges are unit steps, directed either eastward or southward. Southward edges have weight 1, while eastward edges on the line y=i are weighted x_i for i\ge 1.

For example, if \lambda = (4, 1, 1), here is an example of a tuple of paths connecting the a_i to the b_i.

[Figure: lattice paths for \lambda = (4, 1, 1).]

Let us compute the LHS of the LGV lemma. A path from a_i = (d-i, N) to b_j = (\lambda_j + d-j, 1) takes exactly \lambda_j + i - j eastward steps, on rows with weakly decreasing labels; so the weights of such paths sum to the complete symmetric polynomial h_{\lambda_j + i - j} (in x_1, \ldots, x_N, with N large). The matrix \left( e(a_i, b_j) \right) is thus the transpose of the matrix in the theorem, so its determinant is the desired one.

On the other hand, let us consider the collection of non-intersecting paths from a_1, \ldots, a_d to b_{\sigma(1)}, \ldots, b_{\sigma(d)}. Since the a_i and the b_i are both listed from right to left, if \sigma is not the identity then some pair of paths must cross and hence share a vertex; so for a non-intersecting collection to exist, \sigma must be the identity. Each collection of non-intersecting paths corresponds to a reverse SSYT of shape \lambda (i.e. the rows are weakly decreasing and columns are strictly decreasing): row i records the y-coordinates of the eastward steps of P_i. E.g. the above diagram gives:

[Figure: the correspondence between the above paths and a reverse SSYT.]

Hence, the RHS in the LGV lemma equals the sum of x^T over all reverse SSYT T of shape \lambda. To see that this equals the corresponding sum over SSYT, work with entries in \{1, \ldots, N\} and replace each entry u with N+1-u: this is a bijection between reverse SSYT and SSYT which takes the monomial x^T to its image under the variable substitution x_u \mapsto x_{N+1-u}. Since the sum of x^T over all SSYT of shape \lambda is the symmetric polynomial s_\lambda(x_1, \ldots, x_N), it is invariant under this substitution, so the two sums agree. Letting N \to \infty, the sum is s_\lambda. ♦
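For a small shape, the bijection between reverse SSYT and SSYT and the equality of the two weighted sums can be checked by brute force. A sketch (hypothetical names; shape \lambda = (2, 1), entries bounded by N = 3, sample values for the x_i):

```python
from itertools import product

N = 3
x = {1: 2, 2: 3, 3: 5}               # sample values for x_1, x_2, x_3

def fillings(shape, row_ok, col_ok):
    """All fillings of the Young diagram with entries in 1..N obeying
    the given row and column comparisons."""
    cells = [(i, j) for i, r in enumerate(shape) for j in range(r)]
    out = []
    for vals in product(range(1, N + 1), repeat=len(cells)):
        T = dict(zip(cells, vals))
        if all(row_ok(T[i, j], T[i, j + 1]) for (i, j) in cells if (i, j + 1) in T) \
           and all(col_ok(T[i, j], T[i + 1, j]) for (i, j) in cells if (i + 1, j) in T):
            out.append(T)
    return out

def monomial(T):
    """x^T: product of x_u over the entries u of T."""
    w = 1
    for u in T.values():
        w *= x[u]
    return w

shape = (2, 1)
ssyt  = fillings(shape, lambda a, b: a <= b, lambda a, b: a < b)
rssyt = fillings(shape, lambda a, b: a >= b, lambda a, b: a > b)

# u -> N + 1 - u maps reverse SSYT bijectively onto SSYT ...
assert all({c: N + 1 - u for c, u in T.items()} in ssyt for T in rssyt)
# ... and the two weighted sums agree (both equal s_lambda at the sample point).
assert sum(map(monomial, ssyt)) == sum(map(monomial, rssyt))
```

Both collections have 8 elements, and both weighted sums equal s_{(2,1)}(2, 3, 5) = 280, matching the numerical check of the Jacobi–Trudi determinant above.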

