## Tensor Product and Linear Algebra

Tensor products can be rather intimidating for first-timers, so we’ll start with the simplest case: that of vector spaces over a field K. Suppose V and W are finite-dimensional vector spaces over K, with bases $\{v_1, \ldots, v_n\}$ and $\{w_1, \ldots, w_m\}$ respectively. Then the tensor product $V\otimes_K W$ is the vector space with abstract basis $\{ v_i w_j\}_{1\le i \le n, 1\le j\le m}.$ In particular, it is of dimension mn over K. Now we can “multiply” elements of V and W to obtain an element of this new space, e.g.

$(2v_1 + 3v_2)(w_1 - 2w_3) = 2v_1 w_1 + 3 v_2 w_1 - 4 v_1 w_3 - 6v_2 w_3.$

For example, if V is the space of polynomials in x of degree ≤ 2 and W is the space of polynomials in y of degree ≤ 3, then $V\otimes_K W$ is the space of polynomials spanned by $x^i y^j$ where 0≤i≤2, 0≤j≤3. However, defining the tensor product with respect to a chosen basis is rather unwieldy: we’d like a definition which only depends on V and W, and not the bases we picked.
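In coordinates, multiplying a vector of V by a vector of W amounts to taking the Kronecker product of their coordinate vectors. A quick NumPy sketch (an illustration, not part of the formal development; both spaces are taken to be 3-dimensional for concreteness) reproduces the expansion of $(2v_1 + 3v_2)(w_1 - 2w_3)$ above:

```python
import numpy as np

# Coordinates of 2*v1 + 3*v2 in the basis {v1, v2, v3},
# and of w1 - 2*w3 in the basis {w1, w2, w3}.
v = np.array([2, 3, 0])
w = np.array([1, 0, -2])

# In coordinates the pure tensor is the Kronecker product:
# entry i*3 + j holds the coefficient of v_i ⊗ w_j.
t = np.kron(v, w)
print(t)  # [ 2  0 -4  3  0 -6  0  0  0]
```

The nonzero entries are exactly the coefficients 2, -4, 3, -6 of $v_1 w_1, v_1 w_3, v_2 w_1, v_2 w_3$ in the displayed expansion.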

Definition. A bilinear map of vector spaces is a map $B:V \times W \to X,$ where V, W, X are vector spaces, such that

• when we fix w, B(-, w): V→X is linear;
• when we fix v, B(v, -): W→X is linear.

The tensor product of V and W, denoted $V\otimes_K W$, is defined to be a vector space together with a bilinear map $\psi : V\times W \to V\otimes_K W$ such that the following universal property holds:

• for any bilinear map $B: V\times W \to X$, there is a unique linear map $f:V\otimes_K W \to X$ such that $f\circ \psi = B.$

For v∈V and w∈W, the element $v\otimes w := \psi(v,w)$ is called a pure tensor element.

The universal property guarantees that if the tensor product exists, then it is unique up to isomorphism. What remains is the

Proof of Existence.

Recall that if S is a basis of a vector space V, then any linear map $V \to X$ corresponds uniquely to a function $S \to X$. Thus if we let T be the (infinite-dimensional) vector space with basis:

$\{e_{v, w} : v \in V, w\in W\}$

then linear maps $g : T \to X$ correspond uniquely to functions $B : V\times W \to X$. Saying that B is bilinear is precisely the same as saying that g vanishes on the subspace U below, so that g factors through the quotient to give $\overline g : T/U \to X,$ where U is the subspace generated by elements of the form:

\begin{aligned} e_{v+v', w} - e_{v,w} - e_{v', w}, \qquad & e_{cv, w} - c\cdot e_{v,w}\\ e_{v, w+w'} - e_{v,w} - e_{v, w'},\qquad & e_{v,cw} - c\cdot e_{v,w}\end{aligned}

for all $v, v' \in V$, $w, w' \in W$ and constants $c \in K$. Hence T/U is precisely our desired vector space, with $\psi : V\times W \to T/U$ given by $(v, w) \mapsto e_{v,w} \pmod U.$ And $v\otimes w$ is the image of $e_{v,w}$ in T/U. ♦

Note

From the proof, it is clear that V ⊗ W is spanned by the pure tensors; in general though, not every element of V ⊗ W is a pure tensor. E.g. v⊗w + v'⊗w' is generally not a pure tensor. However, v⊗w + v⊗w' + v'⊗w + v'⊗w' = (v+v')⊗(w+w') is a pure tensor since ψ is bilinear.
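The note above has a concrete matrix test (a NumPy sketch, not part of the formal development): writing an element of V ⊗ W as the matrix of its coefficients of $v_i \otimes w_j$, pure tensors correspond to matrices of rank at most 1.

```python
import numpy as np

v, vp = np.array([1., 0.]), np.array([0., 1.])
w, wp = np.array([1., 0.]), np.array([0., 1.])

# Coefficient matrix of a pure tensor v ⊗ w is the rank-1 matrix outer(v, w).
pure = np.outer(v + vp, w + wp)            # (v+v') ⊗ (w+w')
mixed = np.outer(v, w) + np.outer(vp, wp)  # v⊗w + v'⊗w'

print(np.linalg.matrix_rank(pure))   # 1 -> a pure tensor
print(np.linalg.matrix_rank(mixed))  # 2 -> not a pure tensor
```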

## Properties of Tensor Product

We have:

Proposition. The following hold for K-vector spaces:

• $K \otimes_K V \cong V$, where $c\otimes v\mapsto cv$;
• $V \otimes_K W \cong W \otimes_K V$, where $v\otimes w\mapsto w\otimes v$;
• $V \otimes_K (W \otimes_K W') \cong (V\otimes_K W)\otimes_K W'$, where $v\otimes (w\otimes w') \mapsto (v\otimes w)\otimes w'$;
• $V \otimes_K (\oplus_i W_i) \cong \oplus_i (V\otimes W_i)$, where $v\otimes (w_i)_i \mapsto (v\otimes w_i)_i$.

Proof

For the first property, the map K × V → V taking (c, v) to cv is bilinear over K, so by the universal property of tensor products, this induces f : K ⊗ V → V taking c⊗v to cv. On the other hand, let’s take the linear map g : V → K ⊗ V mapping v to 1⊗v. It remains to prove g∘f and f∘g are identity maps. Indeed: f∘g takes v → 1⊗v → v and g∘f takes c⊗v → cv → 1⊗(cv) = c⊗v, where the equality follows from bilinearity of ⊗.

For the third property, fix v ∈ V. The map W × W' → (V⊗W)⊗W' taking (w, w') to (v⊗w)⊗w' is bilinear in W and W', so it induces $f_v : W\otimes W' \to (V\otimes W)\otimes W'$ taking $w\otimes w' \mapsto (v\otimes w)\otimes w'.$ Next we check that the map

$V\times (W\otimes W') \to (V\otimes W)\otimes W', \qquad (v, x) \mapsto f_v(x)$

is bilinear, so it induces a linear map $f : V\otimes (W\otimes W') \to (V\otimes W)\otimes W'$ taking $v\otimes (w\otimes w') \mapsto (v\otimes w)\otimes w'.$ Similarly one defines a reverse map $g: (V\otimes W)\otimes W' \to V\otimes (W\otimes W')$ taking $(v\otimes w)\otimes w' \mapsto v\otimes (w\otimes w').$ Since the pure tensors span the whole space, it follows that f and g are mutually inverse.

The second and fourth properties are left to the reader. ♦
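These isomorphisms can be spot-checked numerically: in coordinates the tensor product of vectors is the Kronecker product, under which the identifications above become identities (or explicit reshuffles) of arrays. A NumPy sketch with illustrative dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)
v, w, wp = rng.random(2), rng.random(3), rng.random(4)

# K ⊗ V ≅ V:  c ⊗ v ↦ cv
print(np.allclose(np.kron([5.0], v), 5.0 * v))  # True

# V ⊗ W ≅ W ⊗ V: transposing the 2×3 coefficient matrix swaps the factors
print(np.allclose(np.kron(v, w).reshape(2, 3).T.ravel(), np.kron(w, v)))  # True

# Associativity: the coordinates agree however we bracket the triple product
print(np.allclose(np.kron(v, np.kron(w, wp)),
                  np.kron(np.kron(v, w), wp)))  # True
```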

As a result of the second and fourth properties, we also have:

Corollary. For any collection $\{V_i\}$ and $\{W_j\}$ of vector spaces, we have:

$(\oplus_i V_i)\otimes_K (\oplus_j W_j) \cong \oplus_{i, j} (V_i \otimes_K W_j),$

where the LHS element $(v_i) \otimes (w_j)$ maps to $(v_i \otimes w_j)_{i,j}$ on the RHS.

In particular, if $\{v_i\}$ and $\{w_j\}$ are bases of V and W respectively, then

$V = \oplus_i Kv_i, \ W = \oplus_j Kw_j \implies V\otimes W = \oplus_{i, j} K(v_i \otimes w_j)$

so $\{v_i \otimes w_j\}$ forms a basis of V ⊗ W. This recovers our original intuitive definition of the tensor product!

## Tensor Product and Duals

Recall that the dual of a vector space V is the space V* of all linear maps V → K. It is easy to see that V* ⊕ W* is naturally isomorphic to (V ⊕ W)*, and when V is finite-dimensional, V** is naturally isomorphic to V.

[ One way to visualize V** ≅ V is to imagine the bilinear map V* × V → K taking (f, v) to f(v). Fixing f we obtain a linear map V → K as expected, while fixing v we obtain a linear map V* → K, and this corresponds to an element of V**. ]

If V is finite-dimensional, then a basis $\{v_1, \ldots, v_n\}$ of V gives rise to a dual basis $\{f_1, \ldots, f_n\}$ of V* where

$f_i(v_j) = \begin{cases} 1, \quad &\text{if } i = j,\\ 0,\quad &\text{otherwise.}\end{cases}$

or simply $f_i(v_j) = \delta_{ij}$ with the Kronecker delta symbol. The next result we would like to show is:

Proposition. Let V and W be finite-dimensional over K.

• We have $V^*\otimes W^* \cong (V\otimes W)^*$ taking $f\otimes g$ to the map $V\otimes W\to K, \ v\otimes w \mapsto f(v)g(w).$
• Also $V^* \otimes W \cong \text{Hom}_K(V, W)$ taking $f\otimes w$ to the map $V\to W, \ v\mapsto f(v)w.$

Proof

For the first case, fix f ∈ V*, g ∈ W*. The map $V\times W \to K$ taking $(v,w)\mapsto f(v)g(w)$ is bilinear, so it induces a map $h:V\otimes W\to K$ taking $v\otimes w\mapsto f(v)g(w).$ But the assignment (f, g) → h gives rise to a map $V^* \times W^* \to (V\otimes W)^*$ which is bilinear, so it induces $\varphi:V^* \otimes W^* \to (V\otimes W)^*.$ In particular, $\varphi$ takes $f\otimes g$ to the map $h:V\otimes W\to K$, $v\otimes w \mapsto f(v)g(w).$

To show that this is an isomorphism, let $\{v_i\}$ and $\{w_j\}$ be bases of V and W respectively, with dual bases $\{f_i\}$ and $\{g_j\}$ of V* and W*. The map φ then takes $f_i \otimes g_j$ to the linear map $V\otimes W\to K$ which takes $v_k \otimes w_l$ to $f_i(v_k) g_j(w_l) = \delta_{ik}\delta_{jl}.$ But these are precisely the elements of the dual basis of $\{v_i \otimes w_j\},$ so φ takes the basis $\{f_i \otimes g_j\}$ to a basis, and is therefore an isomorphism.

The second case is left as an exercise. ♦
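The proof can be made concrete in coordinates (a NumPy sketch; the bases P and Q are arbitrary examples): dual bases arise as rows of inverse matrices, and pairing Kronecker products of dual vectors against Kronecker products of basis vectors yields exactly the deltas $\delta_{ik}\delta_{jl}$ computed above.

```python
import numpy as np

# Columns of P form a basis v_1, v_2, v_3 of V; columns of Q a basis w_1, w_2 of W.
P = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [0., 0., 1.]])
Q = np.array([[2., 1.],
              [1., 1.]])

# Rows of the inverses are the dual bases: row i of F dotted with column j of P
# is (F @ P)[i, j] = delta_ij.
F, G = np.linalg.inv(P), np.linalg.inv(Q)
print(np.allclose(F @ P, np.eye(3)))  # True

# f_i ⊗ g_j in coordinates is kron(F[i], G[j]); v_k ⊗ w_l is kron(P[:,k], Q[:,l]).
# Their pairing is (f_i·v_k)(g_j·w_l) = delta_ik * delta_jl, so the 6×6 matrix
# of all pairings is the identity: {f_i ⊗ g_j} is dual to {v_k ⊗ w_l}.
idx = [(a, b) for a in range(3) for b in range(2)]
pairings = np.array([[np.kron(F[i], G[j]) @ np.kron(P[:, k], Q[:, l])
                      for (k, l) in idx] for (i, j) in idx])
print(np.allclose(pairings, np.eye(6)))  # True
```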

Note

Here’s one convenient way to visualize the above. Suppose elements of V are column vectors. Then V* is the space of row vectors, and evaluating V* × V → K corresponds to multiplying a row vector by a column vector, thus giving a scalar. So V* ⊕ W* ≅ (V ⊕ W)* follows quite easily: indeed, the LHS concatenates two spaces of row vectors, while the RHS concatenates two spaces of column vectors and then turns the result into a space of row vectors.

The tensor product is a little trickier: for V and W we take column vectors with entries $\alpha_1, \ldots, \alpha_n$ and $\beta_1, \ldots, \beta_m$ respectively. Then we form the column vector with mn entries $\alpha_i \beta_j.$ This lets us see why V*⊗W* ≅ (V⊗W)*: in both cases we get a row vector with mn entries. Finally, to obtain V*⊗W we take row vectors $\alpha_1, \ldots, \alpha_n$ for elements of V* and column vectors $\beta_1, \ldots, \beta_m$ for those of W, and these multiply to give us an m × n matrix, which represents a linear map V → W.
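The column-times-row picture is easy to check numerically (an illustrative NumPy sketch; the particular vectors are made up): a pure tensor f ⊗ w becomes the rank-1 matrix wf, which acts on v exactly as v ↦ f(v)w.

```python
import numpy as np

f = np.array([[1., 2., 3.]])  # an element of V* as a 1×3 row vector (n = 3)
w = np.array([[4.], [5.]])    # an element of W as a 2×1 column vector (m = 2)

# The pure tensor f ⊗ w corresponds to the 2×3 rank-1 matrix w f,
# i.e. the linear map V → W sending v to f(v) w.
M = w @ f

v = np.array([[1.], [0.], [2.]])
print(np.allclose(M @ v, (f @ v).item() * w))  # True
```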

Question

Consider the map V* × V → K which takes (f, v) to f(v). This is bilinear, so it induces a linear map f : V*⊗V → K. On the other hand, V*⊗V is naturally isomorphic to End(V), the space of K-linear maps V → V. If we represent elements of End(V) as square matrices, what does f correspond to?

[ Answer: the trace of the matrix. ]
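A numerical spot-check of the answer (illustrative, using NumPy): under V*⊗V ≅ End(V), the pure tensor f ⊗ v becomes the rank-1 matrix outer(v, f), and evaluation sends it to f(v); since every matrix is a sum of such rank-1 pieces, evaluation must be the trace.

```python
import numpy as np

rng = np.random.default_rng(1)
f, v = rng.random(4), rng.random(4)

# The pure tensor f ⊗ v as a matrix in End(V), and its trace:
# trace(outer(v, f)) = sum_i v_i f_i = f(v).
M = np.outer(v, f)
print(np.isclose(np.trace(M), f @ v))  # True
```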

## Tensor Algebra

Given a vector space V, let us consider n consecutive tensors:

$V^{\otimes n} := \overbrace{V\otimes V\otimes \ldots \otimes V}^{n \text{ copies}}.$

and let T(V) be the direct sum $\oplus_{n=0}^\infty V^{\otimes n} = K \oplus V \oplus (V\otimes V) \oplus (V\otimes V\otimes V)\oplus \cdots.$ This gives an associative algebra over K by extending the bilinear map

$V^{\otimes m} \times V^{\otimes n} \to V^{\otimes (m+n)}, \quad (v_1, v_2) \mapsto v_1 \otimes v_2.$

to the entire space T(V) × T(V) → T(V). Note that it is not commutative in general. For example, suppose V has a basis {x, y, z}. Then

• $V^{\otimes 2}$ has basis $\{x^2, xy, xz, yx, y^2, yz, zx, zy, z^2\}$, where we have shortened the notation $x^2 := x\otimes x,$ $xy := x\otimes y,$ etc.
• $V^{\otimes 3}$ has basis $\{x^3, x^2 y, \ldots\}$, with 27 elements.
• Multiplying $V\times V^{\otimes 2} \to V^{\otimes 3}$ gives $(x+z)(xy + zx) = x^2 y + xzx + zxy + z^2 x.$

The algebra T(V), called the tensor algebra of V, satisfies the following universal property.

Theorem. The natural map ψ : V → T(V) is a linear map such that:

• for any associative K-algebra A, and K-linear map φ: V → A, there is a unique K-algebra homomorphism f: T(V) → A such that $\varphi = f\circ\psi.$

Thus, $\text{Hom}_{K-\text{lin}}(V, A) \cong \text{Hom}_{K-\text{alg}}(T(V), A).$
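As an illustration of the universal property (the assignment φ below is a made-up example, not canonical): any choice of matrices for the basis vectors of V extends uniquely to all of T(V), sending a word, i.e. a pure tensor, to the product of the corresponding matrices.

```python
import numpy as np

# A hypothetical K-linear map phi : V → A into the algebra A of 2×2 matrices,
# specified on the basis vectors x, y of V.
phi = {
    "x": np.array([[0., 1.], [0., 0.]]),
    "y": np.array([[0., 0.], [1., 0.]]),
}

def f(word):
    """Image of the pure tensor given by a word, e.g. 'xy' ↦ phi(x) @ phi(y)."""
    out = np.eye(2)
    for letter in word:
        out = out @ phi[letter]
    return out

# f is multiplicative by construction, and T(V) records the order of factors:
print(np.array_equal(f("xy"), f("yx")))  # False
```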

However, often we would like multiplication to be commutative (e.g. when dealing with polynomials), in which case we use the symmetric algebra instead. Or we would like multiplication to be anti-commutative, i.e. xy = –yx (e.g. when dealing with differential forms), in which case we use the exterior algebra. We will say more about these when the need arises.
