
Wednesday, July 20, 2016

(Representation Theory and Fourier Transform I) - Dual spaces and bilinear forms

In this note, we recall some basic facts about bilinear forms and their relation to non-degeneracy properties.

Throughout this note, we use the following notation:
  • $k$ : a field.
  • $V, W$ : vector spaces over $k$ with $\dim(V) = n$ and $\dim(W) = m$.
  • $B_V = \{ e_1, \ldots, e_n \}$ and $B_W = \{ g_1, \ldots, g_m \}$ : bases of $V$ and $W$.
1. The linear transform.

A linear transform $f: V \to W$ from a vector space $V$ to a vector space $W$ is a map that satisfies
$f(v_1+v_2) = f(v_1) + f(v_2)$ and $f(cv) = cf(v)$ for all $v_1, v_2, v \in V$ and $c \in k$.

The set of all linear transforms, denoted $Hom(V,W)$, forms a vector space over $k$ with two laws of composition:
  • addition: $\forall f,g \in Hom(V,W), (f + g)(x) = f(x)+g(x)$, so $f+g \in Hom(V,W)$.
  • scalar multiplication: $k \times Hom(V,W) \to Hom(V,W)$ with $(c,f) \mapsto cf$, where $(cf)(x) = cf(x)$.
It satisfies the following axioms:
  • The zero element, denoted $0$, maps every element of $V$ to $0 \in W$.
  • The scalar $1 \in k$ acts as the multiplicative identity: $(1f)(x) = f(x)$.
  • $Hom(V,W)$ is abelian under addition, with additive inverse $(-f)(x) = -f(x)$, so $f + (-f) = 0$.
  • $\forall a,b \in k, \forall f \in Hom(V,W), (a+b)f = af + bf$.
  • $\forall a \in k, \forall f,g \in Hom(V,W), a(f+g) = af + ag$.
A bijective linear transform is called an isomorphism. Two vector spaces are isomorphic if there exists an isomorphism between them. By these properties, a linear transform preserves the zero vector and is determined by its values on a basis. For the basis $\{e_i\}$ of $V$, we have $f(e_i) = \sum_{j=1}^{m}{\alpha_{ij}g_j}$. The matrix $A = (\alpha_{ij})$ is called the representation matrix relative to the given bases. There is a correspondence between a linear transform and its matrix representations. Let us consider the image of a vector $v \in V$.

$f(v) = \sum_{i=1}^n x_i f(e_i) = [x_1 \ldots x_n] A \begin{bmatrix} g_1   \\ \vdots   \\ g_m \end{bmatrix}$. Hence, applying a linear transform amounts to a matrix multiplication.
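As a small numerical sketch (in Python/NumPy, with a made-up representation matrix $A$), applying $f$ in coordinates is exactly this matrix multiplication:

```python
import numpy as np

# A hypothetical linear transform f: R^3 -> R^2 with representation
# matrix A relative to the standard bases, so f(e_i) = sum_j A[i, j] g_j.
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 0.0]])          # shape (n, m) = (3, 2)

v = np.array([1.0, -1.0, 2.0])      # coordinates [x_1 ... x_n] of v
f_v = v @ A                         # coordinates of f(v) in the basis {g_j}
print(f_v)                          # [7. 1.]
```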

Lemma 1. Two vector spaces are isomorphic if and only if they have the same dimension.
Proof. If two vector spaces $V$ and $W$ have the same dimension, they are isomorphic via the linear transform $f(\sum{c_i e_i}) = \sum{c_i g_i}$. For the "only if" part, suppose that $f: V \to W$ is a bijective linear transform. The equation $c_1f(e_1) + \ldots + c_nf(e_n) = 0$ holds exactly when $f(\sum{c_i e_i}) = 0$, i.e. when $\sum{c_i e_i} = 0$ by injectivity. Hence $c_1 = \ldots = c_n = 0$ by the linear independence of the basis $\{e_i\}_{i=1}^n$ of $V$, so the vectors $f(e_i)$ are linearly independent. Moreover, $f$ is surjective, so for each element $w \in W$ there exists $v \in V$ such that $f(v) = w$; writing $v = \sum c_i e_i$, we obtain $\sum{c_i f(e_i)} = f(v) = w$. Hence, the set $\{f(e_i)\}$ forms a basis of the vector space $W$, and $\dim W = n = \dim V$.
(Q.E.D)

Lemma 2. The dimension of $Hom(V,W)$ is $mn$.
Proof. There are two ways to prove this lemma. The first is to exhibit an isomorphism between $Hom(V,W)$ and the vector space $M_{n \times m}(k)$ of matrices. The second is to construct a basis of $Hom(V,W)$ directly. We prefer the latter.

Consider the map $T_{ij}: V \to W$ defined by $T_{ij}(\sum_{k=1}^{n}{x_k e_k}) = x_i g_j$. Each $T_{ij}$ is a linear transform since:
  • $T_{ij}(v+w) = T_{ij}(\sum{(x_k+y_k)e_k}) = (x_i + y_i) g_j = T_{ij}(v) + T_{ij}(w)$.
  • $T_{ij}(cv) = T_{ij}(\sum{cx_ke_k}) = cx_ig_j = cT_{ij}(v)$.
The set $\{T_{ij}\}$ is linearly independent: suppose that coefficients $c_{ij}$ satisfy $\sum{c_{ij}T_{ij}(v)} = 0$ for every $v \in V$. Evaluating at the basis vectors $\{e_k\}_{k=1}^n$ gives $c_{ij} = 0$ for all $i, j$. Moreover, every linear transform $f: V \to W$ can be expressed in terms of the $T_{ij}$: since $f(e_i) = \sum_j{\alpha_{ij}g_j} = \sum_j{\alpha_{ij}T_{ij}(e_i)}$ for each $i$, we have $f = \sum_{i,j}{\alpha_{ij}T_{ij}}$. Hence the $mn$ maps $T_{ij}$ form a basis of $Hom(V,W)$.
(Q.E.D)
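In matrix terms, $T_{ij}$ corresponds to the elementary matrix $E_{ij}$ with a single $1$ in position $(i,j)$. A small sketch (NumPy, with an arbitrary example matrix) showing that every matrix decomposes over this basis:

```python
import numpy as np

n, m = 3, 2

# E(i, j) is the elementary matrix corresponding to T_ij: a single 1 in
# position (i, j).  The mn matrices E(i, j) form a basis of M_{n x m}.
def E(i, j):
    M = np.zeros((n, m))
    M[i, j] = 1.0
    return M

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 0.0]])
# Every A is the combination sum_{i,j} alpha_ij E(i, j) with alpha_ij = A[i, j].
recon = sum(A[i, j] * E(i, j) for i in range(n) for j in range(m))
print(np.array_equal(A, recon))   # True
```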

In the case $W = k$, the vector space $Hom(V,k)$ is called the dual space of $V$ and denoted $V^*$; it reflects the properties of $V$. Consider the projection $p_i$ described as follows:

 $p_i: V \to k$, where $\sum{c_j e_j} \mapsto c_i$.

We can prove that each $p_i$ is a linear transform and that the set $\{p_i\}$ forms a basis of $V^*$ in the same way as in the previous part, since $p_i(e_j) = \delta_{ij}$. The corresponding matrix is the identity matrix:
$A = \begin{bmatrix} p_1(e_1)   & p_1(e_2)  &  \ldots    &   p_1(e_n) \\ \vdots     &           & \ddots      & \vdots      \\ p_n(e_1)   & p_n(e_2)  &  \ldots    &   p_n(e_n) \\ \end{bmatrix} = I_n$.
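The relation $p_i(e_j) = \delta_{ij}$ can be checked directly; a short sketch with the coordinate projections on $k^n$:

```python
import numpy as np

n = 4

# The coordinate projection p_i reads off the i-th coordinate of v.
def p(i, v):
    return v[i]

# The columns of the identity matrix are the standard basis vectors e_j,
# so the matrix (p_i(e_j)) is exactly the identity: p_i(e_j) = delta_ij.
E = np.eye(n)
A = np.array([[p(i, E[:, j]) for j in range(n)] for i in range(n)])
print(np.array_equal(A, np.eye(n)))   # True
```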

Theorem 3. The vector space $V^* = Hom(V,k)$ is isomorphic to $V$.
Proof. Since $\dim k = 1$, Lemma 2 gives $\dim V^* = n \cdot 1 = n = \dim V$, and Lemma 1 yields the isomorphism.

2. The orthogonality
The next part of this note is, of course, the interesting part, as in every action-adventure novel (actually, I guess it is a long, long novel). What happens if there exists a bilinear mapping from the Cartesian product $V \times W$ to $k$? In general, a mapping $<,>: V \times W \to k$ is bilinear if it is linear in each of its two components:
  • $<\alpha v, \beta w> = \alpha \beta <v,w>$
  • $<v_1 + v_2, w> = <v_1,w> + <v_2,w>$
  • $<v, w_1+w_2> = <v,w_1> + <v,w_2>$
What kind of properties can we obtain? We will try to give a brief and exact answer to this question by discussing the relation between non-degenerate bilinear forms and the definition of orthogonality.

Definition 4. A bilinear form $<,>: V \times W \to k$ is non-degenerate if
its corresponding matrix $A = (<e_i, g_j>)$ relative to any pair of bases is invertible.

Since only square matrices are invertible, this definition implies that if a non-degenerate bilinear form connects two vector spaces, then these spaces have the same dimension.

Theorem 5. The bilinear form $<,>: V \times W \to k$ is non-degenerate if and only if it satisfies the following conditions:
  • $(\forall v \in V, <v,w> = 0) \Rightarrow w = 0$
  • $(\forall w \in W, <v,w> = 0) \Rightarrow v = 0$
Proof.
First we show that a non-degenerate bilinear form implies both conditions. As noted after Definition 4, if a bilinear form $<,>$ is non-degenerate then $\dim(V) = \dim(W)$, and from the invertibility of $A$ we deduce:
  • $\forall g_j \in B_W, <v,g_j> = <\sum{x_i e_i}, g_j> = \sum{x_i <e_i,g_j>}$, so $[x_1 \ldots x_n] A = 0$ $\Rightarrow [x_1 \ldots x_n] = 0 \Rightarrow v = 0$.
  • $\forall e_j \in B_V, <e_j,w> = <e_j, \sum{y_i g_i}> = \sum{y_i <e_j,g_i>}$, so $A\begin{bmatrix} y_1   \\ \vdots   \\ y_n \end{bmatrix} = 0$ $\Rightarrow \begin{bmatrix} y_1   \\ \vdots   \\ y_n \end{bmatrix} = 0 \Rightarrow w = 0$.
For the "if" part, we have $<v,w> = X^T A Y$, where $X, Y$ are the coordinate vectors of $v, w$ and $A$ is the corresponding matrix. If $AY = 0$, then $X^T A Y = 0$ for every $X$, so $w = 0$ by the first condition and hence $Y = 0$; thus the homogeneous equation $AY = 0$ has only the trivial solution and $A$ has a left inverse. The right inverse is obtained by the same argument applied to the second condition, so $A$ is invertible.
(Q.E.D)
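A small numerical check (NumPy, with made-up matrices): the form $<v,w> = X^T A Y$ is non-degenerate exactly when $A$ is invertible, and a singular $A$ produces a nonzero vector that pairs to zero with everything:

```python
import numpy as np

A_good = np.array([[2.0, 1.0],
                   [1.0, 1.0]])       # det = 1, invertible
A_bad  = np.array([[1.0, 2.0],
                   [2.0, 4.0]])       # rank 1, degenerate

print(np.linalg.matrix_rank(A_good))  # 2: non-degenerate form
print(np.linalg.matrix_rank(A_bad))   # 1: degenerate form

# For the degenerate form, a nonzero Y in the null space of A_bad pairs
# to zero with every X, violating the conditions of Theorem 5.
Y = np.array([2.0, -1.0])             # A_bad @ Y = 0
print(np.allclose(A_bad @ Y, 0))      # True
```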
Definition 7: Two vectors $v$ and $w$ are orthogonal ($v \bot w$) if $<v,w> = 0$. The orthogonal space to a subspace $W$ of $V$ is $W^{\bot} = \{v \in V \mid \forall w \in W, <v,w> = 0\}$. A null vector $v \in V$ is a vector orthogonal to every vector in $V$.

By Theorem 5, we see that if a bilinear form is nondegenerate on a vector space $V$, then the only null vector is $0$; equivalently, for every nonzero vector $v$ there is a vector $v'$ such that $<v,v'> \ne 0$. The form restricted to a subspace $W \le V$ is nondegenerate when $\forall\, 0 \ne w \in W, \exists w' \in W, <w,w'> \ne 0$. Note that a form can be nondegenerate on $V$ yet degenerate on a subspace $W$, and vice versa.

Consider the linear transform between a vector space $V$ and its dual space $V^*$ described as follows:
$\phi: V \to V^*$ with $v \mapsto \phi_v = <v,\_>$
The mapping $\phi$ is linear by the definition of $<,>$. Moreover, if the form is nondegenerate on $V$, then $\phi$ is injective: $\phi_{v_1} = \phi_{v_2} \Leftrightarrow <v_1-v_2,\_> = 0 \Leftrightarrow v_1 = v_2$. By the Rank-Nullity Theorem, $\dim \operatorname{Ker}(\phi) + \dim \operatorname{Im}(\phi) = \dim V$, so $\dim \operatorname{Im}(\phi) = \dim V = \dim V^*$, and the linear map $\phi$ is an isomorphism. We obtain the same conclusion for the mapping $\varphi: W \to W^*$ with $w \mapsto \varphi_w = <\_,w>$.
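In coordinates, $\phi$ sends $v$ (column vector $X$) to the functional $Y \mapsto X^T A Y$, i.e. to the row vector $X^T A$. A short sketch (NumPy, with a made-up invertible Gram matrix) showing that $\phi$ is a bijection in this case, since $X$ can be recovered from the row vector:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])        # invertible Gram matrix of the form

X = np.array([3.0, -2.0])         # coordinates of a vector v
row = X @ A                       # the functional phi_v as a row vector

# Because A is invertible, v is recovered from phi_v: phi is injective
# (and hence an isomorphism, since dim V = dim V*).
X_back = row @ np.linalg.inv(A)
print(np.allclose(X, X_back))     # True
```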

Theorem 8: Let $<,>$ be a symmetric form on $V$. It is nondegenerate on a subspace $W \le V$ if and only if $V = W \oplus W^\bot$.
Proof.
The "$\Rightarrow$" part: the bilinear form $<,>$ is nondegenerate on $W$, so $W \cap W^\bot = \{ 0 \}$. We will show that every $v \in V$ can be written as $v = w + u$ with $w \in W$ and $u \in W^\bot$. Consider the isomorphism
$\phi: W \to W^*$ with $w \mapsto <w,\_>|_W$.
For each $v \in V$, the restriction $<v,\_>|_W$ lies in $W^*$, so there exists $w \in W$ such that $<w,\_>|_W = <v,\_>|_W$, i.e. $<v-w,\_>|_W = 0$. Hence $v - w \in W^\bot$ and $v \in W + W^\bot$.
The "$\Leftarrow$" part: $V = W \oplus W^\bot \Rightarrow W \cap W^\bot = \{ 0 \}$, so no nonzero $w \in W$ is orthogonal to all of $W$, i.e. the form is nondegenerate on $W$.
(Q.E.D)
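The decomposition of Theorem 8 can be computed explicitly; a sketch using the standard dot product on $\mathbb{R}^3$ (just one choice of symmetric nondegenerate form) and a hypothetical subspace $W$:

```python
import numpy as np

# W = span{w1, w2}; the dot product is nondegenerate on W, so every v
# splits uniquely as v = w + u with w in W and u in W^perp.
W = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]]).T          # columns span W

v = np.array([1.0, 2.0, 3.0])
# Solve the Gram system <w_i, w_j> c = <w_i, v> for the W-component.
coeffs = np.linalg.solve(W.T @ W, W.T @ v)
w = W @ coeffs                             # component in W
u = v - w                                  # component in W^perp
print(np.allclose(w + u, v))               # True
print(np.allclose(W.T @ u, 0))             # True: u is orthogonal to W
```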

Theorem 9: Let $<,>$ be nondegenerate on $V$. Then $\dim W + \dim W^\bot = \dim V$, and $<,>$ is nondegenerate on $W$ iff it is nondegenerate on $W^\bot$.
Proof.
  • Suppose that $<,>$ is nondegenerate on $V$. Consider the linear map $f: V \to W^*$ with $f(v) = <v,\_>|_W$. We have $\operatorname{Ker} f = W^\bot$ and $f$ is surjective, so $V/W^\bot \cong \operatorname{Im} f = W^*$, hence $\dim V = \dim W^\bot + \dim W^* = \dim W^\bot + \dim W$.
  • Suppose that $<,>$ is degenerate on $W^\bot$: there exists $0 \ne b \in W^\bot$ such that $<a,b> = 0$ for all $a \in W^\bot$. Then $b \in (W^\bot)^\bot = W$ (the equality follows from the dimension formula above), so $0 \ne b \in W \cap W^\bot$, and $<w,b> = 0$ for all $w \in W$, i.e. $<,>$ is degenerate on $W$. The same argument in the other direction shows that $<,>$ is nondegenerate on $W$ iff it is nondegenerate on $W^\bot$.
(Q.E.D)
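The dimension formula can be observed numerically; a sketch with the standard dot product on $\mathbb{R}^5$ (an assumed choice of nondegenerate form) and a randomly chosen 2-dimensional $W$:

```python
import numpy as np

# W^perp is the null space of the 2 x 5 matrix whose rows span W, so by
# rank-nullity dim W^perp = 5 - rank = 5 - dim W, matching Theorem 9.
rng = np.random.default_rng(0)
Wrows = rng.standard_normal((2, 5))   # rows span a 2-dim W (generically)

dim_W = np.linalg.matrix_rank(Wrows)
dim_Wperp = 5 - dim_W
print(dim_W + dim_Wperp)              # 5 = dim V
```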

Theorem 10: Let $<,>$ be a symmetric form on a real vector space $V$ or a Hermitian form on a complex vector space $V$. Then there exists an orthogonal basis for $V$.
(We leave it as an exercise. Hint: use the Gram–Schmidt process to construct an orthogonal basis.)
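A minimal Gram–Schmidt sketch for the standard inner product on $\mathbb{R}^n$ (the positive-definite case; the general symmetric case needs extra care with isotropic vectors):

```python
import numpy as np

# Orthogonalize a list of linearly independent vectors with respect to
# the standard dot product by subtracting projections onto earlier ones.
def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        u = v.astype(float)
        for b in basis:
            u = u - (u @ b) / (b @ b) * b   # remove the component along b
        basis.append(u)
    return basis

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]
B = gram_schmidt(vs)
# All pairwise inner products vanish:
print(all(abs(B[i] @ B[j]) < 1e-12 for i in range(3) for j in range(i)))  # True
```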

References: The material in this note is taken from:
[1] Thuong's Talk.
[2] Michael Artin, Algebra.
[3] Keith Conrad, Bilinear Forms, http://www.math.uconn.edu/~kconrad/blurbs/linmultialg/bilinearform.pdf
