[[Noteworthy Uses]]:: *[[Noteworthy Uses]]*
[[Proved By]]:: *[[Proved By|Crucial Dependencies]]*
Intuition:: *[[Intuition]]*
Specializations:: *[[Specializations]]*
Generalizations:: [[Existence of the Singular Value Decomposition of a Linear Map]]
----
- Let $\ff$ denote $\rr$ or $\cc$;
- Let $\mathscr{X} \in \ff^{m \times n}$, the [[vector space of m-by-n matrices]].
> [!theorem] Theorem. ([[existence of the singular value decomposition of matrices]])
> There exists a decomposition $\mathscr{X}=U\Sigma V^{\dagger},$
> where $U \in \ff^{m \times m}$ and $V \in \ff^{n \times n}$ are [[isometric matrix|isometric]] and $\Sigma \in$ [[vector space of m-by-n matrices]] is [[diagonal]].
> \
> In particular, the columns of $V$ are the [[right singular vectors]] of $\mathscr{X}$, while the columns of $U$ are the [[left singular vectors]] of $\mathscr{X}$.
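As a numerical sanity check of the statement (not part of the proof), we can use NumPy's `np.linalg.svd`, whose default `full_matrices=True` returns exactly the square $U$ and $V^{\dagger}$ above together with the rectangular-diagonal $\Sigma$:

```python
# Sketch: verify X = U Σ V† for a random real matrix (NumPy assumed available).
import numpy as np

rng = np.random.default_rng(0)
m, n = 4, 3
X = rng.standard_normal((m, n))

U, s, Vh = np.linalg.svd(X)     # U is m×m, Vh = V† is n×n, s holds the σ_j
Sigma = np.zeros((m, n))
Sigma[:n, :n] = np.diag(s)      # Σ is diagonal in the rectangular sense

assert np.allclose(X, U @ Sigma @ Vh)      # X = U Σ V†
assert np.allclose(U.T @ U, np.eye(m))     # U is isometric (orthogonal here)
assert np.allclose(Vh @ Vh.T, np.eye(n))   # V† is isometric
```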
> [!proof]- Proof. ([[existence of the singular value decomposition of matrices]])
> - let $\mathcal{S}_{n}$ denote the standard [[basis]] of $\ff ^{n}$ and $\mathcal{S}_m$ the standard [[basis]] of $\ff^{m}$;
> - Define $T \in \hom(\ff^{n}, \ff^{m})$ to be the [[linear map]] given by $T(v)=\mathscr{X}v$ (i.e., the [[linear map]] whose [[matrix]] w.r.t. $\mathcal{S}_{n}$ and $\mathcal{S}_{m}$ is $\mathscr{X}$; that is, $\mathscr{X} = \MM(T, \mathcal{S}_{n}, \mathcal{S}_{m})$)[^1].
> - let $r =$ [[rank]] $\mathscr{X}$.
>
> let $\{ \sigma _j \}_{j=1}^{n}$ be the $n$ [[singular values]] of $T$, and let $\{ \sigma _j \}_{j=1}^{r}$ ($r \leq \min \{ m,n \}$) be those which are *nonzero.* Now, by the [[Existence of the Singular Value Decomposition of a Linear Map]], there exist [[orthonormal|orthonormal lists]] $\{ e_{j} \}_{j=1}^{r} \subset \ff^{n}$ and $\{ f_{j} \}_{j=1}^{r} \subset \ff^{m}$
> such that $Tv = \sum_{j=1}^{r} \sigma _j \langle v,e_{j} \rangle f_{j}$
> for all $v \in \ff^{n}$.
> \
> If we [[orthonormal list extends to orthonormal basis|extend]] our [[orthonormal|orthonormal lists]] to [[orthonormal basis|orthonormal bases]] $\mathcal{E} \subset \ff ^{n}$ and $\mathcal{F} \subset \ff^{m}$, then the [[matrix]] of $T$ with respect to these [[basis|bases]] will be [[diagonal]] ([[main diagonal|in the rectangular sense]]): since $Te_{j}=\sigma _j f_{j}$ (with $\sigma _j = 0$ for $j > r$), $\Sigma :=\MM(T, \mathcal{E}, \mathcal{F}) = \begin{pmatrix}
> \sigma_{1} & & 0 \\
> & \ddots & \\
> 0 & & \sigma _n \\
> & \text{\huge 0} &
> \end{pmatrix} \text{ (if } m \geq n \text{)} \quad \text{or} \quad \begin{pmatrix}
> \sigma_{1} & & 0 & \\
> & \ddots & & \text{\huge 0} \\
> 0 & & \sigma_{m} &
> \end{pmatrix} \text{ (if } m \leq n \text{)}.$
> let $V:=\MM(I_{n}, \mathcal{E}\to \mathcal{S_{n}})$ be the [[transition matrix]] from $\mathcal{E}$ to $\mathcal{S_{n}}$; i.e., the [[matrix]] whose columns are $\{ e_{j} \}_{j=1}^{n}$. let $U:=\MM(I_{m}, \mathcal{F}\to\mathcal{S_{m}})$ be the [[transition matrix]] from $\mathcal{F}$ to $\mathcal{S}_{m}$; i.e., the [[matrix]] whose columns are $\{ f_{j} \}_{j=1}^{m}$ (in particular, whose first $r$ columns are $f_{1}, \dots, f_{r}$). Because trivially $T=I_{m} T I_{n}^{-1}$ and by definition the [[product of matrices is the matrix of the product]], we have $\begin{align}
> \mathscr{X} = & \MM(T, \mathcal{S}_{n}, \mathcal{S}_{m}) \\
> =& \MM(I_{m}TI_{n} ^{-1}, \mathcal{S}_{n} \to \mathcal{E} \to \mathcal{F} \to \mathcal{S_m}) \\
> = & \MM(I_{m}, \mathcal{F}, \mathcal{S}_m) \MM(T, \mathcal{E}, \mathcal{F}) \MM(I_{n}, \mathcal{S_{n}}, \mathcal{E}) \\
> = & U \Sigma V^{-1} \\
> = & U\Sigma V^{\dagger},
> \end{align}$
> where in the last step we used that $V$ is an [[isometric matrix]].
> \
> The proof of [[Existence of the Singular Value Decomposition of a Linear Map]] tells us that the columns of $V$ are the [[right singular vectors]] of $\mathscr{X}$; below we show that the columns of $U$ are the [[left singular vectors]] of $\mathscr{X}$: $\mathscr{X} \mathscr{X}^{\dagger} = U \Sigma V^{\dagger} V \Sigma^{\dagger} U^{\dagger} = U \Sigma \Sigma^{\dagger} U^{\dagger},$
> and right-multiplying both sides by $U$ yields $\mathscr{X} \mathscr{X}^{\dagger} U = U \Sigma\Sigma^{\dagger}$. The entries along the [[main diagonal]] of the $m \times m$ matrix $\Sigma\Sigma^{\dagger}$ are $\sigma _j ^{2}$ for $j \leq \min\{ m,n \}$ (all others are $0$), while the columns of $U$ are $\{ f_{j} \}_{j=1}^{m}$. Hence $\mathscr{X} \mathscr{X}^{\dagger} f_{j} =\sigma _j ^{2} f_{j}$, and so the $f_{j}$ are our [[orthonormal]] [[eigenvector]]s.
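The final step can also be checked numerically (a NumPy sketch, not part of the proof): the columns of $U$ should be eigenvectors of $\mathscr{X}\mathscr{X}^{\dagger}$ with eigenvalues $\sigma _j ^{2}$, the eigenvalue being $0$ for columns beyond the first $\min\{m,n\}$:

```python
# Sketch: columns of U are eigenvectors of X X† with eigenvalues σ_j²
# (zero beyond index min{m, n}). NumPy assumed available.
import numpy as np

rng = np.random.default_rng(1)
m, n = 5, 3
X = rng.standard_normal((m, n))
U, s, Vh = np.linalg.svd(X)

G = X @ X.conj().T                  # the m×m matrix X X†
for j in range(m):
    sigma2 = s[j] ** 2 if j < min(m, n) else 0.0
    assert np.allclose(G @ U[:, j], sigma2 * U[:, j])
```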
[^1]: In part since if $\{ e_{j} \}_{j=1}^{n}$ denotes the standard [[basis]], we have $T(e_{1}) = \mathscr{X}e_{1}= (\text{first column of }\mathscr{X})$, $T(e_{2}) = \mathscr{X}e_{2}= (\text{second column of }\mathscr{X})$, etc.
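The footnote's observation is easy to check directly (NumPy assumed): multiplying a matrix by the $j$-th standard basis vector extracts its $j$-th column.

```python
# Sketch: X @ e_j equals the j-th column of X.
import numpy as np

X = np.arange(12).reshape(4, 3)
for j in range(3):
    e_j = np.eye(3)[:, j]           # j-th standard basis vector of F^3
    assert np.allclose(X @ e_j, X[:, j])
```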
----
#### References
> [!backlink]
> ```dataview
> TABLE rows.file.link as "Further Reading"
> FROM [[]]
> FLATTEN file.tags
> GROUP BY file.tags as Tag
> ```
> [!frontlink]
> ```dataview
> TABLE rows.file.link as "Further Reading"
> FROM outgoing([[]])
> FLATTEN file.tags as Tag
> WHERE Tag = "#definition" OR Tag = "#theorem" OR Tag = "#MOC" OR Tag = "#proposition" OR Tag = "#axiom"
> GROUP BY Tag
> ```