3.7 Exercises

  1. There is a great Twitter thread on the SVD by Daniela Witten. Read it here.

  2. Let \(\boldsymbol{\Sigma}\) be an arbitrary covariance matrix.
    • Show \(\boldsymbol{\Sigma}\) is symmetric and positive semi-definite.
    • Give examples of both singular and non-singular covariance matrices.
    • What condition must the eigenvalues of a non-singular covariance matrix satisfy?
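    If you want to check your reasoning numerically, here is a minimal R sketch (the two matrices below are illustrative choices, not part of the question):

    ```r
    # A non-singular covariance matrix: symmetric, with strictly positive eigenvalues
    Sigma1 <- matrix(c(2, 1,
                       1, 2), nrow = 2, byrow = TRUE)
    eigen(Sigma1)$values   # 3 and 1: both positive, so Sigma1 is invertible

    # A singular covariance matrix, e.g. Cov((Z, Z)) for a single random variable Z:
    # the second coordinate is a deterministic copy of the first
    Sigma2 <- matrix(c(1, 1,
                       1, 1), nrow = 2, byrow = TRUE)
    eigen(Sigma2)$values   # 2 and 0: a zero eigenvalue, so Sigma2 is singular
    ```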

  3. Compute, by hand (but check your answer in R), the singular value decomposition (full and compact) of the following matrices.
    • \(\left(\begin{array}{cc}2&0\\0&-1\end{array} \right)\)
    • \(\left(\begin{array}{cc}1&0\\0&0\\0&0\end{array} \right)\)
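    A quick R check for both matrices; note that `svd()` returns singular values in decreasing order and may flip the signs of paired singular vectors relative to a hand computation:

    ```r
    A1 <- matrix(c(2, 0,
                   0, -1), nrow = 2, byrow = TRUE)
    s1 <- svd(A1)
    s1$d                              # singular values 2 and 1 (always non-negative)
    s1$u %*% diag(s1$d) %*% t(s1$v)   # reconstructs A1

    A2 <- matrix(c(1, 0,
                   0, 0,
                   0, 0), nrow = 3, byrow = TRUE)
    s2 <- svd(A2)                     # R returns the thin SVD; drop the zero singular
    s2$d                              # value (here d = 1, 0) for the compact form
    ```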
  4. Let \[\mathbf X=\left(\begin{array}{cc}1&1\\0&1\\1&0\end{array} \right)\]
    The eigen-decomposition of \(\mathbf X^\top \mathbf X\) is \[\mathbf X^\top \mathbf X=\frac{1}{\sqrt{2}}\left(\begin{array}{cc}1&-1\\1&1\end{array} \right) \left(\begin{array}{cc}3&0\\0&1\end{array} \right)\frac{1}{\sqrt{2}}\left(\begin{array}{cc}1&-1\\1&1\end{array} \right)^\top \] Use this fact to answer the following questions:
    • What are the singular values of \(\mathbf X\)?
    • What are the right singular vectors of \(\mathbf X\)?
    • What are the left singular vectors of \(\mathbf X\)?
    • Give the compact SVD of \(\mathbf X\). Check your answer, noting that the singular vectors are only specified up to multiplication by \(-1\).
    • Can you compute the full SVD of \(\mathbf X\)?
    • What is the eigen-decomposition of \(\mathbf X\mathbf X^\top\)?
    • Find a generalised inverse of the matrix \(\mathbf X\).
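    The following R sketch can be used to check each part (again, expect possible sign flips in the singular vectors):

    ```r
    X <- matrix(c(1, 1,
                  0, 1,
                  1, 0), nrow = 3, byrow = TRUE)

    eigen(t(X) %*% X)     # eigenvalues 3 and 1; eigenvectors = right singular vectors
    s <- svd(X)
    s$d                   # singular values: sqrt(3) and 1
    s$v                   # right singular vectors (as columns)
    s$u                   # left singular vectors of the compact SVD
    s$u %*% diag(s$d) %*% t(s$v)   # recovers X

    # One generalised inverse of X: the pseudo-inverse built from the compact SVD
    Xplus <- s$v %*% diag(1 / s$d) %*% t(s$u)
    Xplus %*% X           # (numerically) the 2x2 identity
    ```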
  5. The SVD can be used to solve linear systems of the form \[\mathbf A\mathbf x= \mathbf y\] where \(\mathbf A\) is an \(n\times p\) matrix, with compact SVD \(\mathbf A= \mathbf U\boldsymbol{\Sigma}\mathbf V^\top\).

    • If \(\mathbf A\) is a square invertible matrix, show that \[\tilde{\mathbf x} = \mathbf V\boldsymbol{\Sigma}^{-1} \mathbf U^\top \mathbf y\] is the unique solution to \(\mathbf A\mathbf x= \mathbf y\), i.e., show that \(\mathbf A^{-1} = \mathbf V\boldsymbol{\Sigma}^{-1} \mathbf U^\top\).

    • If \(\mathbf A\) is not a square matrix, then \(\mathbf A^+ = \mathbf V\boldsymbol{\Sigma}^{-1} \mathbf U^\top\) is a generalised inverse (not a true inverse), and \(\tilde{\mathbf x}=\mathbf A^+\mathbf y\) is still a useful quantity to consider, as we shall now see. Let \(\mathbf A=\left(\begin{array}{cc}1&1\\0&1\\1&0\end{array} \right)\) and \(\mathbf y= \left(\begin{array}{c}2\\1\\1\end{array} \right)\). Then \(\mathbf A\mathbf x=\mathbf y\) is an overdetermined system, in that there are 3 equations in 2 unknowns. Compute \(\tilde{\mathbf x}=\mathbf A^+\mathbf y\). Is it a solution to the equation?

    Note that you computed the SVD of \(\mathbf A\) (there called \(\mathbf X\)) in Question 4.

    • Now suppose \(\mathbf y= \left(\begin{array}{c}1\\-1\\1\end{array} \right)\). There is no solution to \(\mathbf A\mathbf x=\mathbf y\) in this case as \(\mathbf y\) is not in the column space of \(\mathbf A\). Prove that \({\tilde{\mathbf x}} = \mathbf A^+\mathbf y\) solves the least squares problem \[\tilde{\mathbf x} = \arg\min_{\mathbf x}||\mathbf y- \mathbf A\mathbf x||_2.\]

    Hint: You can either do this directly, or show that \(\tilde{\mathbf x}\) coincides with the least squares solution \((\mathbf A^\top \mathbf A)^{-1}\mathbf A^\top \mathbf y\).
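    A sketch of both computations in R, with \(\mathbf A^+\) built from the compact SVD as above:

    ```r
    A <- matrix(c(1, 1,
                  0, 1,
                  1, 0), nrow = 3, byrow = TRUE)
    s <- svd(A)
    Aplus <- s$v %*% diag(1 / s$d) %*% t(s$u)

    y1 <- c(2, 1, 1)
    x1 <- Aplus %*% y1
    A %*% x1                          # equals y1: this overdetermined system is consistent

    y2 <- c(1, -1, 1)
    x2 <- Aplus %*% y2
    A %*% x2                          # not y2: no exact solution exists here
    solve(t(A) %*% A, t(A) %*% y2)    # the least squares solution; matches x2
    ```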

  6. Consider the system \[\mathbf B\mathbf x= \mathbf y\mbox{ with }\mathbf B=\left(\begin{array}{ccc}1&0&1\\1&1&0\end{array} \right),\mathbf y= \left(\begin{array}{c}1\\1\end{array} \right).\] This is an underdetermined system, as there are 2 equations in 3 unknowns, and so there are infinitely many solutions for \(\mathbf x\) in this case.

    • Find the full SVD for \(\mathbf B=\mathbf U\boldsymbol{\Sigma}\mathbf V^\top\) (noting that \(\mathbf B=\mathbf A^\top\) for \(\mathbf A\) from the previous question).
    • Compute \(\tilde{\mathbf x}=\mathbf B^+\mathbf y\), check that it is a solution to the equation, and explain why \[\tilde{\mathbf x}= \sum_{i=1}^r \mathbf v_i \frac{\mathbf u_i^\top \mathbf y}{\sigma_i}\] in general, where \(r\leq \min(n,p)\) is the rank of \(\mathbf B\). Write out \(\tilde{\mathbf x}\) explicitly in this form for the given \(\mathbf B\).
    • Consider \(\mathbf x\) of the form \[\mathbf x= \tilde{\mathbf x} + \sum_{i=r+1}^p \alpha_i \mathbf v_i\] and explain why any \(\mathbf x\) of this form is also a solution to \(\mathbf B\mathbf x=\mathbf y\). Hence write out all possible solutions of the equation.
    • Prove that \(\tilde{\mathbf x}\) is the solution with minimum norm, i.e., \(||\tilde{\mathbf x}||_2 \leq ||\mathbf x||_2\) for every solution \(\mathbf x\). Hint: \(\mathbf v_1, \ldots, \mathbf v_p\) form a complete orthonormal basis for \(\mathbb{R}^p\).
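    A numerical companion in R (the exercise still asks for the full SVD by hand):

    ```r
    B <- matrix(c(1, 0, 1,
                  1, 1, 0), nrow = 2, byrow = TRUE)
    y <- c(1, 1)

    s <- svd(B)                       # the compact SVD suffices for B^+
    Bplus <- s$v %*% diag(1 / s$d) %*% t(s$u)
    xtilde <- Bplus %*% y
    B %*% xtilde                      # recovers y, so xtilde is a solution

    # Adding any null-space vector of B leaves B x unchanged
    v3 <- c(1, -1, -1) / sqrt(3)      # spans the null space of B here
    x_other <- xtilde + 2 * v3
    B %*% x_other                     # still y
    c(sqrt(sum(xtilde^2)), sqrt(sum(x_other^2)))   # xtilde has the smaller norm
    ```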
  7. Prove Proposition 3.4, namely that the eigenvalues of projection matrices are either 0 or 1. Show that \(\mathbf{1}_n\) is an eigenvector of \(\mathbf H\). What is the corresponding eigenvalue? What are the remaining eigenvalues equal to?
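    As a numerical illustration, the sketch below assumes \(\mathbf H\) is the centering matrix \(\mathbf H = \mathbf I_n - \frac{1}{n}\mathbf{1}_n\mathbf{1}_n^\top\); substitute your own computation if \(\mathbf H\) is defined differently in the notes:

    ```r
    # Assumes H is the centering matrix I_n - (1/n) 1_n 1_n^T;
    # adjust if H is defined differently in your notes
    n <- 5
    H <- diag(n) - matrix(1 / n, n, n)

    round(eigen(H)$values, 10)    # eigenvalues: 1 (repeated n - 1 times) and 0
    ones <- rep(1, n)
    H %*% ones                    # the zero vector, so 1_n is an eigenvector of H
    ```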