3.6 Computer tasks

  1. Finding the eigenvalues and eigenvectors of a matrix is easy in R.
A=matrix(c(3,1,1,6),nrow=2,byrow=TRUE)    # define the 2x2 matrix A
Eig=eigen(A)                     # the eigenvalues and eigenvectors of A
                                   # are stored in the list Eig
lambda=Eig$values                # extract the eigenvalues from Eig and
                                   # store them in the vector lambda
lambda                           # the eigenvalues, in descending order
## [1] 6.302776 2.697224
Q=Eig$vectors                    # extract the eigenvectors from Eig and
                                   # store them in the columns of Q

The spectral decomposition of \(\mathbf A\) is \[\mathbf A= \mathbf Q\boldsymbol \Lambda\mathbf Q^\top,\] where \(\boldsymbol \Lambda\) is the diagonal matrix of eigenvalues. Let’s check this in R (noting, as always, that there may be small numerical errors):

Q%*%diag(lambda)%*%t(Q)          # reconstruct A; t(Q) gives the transpose of Q
##      [,1] [,2]
## [1,]    3    1
## [2,]    1    6

Since \(\mathbf A\) is positive definite (both eigenvalues are positive), we can calculate the symmetric, positive definite square root of \(\mathbf A\).

Asqrt=Q%*%diag(lambda^0.5)%*%t(Q)  # lambda^0.5 holds the square roots of the eigenvalues
Asqrt%*%Asqrt                      # squaring Asqrt recovers A
##      [,1] [,2]
## [1,]    3    1
## [2,]    1    6
  • Instead of using the full eigendecomposition of \(\mathbf A\), try truncating it and using just a single eigenvalue and eigenvector, i.e., compute \[\mathbf A' = \lambda_1 \mathbf q_1 \mathbf q_1^\top.\]
  • Compute the difference between \(\mathbf A\) and \(\mathbf A'\) using the 2-norm and the Frobenius norm (a sketch follows below).
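
A sketch of one way to do this, reusing lambda and Q from the code above:

Aprime=lambda[1]*Q[,1]%*%t(Q[,1])  # rank-1 truncation of the eigendecomposition
norm(A-Aprime,type="2")            # 2-norm of the error; this equals lambda[2]
norm(A-Aprime,type="F")            # Frobenius norm of the error
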
  2. The singular value decomposition can be computed in R using the command svd. Let \(\mathbf X\) be the four numerical variables in the iris dataset with the column means removed:
n=150
H=diag(rep(1,n))-rep(1,n)%*%t(rep(1,n))/n   # calculate the centering matrix H
X=H%*% as.matrix(iris[,1:4])
# This can also be done using the command
# sweep(iris[,1:4], 2, colMeans(iris[,1:4]))  # do you understand why?
  • Compute the SVD of \(\mathbf X\) in R and report its singular values.

  • Does R report the full or compact SVD?

  • Check that \(\mathbf X\mathbf v_i = \sigma_i \mathbf u_i\) for each singular triple.

  • Compute the best rank-1, rank-2, and rank-3 approximations to \(\mathbf X\), and report the 2-norm and Frobenius norm of the approximation error in each case.

  • Compute the eigenvalues of \(\mathbf X^\top \mathbf X\). How do these relate to the singular values? How does \(\mathbf X^\top \mathbf X\) relate to the sample covariance matrix of the iris data? How do the singular values relate to the eigenvalues of the covariance matrix?

  • Let \(\mathbf S\) be the sample covariance matrix of the iris dataset. What vector \(\mathbf x\) with \(||\mathbf x||=1\) maximizes \(\mathbf x^\top \mathbf S\mathbf x\)? (A sketch covering these items follows the list.)
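
A sketch of one possible approach, reusing the centered matrix X from above; the rank-1 approximation generalizes to ranks 2 and 3 by keeping more terms:

X.svd=svd(X)                       # R returns the compact SVD here: u is 150x4,
                                   # d has length 4 and v is 4x4
X.svd$d                            # the singular values
max(abs(X%*%X.svd$v[,1]-X.svd$d[1]*X.svd$u[,1]))  # check X v1 = sigma1 u1
X1=X.svd$d[1]*X.svd$u[,1]%*%t(X.svd$v[,1])        # best rank-1 approximation
norm(X-X1,type="2")                # the error's 2-norm is the 2nd singular value
norm(X-X1,type="F")                # Frobenius norm of the error
eigen(t(X)%*%X)$values             # the squared singular values; note that
                                   # t(X)%*%X = (n-1)*S, with S the covariance
S=cov(iris[,1:4])                  # sample covariance matrix of the iris data
eigen(S)$vectors[,1]               # the unit vector maximizing x'Sx is the
                                   # leading eigenvector of S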

  3. Choose a few images from the USC-SIPI Image Database and repeat the image compression example from the notes. Which types of images do you think compress well? (A sketch of the rank-k reconstruction step is given below.)
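
The details depend on how you read the images into R; as a minimal sketch, suppose a chosen image has already been loaded as a numeric matrix img (the name img and the rank k=20 below are placeholders, not part of the notes):

img.svd=svd(img)                   # SVD of the image matrix
k=20                               # number of singular values to keep
imgk=img.svd$u[,1:k]%*%diag(img.svd$d[1:k])%*%t(img.svd$v[,1:k])
                                   # rank-k approximation of the image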

  4. We won’t discuss how the SVD is computed in practice, but there are a variety of approaches that can be used. Try the following iterative approach for computing the leading pair of singular vectors:

X <- as.matrix(iris[,1:4])         # note: X is not centered here
v <- rnorm(dim(X)[2])              # random starting vector of length 4

for (iter in 1:500){
  u <- X %*% v                     # update u given v ...
  u <- u/sqrt(sum(u^2))            # ... and normalize to unit length
  v <- t(X) %*% u                  # update v given u ...
  v <- v/sqrt(sum(v^2))            # ... and normalize to unit length
}
X.svd <- svd(X)
X.svd$v[,1]/v                      # compare with the first singular vectors;
X.svd$u[,1]/u                      # the ratios should all be +1 or all -1, since
                                   # singular vectors are only defined up to sign
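
To see why this works, substitute one update into the other: each pass of the loop multiplies v by \(\mathbf X^\top \mathbf X\) (up to normalization), so this is the power method applied to \(\mathbf X^\top \mathbf X\), and v converges to its leading eigenvector, i.e., the first right singular vector of \(\mathbf X\); similarly, u converges to the first left singular vector. A quick check of this claim:

eigen(t(X)%*%X)$vectors[,1]/v      # again, the ratios should all be +1 or all -1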