# Linear Algebra II

April, 2021

An introduction to intermediate-level topics in Linear Algebra through a series of probing questions and answers. Specifically, the following topics are covered:

• Eigenvalues & Eigenvectors
• Norms, Inner Products and Projections
• Spectral & Singular Value Decomposition Theorems
• Applications of SVD and Best Fit Subspaces

All materials, including videos, notes, and assignments, are provided here. The textbooks used are also openly available. The course was designed for computer science and electronics undergraduate engineering students at IIIT Hyderabad.

## Prerequisites

Assumes that Linear Algebra I has been covered. Specifically, assumes knowledge of the following topics:

• Solutions to Linear Equations, Echelon Forms, Gaussian Elimination
• Field and Vector Space Definitions; Real, Complex, and Finite Field based Vector Spaces
• Linear Transformation, Matrix, Rank, Inverse, Transpose
• Change of Basis and its Effect on the Matrix of a Linear Transformation
• Determinants

## Schedule

### Meetings

##### Live Lectures

From 30th March to 4th May 2021, on Tuesdays, Thursdays, and Saturdays:

• 10-11 AM for Batch 2
• 12-01 PM for Batch 1

##### Tutorials

Weekly tutorials are conducted by TAs in smaller groups at the scheduled times.

##### Office Hours

I will be available on Teams during the timings given below for clearing any doubts. You can send me a direct message to join.

• Monday, 5:30-7:30 PM
• Thursday, 3:30-5:30 PM

Students are expected to spend at least 12 hours per week, roughly:

• 3+1 hrs attending lectures and tutorials
• 4 hrs reading textbooks, references, etc.
• 4 hrs solving assignments, quizzes, etc.

### Evaluations

• 4 Light Quizzes (Weekly)
• 3 Assignments
• 2 Deep Quizzes (17th April, 4th May)

## Textbook and References

### Textbook

The resources provided are licensed under Creative Commons/Open Licences and are hence downloadable.

## Lecture Topics

### 1. Eigenvalues & Diagonalization

#### 1.1 Linear Algebra & Random Walks

Recalling Basics | Function Spaces | Random Walk on Graphs

• Notes
• Section 2.9 in [WKN]
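The random walk picture above can be sketched numerically. A minimal example, assuming numpy is available (the triangle graph here is my own choice, not from the notes): the walk's distribution after $t$ steps is $p_0 P^t$, and on a regular graph it approaches the uniform distribution.

```python
import numpy as np

# Random walk on a small undirected graph (the triangle K3),
# encoded by its adjacency matrix.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)

# Transition matrix: from each vertex, move to a uniformly random neighbour.
P = A / A.sum(axis=1, keepdims=True)

# Distribution after t steps is p0 @ P^t (row-vector convention).
p0 = np.array([1.0, 0.0, 0.0])            # start at vertex 0
p10 = p0 @ np.linalg.matrix_power(P, 10)  # approaches the uniform distribution
```

Since $K_3$ is regular, `p10` is already within about $10^{-3}$ of $(1/3, 1/3, 1/3)$; the convergence rate is governed by the second eigenvalue of $P$, which connects this topic to the next lecture.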

#### 1.2 Eigenvectors and Eigenvalues

Definitions | Characteristic Polynomial | Examples

• Notes
• Section 3.3 for Eigenvalues, Section 2.9 for Random Walks on Graphs in [WKN].
• Chapter 12 in [CDTW].
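A small sketch tying the definitions together, with a matrix of my own choosing: the eigenvalues found as roots of the characteristic polynomial agree with numpy's eigensolver, and each pair satisfies the defining equation $Mv = \lambda v$.

```python
import numpy as np

# Eigenvalues of a 2x2 matrix, found two ways.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Via the characteristic polynomial det(M - lambda I) = lambda^2 - 4 lambda + 3.
coeffs = np.poly(M)                   # [1, -4, 3]
roots = np.sort(np.roots(coeffs))     # roots 1 and 3

# Via numpy's eigensolver; eigenvectors come back as the columns of V.
vals, V = np.linalg.eig(M)

# Check the defining equation M v = lambda v for each eigenpair.
for lam, v in zip(vals, V.T):
    assert np.allclose(M @ v, lam * v)
```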

#### 1.3 Diagonalization

Eigenvector Basis and Powering | Multiplicities

• Notes
• Section 3.3 in [WKN]
• Chapter 13 in [CDTW]
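The point of diagonalization for powering can be checked numerically. A sketch with a made-up diagonalizable matrix: if $M = VDV^{-1}$ then $M^k = VD^kV^{-1}$, so powering $M$ reduces to powering a diagonal matrix.

```python
import numpy as np

M = np.array([[4.0, 1.0],
              [2.0, 3.0]])            # eigenvalues 5 and 2
vals, V = np.linalg.eig(M)
D = np.diag(vals)

# Diagonalization: M = V D V^{-1}.
assert np.allclose(M, V @ D @ np.linalg.inv(V))

# Powering via the eigenvector basis: M^k = V D^k V^{-1}.
k = 5
Mk = V @ np.diag(vals**k) @ np.linalg.inv(V)
assert np.allclose(Mk, np.linalg.matrix_power(M, k))
```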

#### Assignment 1

Submit by 13th April

• Notes
• Practice Problems:
• Show that for any matrix $M \in \mathbb R^{n \times n}$ with eigenvalue $\lambda \in \mathbb R$, $$\text{geometric_multiplicity}(\lambda, M) = \text{geometric_multiplicity}(\lambda, M^t).$$
• Let $M$ be block diagonal with blocks $M_1,\ldots,M_k$ (all square matrices). Show that: $$\text{geometric_multiplicity}(\lambda, M) = \sum_{i=1}^k \text{geometric_multiplicity}(\lambda, M_i).$$
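A numeric sanity check for the first practice problem (this is not a proof, and the example matrix is my own): the geometric multiplicity of $\lambda$ is $n - \text{rank}(M - \lambda I)$, and since $\text{rank}(A) = \text{rank}(A^t)$, the values for $M$ and $M^t$ must agree.

```python
import numpy as np

def geom_mult(lam, M):
    # Geometric multiplicity = dim of the eigenspace = n - rank(M - lam I).
    n = M.shape[0]
    return n - np.linalg.matrix_rank(M - lam * np.eye(n))

M = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 2.0]])       # eigenvalue 2, geometric multiplicity 2

assert geom_mult(2.0, M) == geom_mult(2.0, M.T) == 2
```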

### 2. Norms & Inner Products

#### 2.1 Norms & Inner Products

Jordan Form | Norms | Distance | Inner Product | Complex Case
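A quick sketch of the real-case definitions, using vectors of my own choosing: the Euclidean norm comes from the inner product via $\|u\| = \sqrt{\langle u, u\rangle}$, and distance is the norm of the difference.

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([0.0, 1.0])

inner = u @ v                         # standard inner product <u, v>
norm_u = np.sqrt(u @ u)               # ||u|| = sqrt(<u, u>) = 5
dist = np.linalg.norm(u - v)          # d(u, v) = ||u - v||

assert np.isclose(norm_u, np.linalg.norm(u))
```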

#### 2.2 Orthonormal Vectors

Orthogonal & Orthonormal Vectors | Gram-Schmidt Orthogonalization
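The Gram-Schmidt procedure can be sketched in a few lines. This is the classical version written for clarity, with an input of my own choosing; numerically, a QR factorization (`np.linalg.qr`) is the preferred way to orthonormalize.

```python
import numpy as np

def gram_schmidt(A):
    # Orthonormalize the columns of A, one at a time.
    Q = []
    for a in A.T:
        v = a.astype(float)
        for q in Q:                   # subtract projections onto earlier q's
            v = v - (q @ a) * q
        Q.append(v / np.linalg.norm(v))
    return np.column_stack(Q)

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q = gram_schmidt(A)
assert np.allclose(Q.T @ Q, np.eye(2))   # columns are orthonormal
```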

#### 2.3 Projection and Orthogonal Complement

Subspace Projections | Orthogonal Complements | Fitting with Errors

• Notes
• Section 14.6 in [CDTW]
• Section 8.1 in [KTW]
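The projection picture above can be sketched concretely, with a subspace of my own choosing: the orthogonal projection onto the column space of $A$ is $P = A(A^tA)^{-1}A^t$, and $I - P$ projects onto the orthogonal complement.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])            # spans a 2-dimensional subspace of R^3
P = A @ np.linalg.inv(A.T @ A) @ A.T  # projection onto col(A)

v = np.array([1.0, 2.0, 3.0])
v_par = P @ v                         # component inside the subspace
v_perp = v - v_par                    # component in the orthogonal complement

assert np.allclose(P @ P, P)          # projections are idempotent
assert np.allclose(A.T @ v_perp, 0)   # v_perp is orthogonal to the subspace
```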

#### 2.4 Least Squares Fitting

Best fit vector on a subspace | Least Squares Fitting Equation
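A small least-squares sketch with made-up data points: fitting a line $y = c_0 + c_1 x$ by solving the normal equations $A^tAc = A^ty$, checked against numpy's built-in solver.

```python
import numpy as np

# Made-up data points, roughly on a line.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.8])
A = np.column_stack([np.ones_like(x), x])   # columns: [1, x]

# Normal equations: A^T A c = A^T y.
c_normal = np.linalg.solve(A.T @ A, A.T @ y)

# numpy's least-squares solver should agree.
c_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)
assert np.allclose(c_normal, c_lstsq)
```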

#### Assignment 2

Submit by 26th April

### 3. Spectral & Singular Value Decomposition Theorems

#### 3.1 Symmetric Matrices and Properties

Eigenvalues and eigenvectors of Symmetric Matrices | Spectral Theorem

• Notes
• Chapter 15 in [CDTW]
• Solve
• Question 3, 5 in Review Problems 15.1 in [CDTW]
• If $P$ is the change of basis matrix for changing coordinates from standard basis to another orthonormal basis, then columns of $P$ are orthonormal.
• If columns of $P$ are orthonormal then rows of $P$ are also orthonormal.
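The claims above can be checked numerically on a symmetric matrix of my own choosing: `np.linalg.eigh` returns real eigenvalues and an orthonormal eigenvector basis, and orthonormal columns of $P$ ($P^tP = I$) do imply orthonormal rows ($PP^t = I$).

```python
import numpy as np

S = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])       # real symmetric
vals, P = np.linalg.eigh(S)           # columns of P are orthonormal eigenvectors

assert np.allclose(vals.imag, 0)               # eigenvalues are real
assert np.allclose(P.T @ P, np.eye(3))         # orthonormal columns...
assert np.allclose(P @ P.T, np.eye(3))         # ...imply orthonormal rows
```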

#### 3.2 Spectral Decomposition Theorem

Spectral Theorem | Spectral Decomposition

• Notes
• Section 10.3 in [WKN]
• Chapter 15 in [CDTW]
• Solve
• Let $v_1,\ldots, v_n$ be a basis for an $n$ dimensional vector space $V$ over some field $\mathbb F$ and let $M,M' \in \mathbb F^{n\times n}$ be matrices. Show that if $$\forall i \in \{1,\ldots, n\},\qquad Mv_i = M'v_i$$ then $M=M'$.
• Consider the $n$ dimensional complex vector space $\mathbb C^n$. Is $\mathbb R^n$ a subspace of $\mathbb C^n$?
• We know that $\mathbb R^n$ is a subset of $\mathbb C^n$. Suppose $v_1,\ldots, v_n \in \mathbb R^n$ are orthonormal vectors. Can they form a basis for $\mathbb C^n$?
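The spectral decomposition itself is easy to verify numerically, on a symmetric matrix of my own choosing: $S = \sum_i \lambda_i v_i v_i^t$ over the orthonormal eigenpairs.

```python
import numpy as np

S = np.array([[3.0, 1.0],
              [1.0, 3.0]])
vals, V = np.linalg.eigh(S)           # orthonormal eigenvectors in columns of V

# Spectral decomposition: S = sum_i lambda_i * v_i v_i^T.
S_rebuilt = sum(lam * np.outer(v, v) for lam, v in zip(vals, V.T))
assert np.allclose(S, S_rebuilt)
```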

#### 3.3 Singular Value Decomposition

Spectral Theorem for Complex Spaces | Singular Value Decomposition

• Notes
• Section 17.2 in [CDTW]
• Section 8.6 in [WKN]
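A quick SVD sketch with a matrix of my own choosing: any real matrix factors as $M = U\,\text{diag}(s)\,V^t$ with orthonormal columns in $U$ and $V$ and nonnegative singular values sorted in descending order.

```python
import numpy as np

M = np.array([[3.0, 0.0],
              [4.0, 5.0],
              [0.0, 1.0]])            # a non-square example
U, s, Vt = np.linalg.svd(M, full_matrices=False)

assert np.allclose(M, U @ np.diag(s) @ Vt)     # reconstruction
assert np.all(s >= 0) and np.all(s[:-1] >= s[1:])  # sorted, nonnegative
assert np.allclose(U.T @ U, np.eye(2))         # orthonormal columns of U
assert np.allclose(Vt @ Vt.T, np.eye(2))       # orthonormal rows of V^T
```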

### 4. Applications

#### 4.1 SVD & Applications

Principal Component Analysis | Applications in Data Science

• Notes
• Section 8.11 in [WKN]
• Code
• Explore
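One way to explore PCA via SVD, on synthetic data of my own making: center the data matrix, take its SVD, and read off the first principal direction as the top right singular vector. The data below is stretched along $(1,1)$, so that is the direction PCA should recover (up to sign).

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2-D data, stretched along the direction (1, 1).
X = rng.normal(size=(200, 2)) @ np.array([[2.0, 1.8],
                                          [1.8, 2.0]])

Xc = X - X.mean(axis=0)               # center the data first
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Vt[0]                           # first principal direction (unit vector)

# pc1 should be close to (1, 1)/sqrt(2), up to sign.
```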

#### Assignment 3

See Section 6 in Problems. Submit by 7th May.

#### 4.2 SVD & Best Fit Subspaces

• Notes
• Suppose $W$ is a $k$ dimensional subspace of $\mathbb R^n$ and $v_1,\ldots, v_{k-1} \in \mathbb R^n$ are orthonormal vectors (they may or may not be in $W$). Show that there exists a nonzero vector $w\in W$ such that $w \in \text{span}(v_1,\ldots, v_{k-1})^\perp$. That is, $$\exists w \in W \cap \text{span}(v_1,\ldots, v_{k-1})^\perp \text{ such that } w \neq 0.$$
• Suppose $M \in \mathbb R^{n \times n}$ is invertible with singular value decomposition: $$M = \sum_{i=1}^n s_i u_iv_i^T \qquad \text{ where } s_i \in \mathbb R^{+}, u_i,v_i \in \mathbb{R}^{n \times 1}.$$ Let $M' = \sum_{i=1}^n s^{-1}_i v_iu_i^T$. Show that $M' = M^{-1}$.
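A numeric check of the second problem (again not a proof; the random matrix is my own choice): inverting each singular value and swapping the roles of $u_i$ and $v_i$ does invert the matrix, since $M' = \sum_i s_i^{-1} v_i u_i^T = V\,\text{diag}(s)^{-1}\,U^T$.

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(4, 4))           # a random matrix is almost surely invertible
U, s, Vt = np.linalg.svd(M)           # M = U diag(s) V^T

# M' = sum_i s_i^{-1} v_i u_i^T = V diag(1/s) U^T.
M_prime = Vt.T @ np.diag(1.0 / s) @ U.T

assert np.allclose(M_prime @ M, np.eye(4))
assert np.allclose(M_prime, np.linalg.inv(M))
```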