📚 Chapter 2: Matrix Algebra

A comprehensive mastery guide for undergraduate students covering essential matrix operations, the Invertible Matrix Theorem, subspaces, bases, dimension, and computational methods.

🧮 Matrix Operations

Matrix Addition & Scalar Multiplication

Matrices of the same size can be added entrywise:

$$(A + B)_{ij} = a_{ij} + b_{ij}$$

Scalar multiplication scales every entry:

$$(cA)_{ij} = c \cdot a_{ij}$$
  • Commutative: $A + B = B + A$
  • Associative: $(A + B) + C = A + (B + C)$
  • Distributive: $c(A + B) = cA + cB$

Matrix Multiplication

For matrices $A_{m \times n}$ and $B_{n \times p}$, the product $AB$ is $m \times p$:

$$(AB)_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj} = a_{i1}b_{1j} + a_{i2}b_{2j} + \cdots + a_{in}b_{nj}$$
Example:
$$\begin{bmatrix} 2 & 3 \\ 1 & 4 \end{bmatrix} \begin{bmatrix} 5 & 0 \\ 2 & 1 \end{bmatrix} = \begin{bmatrix} 2(5) + 3(2) & 2(0) + 3(1) \\ 1(5) + 4(2) & 1(0) + 4(1) \end{bmatrix} = \begin{bmatrix} 16 & 3 \\ 13 & 4 \end{bmatrix}$$
  • Associative: $(AB)C = A(BC)$
  • Distributive: $A(B + C) = AB + AC$
  • Not commutative: $AB \neq BA$ in general
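The sum formula for $(AB)_{ij}$ translates directly into code. Below is a minimal plain-Python sketch (the helper name `mat_mul` is ours, not from the text), checked against the worked example above; the second product also shows that $AB \neq BA$:

```python
def mat_mul(A, B):
    """(AB)_ij = sum over k of a_ik * b_kj; A is m x n, B is n x p."""
    n = len(B)
    assert all(len(row) == n for row in A), "inner dimensions must match"
    return [[sum(A[i][k] * B[k][j] for k in range(n))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[2, 3], [1, 4]]
B = [[5, 0], [2, 1]]
print(mat_mul(A, B))   # [[16, 3], [13, 4]] -- matches the worked example
print(mat_mul(B, A))   # [[10, 15], [5, 10]] -- AB != BA
```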

Matrix Transpose

The transpose $A^T$ is obtained by reflecting across the main diagonal:

$$(A^T)_{ij} = a_{ji}$$
Example:
$$\text{If } A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}, \text{ then } A^T = \begin{bmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{bmatrix}$$

Key Properties:

  • $(A^T)^T = A$
  • $(A + B)^T = A^T + B^T$
  • $(cA)^T = cA^T$
  • $(AB)^T = B^T A^T$ (reverse order!)
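These rules are easy to check in code; `transpose` below is an illustrative one-liner using Python's `zip`, applied to the matrix from the example above:

```python
def transpose(A):
    """(A^T)_ij = a_ji: zip(*A) pairs up the i-th entries of every row."""
    return [list(col) for col in zip(*A)]

A = [[1, 2, 3], [4, 5, 6]]
At = transpose(A)
print(At)                   # [[1, 4], [2, 5], [3, 6]]
print(transpose(At) == A)   # (A^T)^T = A -> True
```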
🎯 Matrix as Linear Combination of Columns For matrix $A = [\mathbf{a}_1 \; \mathbf{a}_2 \; \cdots \; \mathbf{a}_n]$ and vector $\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}$: $$A\mathbf{x} = x_1\mathbf{a}_1 + x_2\mathbf{a}_2 + \cdots + x_n\mathbf{a}_n$$

♻️ Matrix Inverse & Elementary Matrices

Matrix Inverse

An $n \times n$ matrix $A$ is invertible (or nonsingular) if there exists a matrix $A^{-1}$ such that:

$$AA^{-1} = A^{-1}A = I_n$$

Computing $2 \times 2$ Inverses

For a $2 \times 2$ matrix:

$$A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$$

The inverse is:

$$A^{-1} = \frac{1}{ad - bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$$

provided $\det(A) = ad - bc \neq 0$.
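The adjugate formula is a two-line computation. Below is an illustrative sketch (`inv2` is our name, not a standard function), applied to the matrix $\begin{bmatrix} 2 & 1 \\ 3 & 2 \end{bmatrix}$, whose determinant is $4 - 3 = 1$:

```python
def inv2(A):
    """Inverse of a 2x2 matrix [[a, b], [c, d]] via the adjugate formula."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("det(A) = 0: matrix is singular")
    return [[d / det, -b / det], [-c / det, a / det]]

print(inv2([[2, 1], [3, 2]]))   # [[2.0, -1.0], [-3.0, 2.0]]
```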

General Inverse Algorithm

To find $A^{-1}$ for any invertible matrix:

Step 1: Form the augmented matrix $[A \mid I_n]$
Step 2: Row reduce to $[I_n \mid A^{-1}]$
Step 3: If the left block cannot be reduced to $I_n$ (fewer than $n$ pivots appear), then $A$ is not invertible
Example Process:
$$\left[\begin{array}{cc|cc} 2 & 1 & 1 & 0 \\ 3 & 2 & 0 & 1 \end{array}\right] \rightsquigarrow \left[\begin{array}{cc|cc} 1 & 0 & 2 & -1 \\ 0 & 1 & -3 & 2 \end{array}\right]$$ Therefore: $A^{-1} = \begin{bmatrix} 2 & -1 \\ -3 & 2 \end{bmatrix}$
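The three steps can be sketched in plain Python (`inverse` is an illustrative helper; it uses an exact-zero pivot test and no partial pivoting, so it is a teaching aid rather than a numerically robust routine):

```python
def inverse(A):
    """Gauss-Jordan: row reduce [A | I] to [I | A^-1]."""
    n = len(A)
    # Step 1: form the augmented matrix [A | I]
    M = [[float(x) for x in A[i]] + [1.0 if j == i else 0.0 for j in range(n)]
         for i in range(n)]
    for col in range(n):
        # Step 2: find a usable pivot and swap it into place
        pivot = next((r for r in range(col, n) if M[r][col] != 0), None)
        if pivot is None:
            raise ValueError("A is not invertible")   # Step 3: reduction fails
        M[col], M[pivot] = M[pivot], M[col]
        p = M[col][col]
        M[col] = [x / p for x in M[col]]              # scale pivot row to 1
        for r in range(n):
            if r != col:
                f = M[r][col]                         # eliminate above and below
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [row[n:] for row in M]                     # right half is A^-1

print(inverse([[2, 1], [3, 2]]))   # [[2.0, -1.0], [-3.0, 2.0]]
```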

Inverse Properties

  • Uniqueness: If $A$ is invertible, $A^{-1}$ is unique
  • Product rule: $(AB)^{-1} = B^{-1}A^{-1}$ (reverse order!)
  • Transpose rule: $(A^T)^{-1} = (A^{-1})^T$
  • Scalar rule: $(cA)^{-1} = \frac{1}{c}A^{-1}$ for $c \neq 0$
  • Self-inverse: $(A^{-1})^{-1} = A$

Elementary Matrices

Elementary Matrix Definition

An elementary matrix $E$ is obtained by performing exactly one elementary row operation on the identity matrix $I_n$.

Three Types of Elementary Matrices

Type 1: Row Swap $R_i \leftrightarrow R_j$
$$E_1 = \begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad \text{(swaps rows 1 and 2)}$$
Type 2: Row Scale $R_i \to cR_i$
$$E_2 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & c & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad \text{(scales row 2 by } c \neq 0\text{)}$$
Type 3: Row Addition $R_i \to R_i + cR_j$
$$E_3 = \begin{bmatrix} 1 & 0 & 0 \\ c & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad \text{(adds } c \times \text{row 1 to row 2)}$$

Elementary Matrix Properties

  • Left multiplication: $EA$ performs the row operation on $A$
  • All are invertible: Each elementary matrix has an inverse
  • Inverse operations:
    • $E_1^{-1} = E_1$ (swap is its own inverse)
    • $E_2^{-1}$ scales by $\frac{1}{c}$
    • $E_3^{-1}$ adds $-c$ times the row
🎯 Fundamental Connection Theorem

If $A$ is invertible, then $A$ can be written as a product of elementary matrices:
$$A = E_k E_{k-1} \cdots E_2 E_1$$
This means: $A^{-1} = E_1^{-1} E_2^{-1} \cdots E_{k-1}^{-1} E_k^{-1}$
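The left-multiplication property is easy to confirm numerically. Below, `elem_add` and `mat_mul` are illustrative helpers (our names, not from the text); the Type 3 matrix adds 2 times row 1 to row 2, and its inverse with $-c$ undoes it:

```python
def elem_add(n, i, j, c):
    """Type 3 elementary matrix: R_i -> R_i + c*R_j on I_n (0-indexed)."""
    E = [[1.0 if r == s else 0.0 for s in range(n)] for r in range(n)]
    E[i][j] = c
    return E

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
E = elem_add(3, 1, 0, 2.0)       # adds 2 * (row 1) to (row 2)
print(mat_mul(E, A))             # second row becomes [5.0, 8.0]
Einv = elem_add(3, 1, 0, -2.0)   # the inverse operation
print(mat_mul(Einv, mat_mul(E, A)) == A)   # True
```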

🔄 The Invertible Matrix Theorem

🎯 The Invertible Matrix Theorem (IMT)

For an $n \times n$ matrix $A$, the following statements are equivalent:

Invertibility & Row Reduction
  • $A$ is an invertible matrix
  • $A$ is row equivalent to $I_n$
  • $A$ has $n$ pivot positions
  • The equation $A\mathbf{x} = \mathbf{0}$ has only the trivial solution
Column Properties
  • The columns of $A$ are linearly independent
  • The columns of $A$ span $\mathbb{R}^n$
  • $\text{Col}(A) = \mathbb{R}^n$
  • The columns of $A$ form a basis for $\mathbb{R}^n$
Linear Transformation
  • The transformation $T(\mathbf{x}) = A\mathbf{x}$ is one-to-one
  • The transformation $T(\mathbf{x}) = A\mathbf{x}$ is onto $\mathbb{R}^n$
  • The equation $A\mathbf{x} = \mathbf{b}$ has exactly one solution for each $\mathbf{b} \in \mathbb{R}^n$
Rank & Null Space
  • $\text{rank}(A) = n$
  • $\dim(\text{Nul}(A)) = 0$
  • $\text{Nul}(A) = \{\mathbf{0}\}$
  • $\det(A) \neq 0$
Transpose & Eigenvalues
  • $A^T$ is an invertible matrix
  • All eigenvalues of $A$ are nonzero
  • $0$ is not an eigenvalue of $A$
🧠 IMT Strategic Implications

$$\boxed{\text{Invertible} \iff \text{Full Rank} \iff \text{Linearly Independent Columns} \iff \text{Bijective Transformation}}$$

Use the most convenient condition for each problem context!

Application Example:

To prove a $3 \times 3$ matrix $A$ is invertible, show ANY of these:

$$\text{Row reduce } A \to I_3 \quad \text{OR} \quad A\mathbf{x} = \mathbf{0} \text{ only when } \mathbf{x} = \mathbf{0} \quad \text{OR} \quad \det(A) \neq 0$$

🧭 Subspaces of $\mathbb{R}^n$

Subspace Definition

A subset $H \subseteq \mathbb{R}^n$ is a subspace if it satisfies three axioms:

The Three Subspace Axioms

Axiom 1: Zero vector inclusion $$\mathbf{0} \in H$$
Axiom 2: Closure under addition $$\text{If } \mathbf{u}, \mathbf{v} \in H, \text{ then } \mathbf{u} + \mathbf{v} \in H$$
Axiom 3: Closure under scalar multiplication $$\text{If } \mathbf{u} \in H \text{ and } c \in \mathbb{R}, \text{ then } c\mathbf{u} \in H$$

Equivalent Condition

A subset $H$ is a subspace if and only if:

$$\text{For all } \mathbf{u}, \mathbf{v} \in H \text{ and scalars } c, d: \quad c\mathbf{u} + d\mathbf{v} \in H$$

This single condition captures closure under linear combinations.

Important Subspaces Associated with Matrix $A$

Column Space: $\text{Col}(A)$

$$\text{Col}(A) = \text{Span}\{\mathbf{a}_1, \mathbf{a}_2, \ldots, \mathbf{a}_n\}$$
  • Definition: Set of all linear combinations of columns of $A$
  • Geometric meaning: All vectors that $A$ can "reach"
  • Subspace of: $\mathbb{R}^m$ when $A$ is $m \times n$
  • Connection: Range of the transformation $T(\mathbf{x}) = A\mathbf{x}$
Example:
$$\text{If } A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{bmatrix}, \text{ then } \text{Col}(A) = \text{Span}\left\{\begin{bmatrix} 1 \\ 3 \\ 5 \end{bmatrix}, \begin{bmatrix} 2 \\ 4 \\ 6 \end{bmatrix}\right\}$$

Null Space: $\text{Nul}(A)$

$$\text{Nul}(A) = \{\mathbf{x} \in \mathbb{R}^n : A\mathbf{x} = \mathbf{0}\}$$
  • Definition: Solution set of the homogeneous equation $A\mathbf{x} = \mathbf{0}$
  • Geometric meaning: All vectors that $A$ maps to zero
  • Subspace of: $\mathbb{R}^n$ when $A$ is $m \times n$
  • Connection: Kernel of the transformation $T(\mathbf{x}) = A\mathbf{x}$
Computing Null Space:

Solve $A\mathbf{x} = \mathbf{0}$ by row reduction, then express solution in parametric form.
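That recipe can be sketched in plain Python; `rref` and `null_space_basis` are our illustrative names, not library functions. Each free variable yields one basis vector: set that free variable to 1, the others to 0, and read the pivot-variable values off the RREF. On the $3 \times 3$ matrix used later in the rank example, the single free variable $x_2$ gives the basis vector $(-2, 1, 0)$:

```python
def rref(A):
    """Reduced row echelon form; returns (matrix, pivot column indices)."""
    M = [[float(x) for x in row] for row in A]
    rows, cols = len(M), len(M[0])
    pivots, r = [], 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if abs(M[i][c]) > 1e-12), None)
        if pivot is None:
            continue                     # no pivot in this column
        M[r], M[pivot] = M[pivot], M[r]
        M[r] = [x / M[r][c] for x in M[r]]
        for i in range(rows):
            if i != r:
                f = M[i][c]
                M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
    return M, pivots

def null_space_basis(A):
    """One basis vector per free variable, in parametric form."""
    M, pivots = rref(A)
    cols = len(A[0])
    basis = []
    for f in (c for c in range(cols) if c not in pivots):
        v = [0.0] * cols
        v[f] = 1.0                       # this free variable set to 1
        for r, p in enumerate(pivots):
            v[p] = -M[r][f]              # pivot variables read off the RREF
        basis.append(v)
    return basis

A = [[1, 2, 3], [2, 4, 7], [1, 2, 4]]
print(null_space_basis(A) == [[-2.0, 1.0, 0.0]])   # True
```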

🎯 Pivot Column Basis Theorem

If $A$ is row reduced to echelon form, then the pivot columns of the original matrix $A$ (not of the reduced form) form a basis for $\text{Col}(A)$:
$$\text{Basis for } \text{Col}(A) = \{\text{pivot columns of } A\}$$
⚠️ Common Subspace Examples
  • Trivial subspace: $\{\mathbf{0}\}$
  • All of $\mathbb{R}^n$: The entire space
  • Lines through origin: $\text{Span}\{\mathbf{v}\}$ for nonzero $\mathbf{v}$
  • Planes through origin: $\text{Span}\{\mathbf{u}, \mathbf{v}\}$ for independent $\mathbf{u}, \mathbf{v}$

📏 Bases, Dimension & Rank

Basis Definition

A basis for subspace $H$ is a set $\mathcal{B} = \{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_p\}$ that satisfies:

$$\mathcal{B} \text{ is linearly independent} \quad \text{AND} \quad \text{Span}(\mathcal{B}) = H$$
  • Minimal spanning set: removing any vector destroys the spanning property
  • Maximal independent set: adding any other vector of $H$ destroys independence
  • Unique representation: every $\mathbf{v} \in H$ has exactly one coordinate vector relative to $\mathcal{B}$
Standard Basis for $\mathbb{R}^3$:
$$\mathcal{E} = \left\{\begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}\right\}$$

Dimension

The dimension of subspace $H$, denoted $\dim(H)$, is:

$$\dim(H) = \text{number of vectors in any basis for } H$$
  • Well-defined: All bases for $H$ have the same number of vectors
  • Geometric interpretation: "Degrees of freedom" in the subspace
  • Convention: $\dim(\{\mathbf{0}\}) = 0$
  • Examples:
    • $\dim(\mathbb{R}^n) = n$
    • Line through origin: dimension 1
    • Plane through origin: dimension 2

Rank of a Matrix

The rank of matrix $A$ is defined as:

$$\text{rank}(A) = \dim(\text{Col}(A))$$

Equivalent definitions:

  • Number of pivot columns in any echelon form of $A$
  • Maximum number of linearly independent columns
  • Maximum number of linearly independent rows
  • Dimension of the range of $T(\mathbf{x}) = A\mathbf{x}$
Computing Rank:

Row reduce $A$ and count pivot positions:

$$\begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 7 \\ 1 & 2 & 4 \end{bmatrix} \rightsquigarrow \begin{bmatrix} 1 & 2 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix}$$ Two pivot positions $\Rightarrow \text{rank}(A) = 2$
🌟 The Rank-Nullity Theorem

For any $m \times n$ matrix $A$:
$$\boxed{\text{rank}(A) + \dim(\text{Nul}(A)) = n}$$
$$\text{(dimension of image)} + \text{(dimension of kernel)} = \text{(dimension of domain)}$$
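Rank can be computed exactly as described: row reduce and count pivots. Below is a sketch (`rank` is our illustrative helper) run on the matrix from the example above, with the nullity recovered by rank-nullity:

```python
def rank(A):
    """rank(A) = number of pivots found by forward elimination."""
    M = [[float(x) for x in row] for row in A]
    rows, cols = len(M), len(M[0])
    r = 0                                    # next pivot row
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if abs(M[i][c]) > 1e-12), None)
        if pivot is None:
            continue                         # no pivot in this column
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        r += 1
    return r

A = [[1, 2, 3], [2, 4, 7], [1, 2, 4]]
print(rank(A))               # 2
print(len(A[0]) - rank(A))   # 1 = dim Nul(A), by rank-nullity
```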
🎯 Basis Tests for Fixed Dimension

Let $H$ be a $p$-dimensional subspace, and let $S = \{\mathbf{v}_1, \ldots, \mathbf{v}_p\}$ be a set of $p$ vectors in $H$. Then:
  • Independence Test: If $S$ is linearly independent, then $S$ is a basis for $H$
  • Spanning Test: If $\text{Span}(S) = H$, then $S$ is a basis for $H$
This eliminates the need to verify both conditions separately!

🧱 Advanced Results & Applications

Coordinate Systems

Given basis $\mathcal{B} = \{\mathbf{b}_1, \ldots, \mathbf{b}_n\}$ for $\mathbb{R}^n$, every vector $\mathbf{x}$ has unique representation:

$$\mathbf{x} = c_1\mathbf{b}_1 + c_2\mathbf{b}_2 + \cdots + c_n\mathbf{b}_n$$

The coefficients form the coordinate vector:

$$[\mathbf{x}]_{\mathcal{B}} = \begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{bmatrix}$$

Change of Basis

If $\mathcal{B} = \{\mathbf{b}_1, \ldots, \mathbf{b}_n\}$ is a basis, then:

$$P_{\mathcal{B}} = [\mathbf{b}_1 \; \mathbf{b}_2 \; \cdots \; \mathbf{b}_n]$$

is the change-of-coordinates matrix, and:

$$\mathbf{x} = P_{\mathcal{B}}[\mathbf{x}]_{\mathcal{B}} \quad \text{and} \quad [\mathbf{x}]_{\mathcal{B}} = P_{\mathcal{B}}^{-1}\mathbf{x}$$
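For $\mathbb{R}^2$ this is a two-line computation; `coords_2d` is an illustrative helper that inverts $P_{\mathcal{B}}$ with the $2 \times 2$ adjugate formula, and the basis in the example is our own:

```python
def coords_2d(b1, b2, x):
    """[x]_B = P_B^{-1} x for a basis {b1, b2} of R^2."""
    a, c = b1            # P_B = [[a, b], [c, d]]: basis vectors are the columns
    b, d = b2
    det = a * d - b * c
    if det == 0:
        raise ValueError("b1, b2 are dependent: not a basis")
    return [(d * x[0] - b * x[1]) / det,
            (-c * x[0] + a * x[1]) / det]

# x = (3, 5) equals -2*b1 + 5*b2 for b1 = (1, 0), b2 = (1, 1):
print(coords_2d([1, 0], [1, 1], [3, 5]))   # [-2.0, 5.0]
```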

Rank Properties

  • Rank of transpose: $\text{rank}(A^T) = \text{rank}(A)$
  • Row and column rank equality: Number of independent rows = Number of independent columns
  • Product inequality: $\text{rank}(AB) \leq \min\{\text{rank}(A), \text{rank}(B)\}$
  • Sum inequality: $\text{rank}(A + B) \leq \text{rank}(A) + \text{rank}(B)$
🎯 Extended Invertible Matrix Theorem

For an $n \times n$ matrix $A$, the following rank and dimension conditions are equivalent to the conditions of the IMT above:
  • The columns of $A$ form a basis for $\mathbb{R}^n$
  • $\text{Col}(A) = \mathbb{R}^n$
  • $\dim(\text{Col}(A)) = n$
  • $\text{rank}(A) = n$
  • $\text{Nul}(A) = \{\mathbf{0}\}$
  • $\dim(\text{Nul}(A)) = 0$
Practical Strategy Guide
  • To prove invertibility: Show $\text{rank}(A) = n$ or $\det(A) \neq 0$
  • To find bases: Use pivot columns for $\text{Col}(A)$, parametric solutions for $\text{Nul}(A)$
  • To compute dimension: Count basis vectors or use rank-nullity theorem
  • To verify subspace: Check the three axioms or show as span/null space

🧰 LU Decomposition

LU Factorization Concept

LU decomposition expresses matrix $A$ as a product $A = LU$, where:

$$L = \text{lower triangular with 1's on diagonal}, \quad U = \text{upper triangular}$$

This records the steps of Gaussian elimination so they can be reused efficiently.

Detailed Example: $4 \times 5$ Matrix

$$A = \begin{bmatrix} 2 & 4 & -1 & 5 & -2 \\ -4 & -5 & 3 & -8 & 1 \\ 2 & -5 & -4 & 1 & 8 \\ -6 & 0 & 7 & -3 & 1 \end{bmatrix}$$

Step 1: Initialize

Set $L = I_4$ and $U = A$:

$$L = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

Step 2: First Column

Using pivot $u_{11} = 2$, compute multipliers:

$$\ell_{21} = \frac{-4}{2} = -2, \quad \ell_{31} = \frac{2}{2} = 1, \quad \ell_{41} = \frac{-6}{2} = -3$$

Eliminate below pivot in $U$.

Step 3: Second Column

After first elimination:

$$U = \begin{bmatrix} 2 & 4 & -1 & 5 & -2 \\ 0 & 3 & 1 & 2 & -3 \\ 0 & -9 & -3 & -4 & 10 \\ 0 & 12 & 4 & 12 & -5 \end{bmatrix}$$

Using pivot $u_{22} = 3$:

$$\ell_{32} = \frac{-9}{3} = -3, \quad \ell_{42} = \frac{12}{3} = 4$$

Step 4: Third Pivot

After the second elimination:

$$U = \begin{bmatrix} 2 & 4 & -1 & 5 & -2 \\ 0 & 3 & 1 & 2 & -3 \\ 0 & 0 & 0 & 2 & 1 \\ 0 & 0 & 0 & 4 & 7 \end{bmatrix}$$

Column 3 contains no pivot, so the third pivot is $u_{34} = 2$:

$$\ell_{43} = \frac{4}{2} = 2$$

Eliminating below it leaves a single nonzero entry, $5$, in the last row.

Final LU Factorization

$$L = \begin{bmatrix} 1 & 0 & 0 & 0 \\ -2 & 1 & 0 & 0 \\ 1 & -3 & 1 & 0 \\ -3 & 4 & 2 & 1 \end{bmatrix}, \quad U = \begin{bmatrix} 2 & 4 & -1 & 5 & -2 \\ 0 & 3 & 1 & 2 & -3 \\ 0 & 0 & 0 & 2 & 1 \\ 0 & 0 & 0 & 0 & 5 \end{bmatrix}$$
Verification:

Check that $LU = A$ by computing the matrix product.
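The check can be done in plain Python (note the multiplier $\ell_{43} = 2$ in $L$; `mat_mul` is an illustrative helper):

```python
def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

L = [[ 1,  0, 0, 0],
     [-2,  1, 0, 0],
     [ 1, -3, 1, 0],
     [-3,  4, 2, 1]]
U = [[2, 4, -1, 5, -2],
     [0, 3,  1, 2, -3],
     [0, 0,  0, 2,  1],
     [0, 0,  0, 0,  5]]
A = [[ 2,  4, -1,  5, -2],
     [-4, -5,  3, -8,  1],
     [ 2, -5, -4,  1,  8],
     [-6,  0,  7, -3,  1]]
print(mat_mul(L, U) == A)   # True
```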

Why LU Decomposition Matters
  • Efficient solving: $A\mathbf{x} = \mathbf{b}$ becomes two triangular systems: $L\mathbf{y} = \mathbf{b}$, then $U\mathbf{x} = \mathbf{y}$
  • Multiple right-hand sides: Solve many systems with same $A$ efficiently
  • Matrix inversion: Use LU to find $A^{-1}$ column by column
  • Determinant computation: $\det(A) = \det(L) \cdot \det(U) = \det(U)$
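The two triangular solves in the first bullet can be sketched as forward and back substitution. For a square example we use the $2 \times 2$ matrix $A = \begin{bmatrix} 2 & 1 \\ 3 & 2 \end{bmatrix}$ from the inverse section, whose factorization (our computation, multiplier $\ell_{21} = 3/2$) is $L = \begin{bmatrix} 1 & 0 \\ 1.5 & 1 \end{bmatrix}$, $U = \begin{bmatrix} 2 & 1 \\ 0 & 0.5 \end{bmatrix}$:

```python
def forward_sub(L, b):
    """Solve L y = b for lower triangular L with unit diagonal."""
    y = []
    for i, row in enumerate(L):
        y.append(b[i] - sum(row[j] * y[j] for j in range(i)))
    return y

def back_sub(U, y):
    """Solve U x = y for upper triangular U with nonzero diagonal."""
    n = len(U)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(U[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (y[i] - s) / U[i][i]
    return x

L2 = [[1.0, 0.0], [1.5, 1.0]]
U2 = [[2.0, 1.0], [0.0, 0.5]]
b = [1.0, 0.0]
y = forward_sub(L2, b)    # solve L y = b
x = back_sub(U2, y)       # then solve U x = y
print(x)                  # [2.0, -3.0] solves A x = b
```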
🔍 LU Algorithm Summary $$\boxed{\text{Forward elimination with multiplier tracking} \rightarrow L \text{ stores multipliers, } U \text{ stores result}}$$

The $L$ matrix "remembers" how we eliminated; the $U$ matrix is the final echelon form.

Mastery Checklist & Review

🎯 Essential Skills Verification

Matrix Operations

  • Compute $A + B$, $cA$, and $AB$ with dimension checks
  • Apply transpose rules: $(AB)^T = B^T A^T$
  • Use matrix multiplication for linear combinations

Matrix Inverses

  • Find $A^{-1}$ using $[A \mid I] \rightarrow [I \mid A^{-1}]$
  • Apply inverse properties: $(AB)^{-1} = B^{-1}A^{-1}$
  • Recognize when matrices are not invertible

Invertible Matrix Theorem

  • Apply any IMT condition to test invertibility
  • Connect linear independence, spanning, and transformations
  • Use rank conditions strategically

Subspaces & Bases

  • Verify subspace axioms or identify as $\text{Col}/\text{Nul}$
  • Find bases using pivot columns or parametric solutions
  • Apply rank-nullity theorem for dimension calculations
🎓 Master Strategy: The IMT Web
$$\begin{array}{c} \text{Matrix Operations} \\ \downarrow \\ \text{Row Reduction} \rightarrow \boxed{\text{IMT}} \leftarrow \text{Determinants} \\ \downarrow \quad \quad \quad \quad \quad \quad \uparrow \\ \text{Subspaces} \rightarrow \text{Bases \& Dimension} \end{array}$$

The Invertible Matrix Theorem connects ALL major concepts. Master the web of relationships!

🔄 Study Cycle Recommendation
  1. Computational fluency: Practice matrix operations until automatic
  2. Conceptual connections: Link IMT conditions to geometric intuition
  3. Problem recognition: Identify which tools fit each problem type
  4. Strategic application: Choose the most efficient IMT condition
🎯 Problem-Solving Strategy
  • For invertibility questions: Use the most convenient IMT condition (often rank or row reduction)
  • For subspace problems: Verify axioms or express as column/null space
  • For basis questions: Use pivot columns or parametric vector forms
  • For dimension problems: Count basis vectors or apply rank-nullity
  • Always verify: Check dimensions match for operations and final answers