🧮 Matrix Operations
Matrix Addition & Scalar Multiplication
Matrices of the same size can be added entrywise:
$$(A + B)_{ij} = a_{ij} + b_{ij}$$
Scalar multiplication scales every entry:
$$(cA)_{ij} = c \, a_{ij}$$
- Commutative: $A + B = B + A$
- Associative: $(A + B) + C = A + (B + C)$
- Distributive: $c(A + B) = cA + cB$
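These properties can be spot-checked numerically; a minimal NumPy sketch (the matrices and scalar are arbitrary choices for illustration):

```python
import numpy as np

# Arbitrary same-size matrices and a scalar, chosen only for illustration
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
C = np.array([[0.0, 1.0], [1.0, 0.0]])
c = 3.0

assert np.allclose(A + B, B + A)                # commutative
assert np.allclose((A + B) + C, A + (B + C))    # associative
assert np.allclose(c * (A + B), c * A + c * B)  # distributive
```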
Matrix Multiplication
For matrices $A_{m \times n}$ and $B_{n \times p}$, the product $AB$ is $m \times p$, with entries:
$$(AB)_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj}$$
Example:
$$\begin{bmatrix} 2 & 3 \\ 1 & 4 \end{bmatrix} \begin{bmatrix} 5 & 0 \\ 2 & 1 \end{bmatrix} = \begin{bmatrix} 2(5) + 3(2) & 2(0) + 3(1) \\ 1(5) + 4(2) & 1(0) + 4(1) \end{bmatrix} = \begin{bmatrix} 16 & 3 \\ 13 & 4 \end{bmatrix}$$
- Associative: $(AB)C = A(BC)$
- Distributive: $A(B + C) = AB + AC$
- Not commutative: $AB \neq BA$ in general
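A quick NumPy check using the matrices from the worked example above confirms the product and shows that reversing the factors changes the result:

```python
import numpy as np

# The two matrices from the worked example above
A = np.array([[2, 3], [1, 4]])
B = np.array([[5, 0], [2, 1]])

AB = A @ B
BA = B @ A
assert np.array_equal(AB, [[16, 3], [13, 4]])  # matches the hand computation
assert not np.array_equal(AB, BA)              # AB != BA: not commutative
```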
Matrix Transpose
The transpose $A^T$ is obtained by reflecting across the main diagonal:
Example:
$$\text{If } A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}, \text{ then } A^T = \begin{bmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{bmatrix}$$
Key Properties:
- $(A^T)^T = A$
- $(A + B)^T = A^T + B^T$
- $(cA)^T = cA^T$
- $(AB)^T = B^T A^T$ (reverse order!)
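The transpose rules, including the reverse-order rule, can be verified in NumPy; the second matrix below is an arbitrary choice with compatible dimensions:

```python
import numpy as np

A = np.array([[1, 2, 3], [4, 5, 6]])   # the 2x3 matrix from the example
B = np.array([[1, 0], [2, 1], [0, 3]]) # arbitrary 3x2 matrix so AB is defined
c = 4

assert np.array_equal(A.T.T, A)              # (A^T)^T = A
assert np.array_equal((c * A).T, c * A.T)    # (cA)^T = c A^T
assert np.array_equal((A @ B).T, B.T @ A.T)  # reverse-order rule for products
```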
♻️ Matrix Inverse & Elementary Matrices
Matrix Inverse
An $n \times n$ matrix $A$ is invertible (or nonsingular) if there exists a matrix $A^{-1}$ such that:
$$A A^{-1} = A^{-1} A = I_n$$
Computing $2 \times 2$ Inverses
For a $2 \times 2$ matrix:
$$A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$$
The inverse is:
$$A^{-1} = \frac{1}{ad - bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$$
provided $\det(A) = ad - bc \neq 0$.
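The $2 \times 2$ formula translates directly into code; a minimal sketch (the helper name `inverse_2x2` is our own):

```python
import numpy as np

def inverse_2x2(M):
    """Inverse of a 2x2 matrix via the adjugate formula; raises if singular."""
    a, b = M[0]
    c, d = M[1]
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is not invertible (det = 0)")
    return np.array([[d, -b], [-c, a]]) / det

A = np.array([[2.0, 1.0], [3.0, 2.0]])
assert np.allclose(inverse_2x2(A) @ A, np.eye(2))  # A^{-1} A = I
assert np.allclose(A @ inverse_2x2(A), np.eye(2))  # A A^{-1} = I
```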
General Inverse Algorithm
To find $A^{-1}$ for any invertible matrix, form the augmented matrix $[A \mid I]$ and row reduce until the left block becomes the identity; the right block is then the inverse: $[A \mid I] \rightsquigarrow [I \mid A^{-1}]$.
Example Process:
$$\left[\begin{array}{cc|cc} 2 & 1 & 1 & 0 \\ 3 & 2 & 0 & 1 \end{array}\right] \rightsquigarrow \left[\begin{array}{cc|cc} 1 & 0 & 2 & -1 \\ 0 & 1 & -3 & 2 \end{array}\right]$$
Therefore: $A^{-1} = \begin{bmatrix} 2 & -1 \\ -3 & 2 \end{bmatrix}$
Inverse Properties
- Uniqueness: If $A$ is invertible, $A^{-1}$ is unique
- Product rule: $(AB)^{-1} = B^{-1}A^{-1}$ (reverse order!)
- Transpose rule: $(A^T)^{-1} = (A^{-1})^T$
- Scalar rule: $(cA)^{-1} = \frac{1}{c}A^{-1}$ for $c \neq 0$
- Self-inverse: $(A^{-1})^{-1} = A$
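The $[A \mid I] \rightsquigarrow [I \mid A^{-1}]$ algorithm above can be sketched in NumPy; partial pivoting is added for numerical stability, and the function name is our own:

```python
import numpy as np

def inverse_gauss_jordan(A):
    """Row reduce [A | I] to [I | A^{-1}] (Gauss-Jordan with partial pivoting)."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for col in range(n):
        pivot = col + np.argmax(np.abs(M[col:, col]))
        if np.isclose(M[pivot, col], 0.0):
            raise ValueError("matrix is singular")
        M[[col, pivot]] = M[[pivot, col]]   # swap in the largest available pivot
        M[col] /= M[col, col]               # scale pivot row so the pivot is 1
        for r in range(n):
            if r != col:
                M[r] -= M[r, col] * M[col]  # clear the rest of the column
    return M[:, n:]

A = np.array([[2.0, 1.0], [3.0, 2.0]])      # the matrix from the example above
A_inv = inverse_gauss_jordan(A)
assert np.allclose(A_inv, [[2.0, -1.0], [-3.0, 2.0]])
```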
Elementary Matrices
Three Types of Elementary Matrices
Type 1: Row Swap $R_i \leftrightarrow R_j$
$$E_1 = \begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad \text{(swaps rows 1 and 2)}$$
Type 2: Row Scale $R_i \to cR_i$
$$E_2 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & c & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad \text{(scales row 2 by } c \neq 0\text{)}$$
Type 3: Row Addition $R_i \to R_i + cR_j$
$$E_3 = \begin{bmatrix} 1 & 0 & 0 \\ c & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad \text{(adds } c \times \text{row 1 to row 2)}$$
Elementary Matrix Properties
- Left multiplication: $EA$ performs the row operation on $A$
- All are invertible: Each elementary matrix has an inverse
- Inverse operations:
- $E_1^{-1} = E_1$ (swap is its own inverse)
- $E_2^{-1}$ scales by $\frac{1}{c}$
- $E_3^{-1}$ adds $-c$ times the row
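These properties are easy to check concretely; a NumPy sketch with a swap matrix and a row-addition matrix (the test matrix and the constant $c = 2$ are arbitrary):

```python
import numpy as np

A = np.arange(1.0, 10.0).reshape(3, 3)              # arbitrary 3x3 test matrix

E1 = np.array([[0.0, 1, 0], [1, 0, 0], [0, 0, 1]])  # Type 1: swap rows 1 and 2
E3 = np.eye(3); E3[1, 0] = 2.0                      # Type 3: add 2 * row 1 to row 2

assert np.array_equal(E1 @ A, A[[1, 0, 2]])         # left multiplication = row op
assert np.allclose(E1 @ E1, np.eye(3))              # swap is its own inverse

E3_inv = np.eye(3); E3_inv[1, 0] = -2.0             # inverse adds -c times the row
assert np.allclose(E3 @ E3_inv, np.eye(3))
```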
🔄 The Invertible Matrix Theorem
🎯 The Invertible Matrix Theorem (IMT)
For an $n \times n$ matrix $A$, the following statements are equivalent:
Invertibility & Row Reduction
- $A$ is an invertible matrix
- $A$ is row equivalent to $I_n$
- $A$ has $n$ pivot positions
- The equation $A\mathbf{x} = \mathbf{0}$ has only the trivial solution
Column Properties
- The columns of $A$ are linearly independent
- The columns of $A$ span $\mathbb{R}^n$
- $\text{Col}(A) = \mathbb{R}^n$
- The columns of $A$ form a basis for $\mathbb{R}^n$
Linear Transformation
- The transformation $T(\mathbf{x}) = A\mathbf{x}$ is one-to-one
- The transformation $T(\mathbf{x}) = A\mathbf{x}$ is onto $\mathbb{R}^n$
- The equation $A\mathbf{x} = \mathbf{b}$ has exactly one solution for each $\mathbf{b} \in \mathbb{R}^n$
Rank & Null Space
- $\text{rank}(A) = n$
- $\dim(\text{Nul}(A)) = 0$
- $\text{Nul}(A) = \{\mathbf{0}\}$
- $\det(A) \neq 0$
Transpose & Eigenvalues
- $A^T$ is an invertible matrix
- All eigenvalues of $A$ are nonzero
- $0$ is not an eigenvalue of $A$
Use the most convenient condition for each problem context!
Application Example:
To prove a $3 \times 3$ matrix $A$ is invertible, show ANY of these:
- $\text{rank}(A) = 3$
- $\det(A) \neq 0$
- The columns of $A$ are linearly independent
- $A\mathbf{x} = \mathbf{0}$ has only the trivial solution
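With NumPy, two of the quickest IMT conditions to test are rank and determinant; a sketch on a hypothetical $3 \times 3$ matrix of our choosing:

```python
import numpy as np

# Hypothetical 3x3 matrix to test (any matrix could be substituted)
A = np.array([[2.0, 0, 1], [1, 3, 0], [0, 1, 4]])

assert np.linalg.matrix_rank(A) == 3        # n pivot positions: full rank
assert not np.isclose(np.linalg.det(A), 0)  # nonzero determinant
```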
🧭 Subspaces of $\mathbb{R}^n$
Subspace Definition
A subset $H \subseteq \mathbb{R}^n$ is a subspace if it satisfies three axioms:
The Three Subspace Axioms
- Zero vector: $\mathbf{0} \in H$
- Closed under addition: if $\mathbf{u}, \mathbf{v} \in H$, then $\mathbf{u} + \mathbf{v} \in H$
- Closed under scalar multiplication: if $\mathbf{u} \in H$ and $c \in \mathbb{R}$, then $c\mathbf{u} \in H$
Equivalent Condition
A nonempty subset $H$ is a subspace if and only if:
$$c\mathbf{u} + d\mathbf{v} \in H \quad \text{for all } \mathbf{u}, \mathbf{v} \in H \text{ and all scalars } c, d$$
This single condition captures closure under linear combinations.
Important Subspaces Associated with Matrix $A$
Column Space: $\text{Col}(A)$
- Definition: Set of all linear combinations of columns of $A$
- Geometric meaning: All vectors that $A$ can "reach"
- Subspace of: $\mathbb{R}^m$ when $A$ is $m \times n$
- Connection: Range of the transformation $T(\mathbf{x}) = A\mathbf{x}$
Example:
$$\text{If } A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{bmatrix}, \text{ then } \text{Col}(A) = \text{Span}\left\{\begin{bmatrix} 1 \\ 3 \\ 5 \end{bmatrix}, \begin{bmatrix} 2 \\ 4 \\ 6 \end{bmatrix}\right\}$$
Null Space: $\text{Nul}(A)$
- Definition: Solution set of the homogeneous equation $A\mathbf{x} = \mathbf{0}$
- Geometric meaning: All vectors that $A$ maps to zero
- Subspace of: $\mathbb{R}^n$ when $A$ is $m \times n$
- Connection: Kernel of the transformation $T(\mathbf{x}) = A\mathbf{x}$
Computing Null Space:
Solve $A\mathbf{x} = \mathbf{0}$ by row reduction, then express solution in parametric form.
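Numerically, a robust alternative to hand row reduction is extracting a null-space basis from the SVD; a sketch (the helper name and the rank-1 test matrix are our own choices):

```python
import numpy as np

def null_space_basis(A):
    """Orthonormal basis for Nul(A) via the SVD: rows of Vt whose singular
    value is (numerically) zero span the null space."""
    _, s, Vt = np.linalg.svd(A)
    tol = s.max() * max(A.shape) * np.finfo(float).eps
    rank = int(np.sum(s > tol))
    return Vt[rank:].T                       # columns span the null space

A = np.array([[1.0, 2, 3], [2, 4, 6]])       # rank 1, so dim Nul(A) = 3 - 1 = 2
N = null_space_basis(A)
assert N.shape == (3, 2)                     # two basis vectors, each in R^3
assert np.allclose(A @ N, 0)                 # every basis vector solves Ax = 0
```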
Common Examples of Subspaces
- Trivial subspace: $\{\mathbf{0}\}$
- All of $\mathbb{R}^n$: The entire space
- Lines through origin: $\text{Span}\{\mathbf{v}\}$ for nonzero $\mathbf{v}$
- Planes through origin: $\text{Span}\{\mathbf{u}, \mathbf{v}\}$ for independent $\mathbf{u}, \mathbf{v}$
📏 Bases, Dimension & Rank
Basis Definition
A basis for subspace $H$ is a set $\mathcal{B} = \{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_p\}$ that is linearly independent and spans $H$. Equivalently, a basis is a:
- Minimal spanning set: Can't remove any vector
- Maximal independent set: Can't add any vector from $H$
- Unique representation: Every $\mathbf{v} \in H$ has unique coordinates
Standard Basis for $\mathbb{R}^3$:
$$\mathcal{E} = \left\{\begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}\right\}$$
Dimension
The dimension of subspace $H$, denoted $\dim(H)$, is the number of vectors in any basis for $H$.
- Well-defined: All bases for $H$ have the same number of vectors
- Geometric interpretation: "Degrees of freedom" in the subspace
- Convention: $\dim(\{\mathbf{0}\}) = 0$
- Examples:
- $\dim(\mathbb{R}^n) = n$
- Line through origin: dimension 1
- Plane through origin: dimension 2
Rank of a Matrix
The rank of matrix $A$ is defined as the dimension of its column space: $\text{rank}(A) = \dim(\text{Col}(A))$.
Equivalent definitions:
- Number of pivot columns in any echelon form of $A$
- Maximum number of linearly independent columns
- Maximum number of linearly independent rows
- Dimension of the range of $T(\mathbf{x}) = A\mathbf{x}$
Computing Rank:
Row reduce $A$ and count pivot positions:
$$\begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 7 \\ 1 & 2 & 4 \end{bmatrix} \rightsquigarrow \begin{bmatrix} 1 & 2 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix}$$
Two pivot positions $\Rightarrow \text{rank}(A) = 2$
The Basis Theorem: If $H$ is a $p$-dimensional subspace and $S$ is a set of exactly $p$ vectors in $H$, then:
- Independence Test: If $S$ is linearly independent, then $S$ is a basis for $H$
- Spanning Test: If $\text{Span}(S) = H$, then $S$ is a basis for $H$
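The rank computed by hand in the row-reduction example above can be confirmed with NumPy's built-in rank routine:

```python
import numpy as np

A = np.array([[1.0, 2, 3], [2, 4, 7], [1, 2, 4]])  # the matrix row reduced above
assert np.linalg.matrix_rank(A) == 2               # two pivot positions
```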
🧱 Advanced Results & Applications
Coordinate Systems
Given basis $\mathcal{B} = \{\mathbf{b}_1, \ldots, \mathbf{b}_n\}$ for $\mathbb{R}^n$, every vector $\mathbf{x}$ has the unique representation:
$$\mathbf{x} = c_1 \mathbf{b}_1 + c_2 \mathbf{b}_2 + \cdots + c_n \mathbf{b}_n$$
The coefficients form the coordinate vector:
$$[\mathbf{x}]_{\mathcal{B}} = \begin{bmatrix} c_1 \\ \vdots \\ c_n \end{bmatrix}$$
Change of Basis
If $\mathcal{B} = \{\mathbf{b}_1, \ldots, \mathbf{b}_n\}$ is a basis, then:
$$P_{\mathcal{B}} = \begin{bmatrix} \mathbf{b}_1 & \mathbf{b}_2 & \cdots & \mathbf{b}_n \end{bmatrix}$$
is the change-of-coordinates matrix, and:
$$\mathbf{x} = P_{\mathcal{B}} [\mathbf{x}]_{\mathcal{B}}, \qquad [\mathbf{x}]_{\mathcal{B}} = P_{\mathcal{B}}^{-1} \mathbf{x}$$
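A change of coordinates is a linear solve in practice; a sketch with a hypothetical basis for $\mathbb{R}^2$ of our choosing:

```python
import numpy as np

# Hypothetical basis B for R^2: b1 = (1, 0), b2 = (1, 2), as columns of P_B
P_B = np.array([[1.0, 1.0], [0.0, 2.0]])
x = np.array([3.0, 4.0])

x_B = np.linalg.solve(P_B, x)        # [x]_B = P_B^{-1} x (solve, don't invert)
assert np.allclose(P_B @ x_B, x)     # x = P_B [x]_B
assert np.allclose(x_B, [1.0, 2.0])  # coordinates of x relative to B
```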
Rank Properties
- Rank of transpose: $\text{rank}(A^T) = \text{rank}(A)$
- Row and column rank equality: Number of independent rows = Number of independent columns
- Product inequality: $\text{rank}(AB) \leq \min\{\text{rank}(A), \text{rank}(B)\}$
- Sum inequality: $\text{rank}(A + B) \leq \text{rank}(A) + \text{rank}(B)$
Rank & Invertibility (for $n \times n$ $A$, each condition below is equivalent to invertibility):
- The columns of $A$ form a basis for $\mathbb{R}^n$
- $\text{Col}(A) = \mathbb{R}^n$
- $\dim(\text{Col}(A)) = n$
- $\text{rank}(A) = n$
- $\text{Nul}(A) = \{\mathbf{0}\}$
- $\dim(\text{Nul}(A)) = 0$
Problem-Solving Strategies
- To prove invertibility: Show $\text{rank}(A) = n$ or $\det(A) \neq 0$
- To find bases: Use pivot columns for $\text{Col}(A)$, parametric solutions for $\text{Nul}(A)$
- To compute dimension: Count basis vectors or use rank-nullity theorem
- To verify subspace: Check the three axioms or show as span/null space
🧰 LU Decomposition
LU Factorization Concept
LU decomposition expresses matrix $A$ as a product $A = LU$, where:
- $L$ is unit lower triangular (1's on the diagonal) and records the elimination multipliers
- $U$ is upper triangular, the echelon form produced by elimination
This reorganizes Gaussian elimination for computational efficiency.
Detailed Example: $4 \times 5$ Matrix
Step 1: Initialize
Set $L = I_4$ and $U = A$:
Step 2: First Column
Using pivot $u_{11} = 2$, compute multipliers:
Eliminate below pivot in $U$.
Step 3: Second Column
After first elimination:
Using pivot $u_{22} = 3$:
Step 4: Continue Process
After second elimination, continue with remaining columns using the same multiplier-and-eliminate pattern.
Final LU Factorization
Verification:
Check that $LU = A$ by computing the matrix product.
- Efficient solving: $A\mathbf{x} = \mathbf{b}$ becomes two triangular systems: $L\mathbf{y} = \mathbf{b}$, then $U\mathbf{x} = \mathbf{y}$
- Multiple right-hand sides: Solve many systems with same $A$ efficiently
- Matrix inversion: Use LU to find $A^{-1}$ column by column
- Determinant computation: $\det(A) = \det(L) \cdot \det(U) = \det(U)$
The $L$ matrix "remembers" how we eliminated; the $U$ matrix is the final echelon form.
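The multiplier-and-eliminate pattern can be sketched as a Doolittle-style factorization; this version assumes no pivoting is needed (nonzero pivots), and the function name and $3 \times 3$ test matrix are our own:

```python
import numpy as np

def lu_no_pivot(A):
    """Doolittle LU factorization without pivoting (assumes nonzero pivots)."""
    m, n = A.shape
    L = np.eye(m)
    U = A.astype(float).copy()
    for col in range(min(m, n)):
        for row in range(col + 1, m):
            L[row, col] = U[row, col] / U[col, col]  # L "remembers" the multiplier
            U[row] -= L[row, col] * U[col]           # eliminate below the pivot
    return L, U

A = np.array([[2.0, 4, -1], [4, 9, 1], [-2, -1, 7]])
L, U = lu_no_pivot(A)
assert np.allclose(L @ U, A)                             # A = LU
assert np.allclose(U, np.triu(U))                        # U is upper triangular
assert np.isclose(np.linalg.det(A), np.prod(np.diag(U))) # det(A) = det(U)
```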
✅ Mastery Checklist & Review
Matrix Operations
- Compute $A + B$, $cA$, and $AB$ with dimension checks
- Apply transpose rules: $(AB)^T = B^T A^T$
- Use matrix multiplication for linear combinations
Matrix Inverses
- Find $A^{-1}$ using $[A \mid I] \rightarrow [I \mid A^{-1}]$
- Apply inverse properties: $(AB)^{-1} = B^{-1}A^{-1}$
- Recognize when matrices are not invertible
Invertible Matrix Theorem
- Apply any IMT condition to test invertibility
- Connect linear independence, spanning, and transformations
- Use rank conditions strategically
Subspaces & Bases
- Verify subspace axioms or identify as $\text{Col}/\text{Nul}$
- Find bases using pivot columns or parametric solutions
- Apply rank-nullity theorem for dimension calculations
The Invertible Matrix Theorem connects ALL major concepts. Master the web of relationships!
- Computational fluency: Practice matrix operations until automatic
- Conceptual connections: Link IMT conditions to geometric intuition
- Problem recognition: Identify which tools fit each problem type
- Strategic application: Choose the most efficient IMT condition
- For invertibility questions: Use the most convenient IMT condition (often rank or row reduction)
- For subspace problems: Verify axioms or express as column/null space
- For basis questions: Use pivot columns or parametric vector forms
- For dimension problems: Count basis vectors or apply rank-nullity
- Always verify: Check dimensions match for operations and final answers