Chapter 4: Vector Spaces
Comprehensive Summary

This chapter introduces the fundamental concept of vector spaces and their subspaces, establishing the abstract framework that unifies many different mathematical structures. Understanding these concepts is crucial for advanced linear algebra and its applications.

Section 4.1: Vector Spaces and Subspaces

Vector Space Definition

A vector space $V$ over a field $\mathbb{F}$ (usually $\mathbb{R}$ or $\mathbb{C}$) is a set equipped with two operations:

  • Vector addition: $\mathbf{u} + \mathbf{v} \in V$
  • Scalar multiplication: $\alpha \mathbf{v} \in V$

satisfying 10 axioms (5 for addition, 5 for scalar multiplication).

Key Properties

Addition Axioms:

  1. Closure: $\mathbf{u} + \mathbf{v} \in V$
  2. Commutativity: $\mathbf{u} + \mathbf{v} = \mathbf{v} + \mathbf{u}$
  3. Associativity: $(\mathbf{u} + \mathbf{v}) + \mathbf{w} = \mathbf{u} + (\mathbf{v} + \mathbf{w})$
  4. Identity: $\exists$ zero vector $\mathbf{0}$ such that $\mathbf{v} + \mathbf{0} = \mathbf{v}$
  5. Inverse: $\exists$ $(-\mathbf{v})$ such that $\mathbf{v} + (-\mathbf{v}) = \mathbf{0}$

Scalar Multiplication Axioms:

  1. Closure: $\alpha \mathbf{v} \in V$ for all scalars $\alpha$
  2. Distributivity: $\alpha(\mathbf{u} + \mathbf{v}) = \alpha\mathbf{u} + \alpha\mathbf{v}$
  3. Distributivity: $(\alpha + \beta)\mathbf{v} = \alpha\mathbf{v} + \beta\mathbf{v}$
  4. Associativity: $\alpha(\beta\mathbf{v}) = (\alpha\beta)\mathbf{v}$
  5. Identity: $1 \cdot \mathbf{v} = \mathbf{v}$

Important Examples

  • $\mathbb{R}^n$: $n$-dimensional Euclidean space
  • $\mathcal{P}_n$: Space of polynomials of degree $\leq n$
  • $\mathbb{M}_{m\times n}$: Space of $m\times n$ matrices
  • $C[a,b]$: Space of continuous functions on $[a,b]$

Subspaces

A subspace $H$ of vector space $V$ is a subset that is itself a vector space under the same operations.

Subspace Test (3 conditions):

  1. $\mathbf{0} \in H$ (contains zero vector)
  2. Closed under addition: $\mathbf{u}, \mathbf{v} \in H \Rightarrow \mathbf{u} + \mathbf{v} \in H$
  3. Closed under scalar multiplication: $\mathbf{v} \in H \Rightarrow \alpha\mathbf{v} \in H$

Key Insight: Conditions 2 and 3 can be combined: $\mathbf{u}, \mathbf{v} \in H$ and $\alpha, \beta$ scalars $\Rightarrow \alpha\mathbf{u} + \beta\mathbf{v} \in H$
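As an illustrative numeric sketch (using NumPy, which is a choice of convenience and not part of the chapter), the combined condition can be spot-checked for the candidate subspace $H = \{(x, y, z) \in \mathbb{R}^3 : x + y + z = 0\}$, a plane through the origin:

```python
import numpy as np

# Candidate subspace: H = {(x, y, z) : x + y + z = 0}
def in_H(v, tol=1e-10):
    return abs(v.sum()) < tol

rng = np.random.default_rng(0)

# Condition 1: the zero vector is in H
assert in_H(np.zeros(3))

# Combined conditions 2 and 3: alpha*u + beta*v stays in H
for _ in range(100):
    # sample u, v in H by subtracting the mean (forces the sum to zero)
    u = rng.standard_normal(3); u -= u.mean()
    v = rng.standard_normal(3); v -= v.mean()
    a, b = rng.standard_normal(2)
    assert in_H(a * u + b * v)
```

A random check like this cannot prove closure, but a single failing sample would disprove it; the actual proof is the one-line algebraic observation that sums of zero-sum vectors are zero-sum.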

Section 4.2: Null Spaces, Column Spaces, Row Spaces, and Linear Transformations

The Four Fundamental Subspaces

1. Column Space (Range)

Definition: $\text{Col}(A) = \{\mathbf{b} \in \mathbb{R}^m : A\mathbf{x} = \mathbf{b} \text{ has a solution}\}$

  • The set of all linear combinations of columns of $A$
  • Represents all "reachable outputs" of the transformation
  • Lives in the output space $\mathbb{R}^m$

Key Property: $\mathbf{b} \in \text{Col}(A) \Leftrightarrow A\mathbf{x} = \mathbf{b}$ is consistent
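This consistency test has a direct computational form: appending $\mathbf{b}$ as an extra column cannot raise the rank exactly when $\mathbf{b}$ is already a combination of the columns of $A$. A sketch (NumPy used for illustration; the example matrix is ours, not the chapter's):

```python
import numpy as np

A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])          # Col(A) is the plane z = x + y in R^3

def in_col(A, b):
    # b in Col(A)  <=>  rank([A | b]) == rank(A)  <=>  Ax = b is consistent
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

b_in  = np.array([2., 3., 5.])   # 2*col1 + 3*col2, so reachable
b_out = np.array([1., 1., 0.])   # violates z = x + y, so unreachable

assert in_col(A, b_in)
assert not in_col(A, b_out)
```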

2. Null Space (Kernel)

Definition: $\text{Null}(A) = \{\mathbf{x} \in \mathbb{R}^n : A\mathbf{x} = \mathbf{0}\}$

  • The set of all solutions to the homogeneous equation $A\mathbf{x} = \mathbf{0}$
  • Represents inputs that are "lost in translation"
  • Lives in the input space $\mathbb{R}^n$

Finding Null Space:

  1. Solve $A\mathbf{x} = \mathbf{0}$ by row reduction
  2. Express solution in parametric vector form
  3. Basis vectors come from the parameters
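The three steps above are exactly what SymPy's `nullspace()` performs (SymPy is an assumption of convenience here, not part of the chapter): it row reduces, writes the solution in parametric form, and returns the vectors attached to the free parameters.

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6]])        # rank 1, so two free variables

basis = A.nullspace()             # parametric basis vectors for Null(A)
assert len(basis) == 2            # nullity = n - rank = 3 - 1

for v in basis:
    assert A * v == sp.zeros(2, 1)   # each basis vector solves Ax = 0
```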

3. Row Space

Definition: $\text{Row}(A) = \text{Col}(A^T)$

  • The span of row vectors of $A$
  • Lives in $\mathbb{R}^n$ (input space)

4. Left Null Space

Definition: $\text{Null}(A^T) = \{\mathbf{y} \in \mathbb{R}^m : A^T\mathbf{y} = \mathbf{0}\}$

  • Lives in $\mathbb{R}^m$ (output space)

Visual Understanding: The Four Subspaces

INPUT SPACE ($\mathbb{R}^n$) $\xrightarrow{A}$ OUTPUT SPACE ($\mathbb{R}^m$)

  • Row Space ("good inputs", dim $=$ rank$(A)$) maps onto the Column Space ("reachable outputs", dim $=$ rank$(A)$)
  • Null Space ("lost in translation", dim $=$ nullity$(A)$) maps to $\mathbf{0}$; the Left Null Space ("unreachable zone", dim $= m - \text{rank}(A)$) is never reached by $A$
  • Orthogonality: Row Space $\perp$ Null Space in $\mathbb{R}^n$; Column Space $\perp$ Left Null Space in $\mathbb{R}^m$
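The dimension counts in this picture can be verified on a concrete matrix (NumPy used for illustration; the matrix is our own example):

```python
import numpy as np

A = np.array([[1., 2., 0.],
              [2., 4., 0.]])       # 2x3 matrix: second row = 2 * first row
m, n = A.shape
r = np.linalg.matrix_rank(A)

assert r == 1            # dim Row(A) = dim Col(A) = rank(A)
assert n - r == 2        # dim Null(A)   = nullity(A) = n - rank(A)
assert m - r == 1        # dim Null(A^T) = m - rank(A)
```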

Linear Transformations

A function $T: V \to W$ is a linear transformation if:

  1. $T(\mathbf{u} + \mathbf{v}) = T(\mathbf{u}) + T(\mathbf{v})$ (preserves addition)
  2. $T(\alpha\mathbf{v}) = \alpha T(\mathbf{v})$ (preserves scalar multiplication)

Matrix Representation: Every linear transformation $T: \mathbb{R}^n \to \mathbb{R}^m$ can be represented as $T(\mathbf{x}) = A\mathbf{x}$ for some matrix $A$.

Key Concepts:

  • Kernel: $\ker(T) = \{\mathbf{v} \in V : T(\mathbf{v}) = \mathbf{0}\} = \text{Null}(A)$
  • Range: $\text{Range}(T) = \{T(\mathbf{v}) : \mathbf{v} \in V\} = \text{Col}(A)$
  • One-to-one: $T$ is injective $\Leftrightarrow \ker(T) = \{\mathbf{0}\} \Leftrightarrow \text{Null}(A) = \{\mathbf{0}\}$
  • Onto: $T$ is surjective $\Leftrightarrow \text{Range}(T) = W \Leftrightarrow \text{Col}(A) = \mathbb{R}^m$
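For $T(\mathbf{x}) = A\mathbf{x}$, both properties reduce to rank checks, sketched here with NumPy (the example matrices are ours):

```python
import numpy as np

def is_one_to_one(A):
    # Null(A) = {0}  <=>  every column is a pivot column  <=>  rank = n
    return np.linalg.matrix_rank(A) == A.shape[1]

def is_onto(A):
    # Col(A) = R^m  <=>  a pivot in every row  <=>  rank = m
    return np.linalg.matrix_rank(A) == A.shape[0]

tall = np.array([[1., 0.], [0., 1.], [1., 1.]])   # 3x2, rank 2
wide = np.array([[1., 0., 1.], [0., 1., 1.]])     # 2x3, rank 2

assert is_one_to_one(tall) and not is_onto(tall)  # injects R^2 into R^3
assert is_onto(wide) and not is_one_to_one(wide)  # collapses R^3 onto R^2
```

This also previews a useful fact: a tall matrix ($m > n$) can never be onto, and a wide matrix ($m < n$) can never be one-to-one.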

Section 4.3: Linearly Independent Sets and Bases

Linear Independence

Vectors $\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_p\}$ are linearly independent if:

The only solution to $c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_p\mathbf{v}_p = \mathbf{0}$ is $c_1 = c_2 = \cdots = c_p = 0$

Conversely, the vectors are linearly dependent if:

$\exists$ a non-trivial solution, i.e., at least one $c_i \neq 0$

Testing Linear Independence

Matrix Method:

  1. Form matrix $A = [\mathbf{v}_1 \; \mathbf{v}_2 \; \cdots \; \mathbf{v}_p]$
  2. Solve $A\mathbf{x} = \mathbf{0}$
  3. Independent $\Leftrightarrow$ only trivial solution $\Leftrightarrow$ all columns are pivot columns
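Step 3 can be phrased as a rank comparison, since every column is a pivot column exactly when the rank equals the number of vectors. A NumPy sketch (our own example vectors):

```python
import numpy as np

def independent(vectors):
    A = np.column_stack(vectors)
    # all columns are pivot columns  <=>  rank(A) = number of vectors
    return np.linalg.matrix_rank(A) == len(vectors)

v1 = np.array([1., 0., 0.])
v2 = np.array([0., 1., 0.])
v3 = np.array([1., 1., 0.])      # v3 = v1 + v2

assert independent([v1, v2])          # two pivot columns
assert not independent([v1, v2, v3])  # a free variable appears
```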

Important Facts:

  • Any set containing the zero vector is linearly dependent
  • Two vectors are dependent $\Leftrightarrow$ one is a scalar multiple of the other
  • If $p > n$, then $p$ vectors in $\mathbb{R}^n$ must be linearly dependent

Basis

A basis for subspace $H$ is a linearly independent set that spans $H$.

Properties:

  • Every vector in $H$ has a unique representation as a linear combination of basis vectors
  • A basis is a minimal spanning set
  • A basis is a maximal independent set

Standard Basis for $\mathbb{R}^n$:

$$\mathbf{e}_1 = \begin{bmatrix} 1 \\ 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}, \quad \mathbf{e}_2 = \begin{bmatrix} 0 \\ 1 \\ 0 \\ \vdots \\ 0 \end{bmatrix}, \quad \ldots, \quad \mathbf{e}_n = \begin{bmatrix} 0 \\ 0 \\ 0 \\ \vdots \\ 1 \end{bmatrix}$$

Finding a Basis

For Column Space:

  1. Row reduce $A$ to echelon form
  2. Pivot columns in original $A$ form a basis for $\text{Col}(A)$

For Null Space:

  1. Solve $A\mathbf{x} = \mathbf{0}$ in parametric form
  2. Vectors multiplying the parameters form a basis for $\text{Null}(A)$

For Row Space:

  1. Row reduce $A$ to echelon form
  2. Non-zero rows of echelon form are a basis for $\text{Row}(A)$
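All three procedures have direct SymPy counterparts (SymPy is an illustrative choice, not part of the chapter): `columnspace()` returns the pivot columns of the original $A$, `nullspace()` the parametric vectors, and `rowspace()` the non-zero echelon rows.

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],         # = 2 * row 1
               [1, 1, 1]])        # rank 2

col_basis  = A.columnspace()      # pivot columns of the original A
null_basis = A.nullspace()        # parametric vectors from Ax = 0
row_basis  = A.rowspace()         # non-zero rows of the echelon form

assert len(col_basis) == len(row_basis) == 2   # dim Col = dim Row = rank
assert len(null_basis) == 3 - 2                # nullity = n - rank
```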

Section 4.4: Coordinate Systems

Coordinates Relative to a Basis

Given basis $\mathcal{B} = \{\mathbf{b}_1, \mathbf{b}_2, \ldots, \mathbf{b}_n\}$ for $V$ and vector $\mathbf{x} \in V$:

$$\mathbf{x} = c_1\mathbf{b}_1 + c_2\mathbf{b}_2 + \cdots + c_n\mathbf{b}_n$$

$\mathcal{B}$-coordinates: $[\mathbf{x}]_{\mathcal{B}} = \begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{bmatrix}$

Key Property: The coordinate mapping $\mathbf{x} \mapsto [\mathbf{x}]_{\mathcal{B}}$ is a one-to-one linear transformation.
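In $\mathbb{R}^n$, finding $[\mathbf{x}]_{\mathcal{B}}$ means solving the linear system $B\mathbf{c} = \mathbf{x}$, where $B$ has the basis vectors as columns. A NumPy sketch with a basis of our own choosing:

```python
import numpy as np

# Basis B = {(1, 1), (-1, 1)... } -- here b1 = (1, 1), b2 = (1, -1)
B = np.column_stack([[1., 1.], [1., -1.]])
x = np.array([3., 1.])

coords = np.linalg.solve(B, x)        # [x]_B: solves c1*b1 + c2*b2 = x
assert np.allclose(coords, [2., 1.])  # x = 2*(1,1) + 1*(1,-1)
assert np.allclose(B @ coords, x)     # reconstruction check
```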

Coordinate Transformation

The mapping establishes an isomorphism between $V$ and $\mathbb{R}^n$, meaning:

  • Operations in $V$ correspond to operations in $\mathbb{R}^n$
  • $[\mathbf{u} + \mathbf{v}]_{\mathcal{B}} = [\mathbf{u}]_{\mathcal{B}} + [\mathbf{v}]_{\mathcal{B}}$
  • $[\alpha\mathbf{u}]_{\mathcal{B}} = \alpha[\mathbf{u}]_{\mathcal{B}}$

Practical Use: Complex problems in abstract spaces can be solved using coordinates in $\mathbb{R}^n$.

Section 4.5: The Dimension of a Vector Space

Dimension

The dimension of vector space $V$, written $\dim(V)$, is:

The number of vectors in any basis for $V$

Fundamental Theorem: All bases for a vector space have the same number of vectors.

Important Dimensions

  • $\dim(\mathbb{R}^n) = n$
  • $\dim(\mathcal{P}_n) = n + 1$ (polynomials of degree $\leq n$)
  • $\dim(\mathbb{M}_{m \times n}) = mn$ ($m \times n$ matrices)
  • $\dim(\{\mathbf{0}\}) = 0$ (trivial space)

Rank-Nullity Theorem

For matrix $A$ ($m \times n$):

$$\text{rank}(A) + \text{nullity}(A) = n$$

Where:

  • $\text{rank}(A) = \dim(\text{Col}(A)) = \dim(\text{Row}(A))$ = number of pivot columns
  • $\text{nullity}(A) = \dim(\text{Null}(A))$ = number of free variables

Interpretation:

Dimension of the input space = dimensions preserved by $A$ (rank) + dimensions collapsed to zero (nullity)
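A quick numerical check (NumPy, our own example), with the nullity read off from the near-zero singular values:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 6))           # a random 4x6 matrix (rank 4 a.s.)
m, n = A.shape

rank = np.linalg.matrix_rank(A)
s = np.linalg.svd(A, compute_uv=False)
nullity = n - np.sum(s > 1e-10)           # free directions sent to zero

assert rank + nullity == n                # rank-nullity theorem
```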

Dimension Properties

For subspace $H$ of finite-dimensional $V$:

  1. $\dim(H) \leq \dim(V)$
  2. If $\dim(H) = \dim(V)$, then $H = V$
  3. Any linearly independent set of $\dim(V)$ vectors is a basis
  4. Any spanning set of $\dim(V)$ vectors is a basis

The Basis Theorem

In $n$-dimensional space $V$:

  • Spanning Set: $p$ vectors that span $V$ with $p > n$ can be reduced to a basis
  • Independent Set: $p$ linearly independent vectors with $p < n$ can be extended to a basis
  • Exactly $n$ vectors: If they span OR are independent, they form a basis

Section 4.6: Change of Basis

Motivation

Same vector, different coordinate systems:

  • Physics: different reference frames
  • Graphics: object coordinates vs. screen coordinates
  • Data analysis: original features vs. principal components

Change-of-Coordinates Matrix

Given two bases $\mathcal{B} = \{\mathbf{b}_1, \ldots, \mathbf{b}_n\}$ and $\mathcal{C} = \{\mathbf{c}_1, \ldots, \mathbf{c}_n\}$ for $V$:

Change-of-coordinates matrix from $\mathcal{B}$ to $\mathcal{C}$:

$\underset{\mathcal{C} \leftarrow \mathcal{B}}{P}$ converts $\mathcal{B}$-coordinates to $\mathcal{C}$-coordinates

$$[\mathbf{x}]_{\mathcal{C}} = \underset{\mathcal{C} \leftarrow \mathcal{B}}{P} \; [\mathbf{x}]_{\mathcal{B}}$$

Constructing the Change-of-Coordinates Matrix

Method:

  1. Express each basis vector of $\mathcal{B}$ in terms of basis $\mathcal{C}$
  2. The $\mathcal{C}$-coordinates of $\mathcal{B}$ vectors form the columns of $\underset{\mathcal{C} \leftarrow \mathcal{B}}{P}$
$$\underset{\mathcal{C} \leftarrow \mathcal{B}}{P} = \begin{bmatrix} [\mathbf{b}_1]_{\mathcal{C}} & [\mathbf{b}_2]_{\mathcal{C}} & \cdots & [\mathbf{b}_n]_{\mathcal{C}} \end{bmatrix}$$
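In $\mathbb{R}^n$, column $j$ of the matrix is $[\mathbf{b}_j]_{\mathcal{C}}$, obtained by solving $C\mathbf{c} = \mathbf{b}_j$; stacking those solves gives $P = C^{-1}B$ in one step. A NumPy sketch with bases of our own choosing:

```python
import numpy as np

B = np.column_stack([[1., 0.], [1., 1.]])   # basis B vectors as columns
C = np.column_stack([[2., 0.], [0., 2.]])   # basis C vectors as columns

P = np.linalg.solve(C, B)        # P_{C<-B}: column j solves C c = b_j

x_B = np.array([3., 4.])         # some B-coordinates
x = B @ x_B                      # the actual vector in standard coordinates
x_C = P @ x_B                    # converted to C-coordinates

assert np.allclose(C @ x_C, x)   # same vector, now expressed in basis C
```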

Properties

  1. Invertibility: $\underset{\mathcal{C} \leftarrow \mathcal{B}}{P}$ is always invertible
  2. Inverse relationship: $\left(\underset{\mathcal{C} \leftarrow \mathcal{B}}{P}\right)^{-1} = \underset{\mathcal{B} \leftarrow \mathcal{C}}{P}$
  3. Composition: $\underset{\mathcal{C} \leftarrow \mathcal{D}}{P} = \underset{\mathcal{C} \leftarrow \mathcal{B}}{P} \cdot \underset{\mathcal{B} \leftarrow \mathcal{D}}{P}$

Special Case: Standard Basis

If $\mathcal{C}$ is the standard basis:

  • $\underset{\mathcal{C} \leftarrow \mathcal{B}}{P}$ simply has the $\mathcal{B}$ basis vectors as columns
  • $[\mathbf{x}]_{\mathcal{C}} = \underset{\mathcal{C} \leftarrow \mathcal{B}}{P} \; [\mathbf{x}]_{\mathcal{B}}$ means: $\mathbf{x} = \underset{\mathcal{C} \leftarrow \mathcal{B}}{P} \; [\mathbf{x}]_{\mathcal{B}}$

Visual Understanding

The same abstract vector $\mathbf{x} \in V$ has two coordinate images: the mapping $\mathbf{x} \mapsto [\mathbf{x}]_{\mathcal{B}}$ lands in one copy of $\mathbb{R}^n$, the mapping $\mathbf{x} \mapsto [\mathbf{x}]_{\mathcal{C}}$ in another, and the change-of-coordinates matrix carries one to the other:

$$[\mathbf{x}]_{\mathcal{B}} \; \xrightarrow{\;\underset{\mathcal{C} \leftarrow \mathcal{B}}{P}\;} \; [\mathbf{x}]_{\mathcal{C}}$$

The change-of-coordinates matrix allows us to "translate" between different coordinate systems while the vector itself remains unchanged in the abstract space.

Key Relationships and Summary

The Big Picture

Vector Space Structure:

Vector Space $V$
↓
Subspaces $H$
↓
Spanning Sets
↓
Bases (minimal spanning, maximal independent)
↓
Dimension (number of basis vectors)
↓
Coordinates (representation relative to a basis)

Essential Theorems

  1. Unique Representation Theorem: Each vector in $V$ has a unique representation as a linear combination of basis vectors.
  2. Basis Theorem: In $n$-dimensional space, any $n$ linearly independent vectors form a basis, and any $n$ vectors that span the space form a basis.
  3. Rank-Nullity Theorem: For $A \in \mathbb{R}^{m \times n}$, $\text{rank}(A) + \text{nullity}(A) = n$
  4. Dimension Theorem: All bases of a vector space have the same number of vectors.
  5. Fundamental Subspace Relations:
    • Row Space $\perp$ Null Space (orthogonal complements in $\mathbb{R}^n$)
    • Column Space $\perp$ Left Null Space (orthogonal complements in $\mathbb{R}^m$)

Computational Workflow

Finding Bases and Dimensions:

  1. For $\text{Col}(A)$: Row reduce, identify pivot columns, use original columns
  2. For $\text{Null}(A)$: Solve $A\mathbf{x} = \mathbf{0}$, parametric vectors form basis
  3. For $\text{Row}(A)$: Row reduce, non-zero rows form basis
  4. Dimension: Count basis vectors

Coordinate Conversions:

  1. Given basis $\mathcal{B}$ and vector $\mathbf{x}$
  2. Row reduce the augmented matrix $[\mathbf{b}_1 \; \mathbf{b}_2 \; \cdots \; \mathbf{b}_n \mid \mathbf{x}]$ to find $[\mathbf{x}]_{\mathcal{B}}$
  3. For change of basis: construct $\underset{\mathcal{C} \leftarrow \mathcal{B}}{P}$ and multiply

Common Mistakes to Avoid

  1. Confusing span and linear independence: A set can span without being independent (redundant vectors), or be independent without spanning (insufficient vectors).
  2. Mixing up input and output spaces: Remember null space and row space live in $\mathbb{R}^n$ (input), while column space and left null space live in $\mathbb{R}^m$ (output).
  3. Using wrong matrix for basis: For $\text{Col}(A)$, use pivot columns from original $A$, not the row-reduced form.
  4. Forgetting uniqueness: Bases are not unique, but dimension is! Different bases can exist for the same space.
  5. Change of basis direction: $\underset{\mathcal{C} \leftarrow \mathcal{B}}{P}$ goes FROM $\mathcal{B}$ TO $\mathcal{C}$, not the other way around.

Study Tips

  1. Master the definitions: Vector space axioms, subspace test, linear independence, basis, dimension—these are the building blocks.
  2. Visualize in ℝ² and ℝ³: Even though concepts extend to ℝⁿ, geometric intuition from 2D and 3D is invaluable.
  3. Practice the algorithms: Finding bases, computing dimensions, and changing coordinates are computational skills that improve with practice.
  4. Understand the relationships: The four fundamental subspaces, rank-nullity theorem, and how bases relate to dimension form a coherent theory.
  5. Connect to applications: Think about how these abstract concepts apply to solving systems of equations, data compression, computer graphics, and differential equations.

Practice Problems for Self-Assessment

  1. Verify that a given set is a subspace using the subspace test
  2. Find bases and dimensions for the four fundamental subspaces of a given matrix
  3. Determine if a set of vectors is linearly independent
  4. Extend a linearly independent set to a basis
  5. Find coordinates of a vector relative to a non-standard basis
  6. Construct a change-of-coordinates matrix between two bases
  7. Use the rank-nullity theorem to find dimensions of related subspaces
  8. Prove that given vectors form a basis for a specified subspace

Final Thought

Vector spaces provide the language and framework for understanding linear algebra abstractly. The concepts in this chapter—subspaces, independence, bases, dimension, and coordinates—are fundamental to virtually all advanced mathematics, science, and engineering applications. Master these ideas, and you'll have a powerful toolkit for tackling complex problems across many domains.