Chapter 1: Linear Equations in Linear Algebra

Complete Study Guide & Review Notes

Chapter Overview

Systems of Linear Equations
Row Reduction
Vector Equations
Matrix Equations
Solution Sets
Linear Independence
Linear Transformations

1.1 Systems of Linear Equations

Linear Equation: a₁x₁ + a₂x₂ + ... + aₙxₙ = b

Key Classifications:

  • Consistent: Has at least one solution
  • Inconsistent: Has no solution
  • Solution types: Unique, None, or Infinitely Many

Augmented Matrix: [A | b] represents the system efficiently for solving.

1.2 Row Reduction & Echelon Forms

Elementary Row Operations:

  • Replace row with sum of itself and multiple of another
  • Interchange two rows
  • Multiply row by nonzero constant

REF: Row Echelon Form - staircase pattern with pivots

RREF: Reduced Row Echelon Form - each pivot is 1 and is the only nonzero entry in its column
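The three row operations above are all it takes to reach RREF. A minimal sketch in Python, using exact `Fraction` arithmetic to avoid rounding (the helper name `rref` is ours, not from the text):

```python
from fractions import Fraction

def rref(M):
    """Return the reduced row echelon form of matrix M (a list of rows)."""
    A = [[Fraction(x) for x in row] for row in M]  # exact arithmetic
    rows, cols = len(A), len(A[0])
    pivot_row = 0
    for col in range(cols):
        # Find a row at or below pivot_row with a nonzero entry in this column.
        pick = next((r for r in range(pivot_row, rows) if A[r][col] != 0), None)
        if pick is None:
            continue  # no pivot in this column
        A[pivot_row], A[pick] = A[pick], A[pivot_row]                 # interchange rows
        A[pivot_row] = [x / A[pivot_row][col] for x in A[pivot_row]]  # scale pivot to 1
        for r in range(rows):                                         # zero out rest of column
            if r != pivot_row and A[r][col] != 0:
                factor = A[r][col]
                A[r] = [a - factor * b for a, b in zip(A[r], A[pivot_row])]
        pivot_row += 1
    return A

# Example: augmented matrix of x + 2y = 5, 3x + 4y = 6
print(rref([[1, 2, 5], [3, 4, 6]]))  # unique solution: x = -4, y = 9/2
```

Reading off the last column of the RREF gives the solution whenever there is a pivot in every variable column.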

1.3 Vector Equations

Vector in ℝⁿ: v = [v₁, v₂, ..., vₙ]ᵀ
Linear Combination: c₁v₁ + c₂v₂ + ... + cₚvₚ

Span{v₁, v₂, ..., vₚ}: Set of all possible linear combinations

Vector Equation: x₁a₁ + x₂a₂ + ... + xₙaₙ = b

1.4 Matrix Equation Ax = b

Ax = x₁a₁ + x₂a₂ + ... + xₙaₙ

Equivalent Statements:

  • Ax = b has a solution
  • b is a linear combination of columns of A
  • b ∈ Span{columns of A}

Column Interpretation: Ax is a linear combination of A's columns with weights x.
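The column interpretation can be computed directly: accumulate each column of A scaled by its weight. A small sketch in plain Python (the helper name `matvec` and the example matrix are ours):

```python
def matvec(A, x):
    """Compute Ax as x1*a1 + ... + xn*an, a weighted sum of A's columns."""
    m, n = len(A), len(A[0])
    b = [0] * m
    for j in range(n):          # for each column a_j ...
        for i in range(m):
            b[i] += x[j] * A[i][j]  # ... add x_j times that column
    return b

A = [[1, 2], [3, 4], [5, 6]]    # 3x2 matrix; columns a1 = (1,3,5), a2 = (2,4,6)
print(matvec(A, [2, -1]))       # 2*[1,3,5] - 1*[2,4,6] = [0, 2, 4]
```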

1.5 Solution Sets of Linear Systems

Parametric Vector Form: x = p + t₁v₁ + t₂v₂ + ... + tₖvₖ

Solution Structure:

  • Homogeneous: Ax = 0 (always consistent)
  • General: x = xₚ + xₕ (particular + homogeneous)
  • Null Space: Solution set of Ax = 0
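The structure x = xₚ + xₕ can be checked numerically. Below is a small example system we chose for illustration: A has rank 1, p is a particular solution, and v spans the solution set of Ax = 0, so every p + t·v solves Ax = b:

```python
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 2], [2, 4]]      # rank 1: the second row is twice the first
b = [3, 6]
p = [3, 0]                # particular solution (free variable x2 set to 0)
v = [-2, 1]               # spans the null space: A v = 0

assert matvec(A, v) == [0, 0]      # v solves the homogeneous system
for t in [-2, 0, 1, 5]:
    x = [pi + t * vi for pi, vi in zip(p, v)]
    assert matvec(A, x) == b       # every p + t*v solves Ax = b
print("general solution verified")
```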

1.7 Linear Independence

c₁v₁ + c₂v₂ + ... + cₚvₚ = 0 has only trivial solution

Linear Independence Test:

Vectors v₁, ..., vₚ are linearly independent ⟺ the matrix [v₁ ... vₚ] has a pivot in every column ⟺ Ax = 0 has no free variables

Key Facts:

  • Set with zero vector is dependent
  • More vectors than entries per vector → dependent
  • Some vector is a linear combination of the others ⟺ dependent
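The pivot test above is mechanical: put the vectors in as columns, row reduce, and count pivots. A self-contained sketch (the helper name `num_pivots` is ours), again using exact `Fraction` arithmetic:

```python
from fractions import Fraction

def num_pivots(M):
    """Count pivots via forward elimination to row echelon form."""
    A = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(A), len(A[0])
    pr = 0
    for col in range(cols):
        pick = next((r for r in range(pr, rows) if A[r][col] != 0), None)
        if pick is None:
            continue                       # no pivot in this column
        A[pr], A[pick] = A[pick], A[pr]
        for r in range(pr + 1, rows):      # eliminate below the pivot
            f = A[r][col] / A[pr][col]
            A[r] = [a - f * b for a, b in zip(A[r], A[pr])]
        pr += 1
    return pr

# Columns of M are the vectors; independent iff pivot in every column.
M = [[1, 2], [0, 1], [1, 3]]           # v1 = (1,0,1), v2 = (2,1,3)
print(num_pivots(M) == len(M[0]))      # True: independent
M = [[1, 2], [2, 4]]                   # v2 = 2*v1
print(num_pivots(M) == len(M[0]))      # False: dependent
```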

1.8 Linear Transformations

T: ℝⁿ → ℝᵐ is linear if:

  • T(u + v) = T(u) + T(v) for all u, v
  • T(cu) = cT(u) for all scalars c

Properties:

  • T(0) = 0
  • T preserves linear combinations
  • Domain: ℝⁿ, Codomain: ℝᵐ

Examples: Rotations, reflections, projections, scaling
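The two conditions can be spot-checked numerically. The sketch below (example maps and the helper `looks_linear` are ours, and checking finitely many vectors is evidence, not proof) contrasts a projection, which is linear, with a translation, which moves the origin and so cannot be:

```python
def proj_x(v):
    """Projection of (x, y) onto the x-axis: linear."""
    return (v[0], 0.0)

def shift(v):
    """Translation by (1, 0): NOT linear, since T(0) != 0."""
    return (v[0] + 1.0, v[1])

def looks_linear(T, samples):
    """Spot-check additivity and homogeneity on pairs of sample vectors."""
    for (u1, u2), (v1, v2) in zip(samples, reversed(samples)):
        add = T((u1 + v1, u2 + v2)) == tuple(
            a + b for a, b in zip(T((u1, u2)), T((v1, v2))))
        hom = T((3 * u1, 3 * u2)) == tuple(3 * t for t in T((u1, u2)))
        if not (add and hom):
            return False
    return True

pts = [(1.0, 2.0), (-3.0, 0.5), (0.0, 0.0)]
print(looks_linear(proj_x, pts))  # True
print(looks_linear(shift, pts))   # False: shift(0, 0) = (1, 0), not (0, 0)
```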

1.9 Matrix of a Linear Transformation

Standard Matrix: [T] = [T(e₁) T(e₂) ... T(eₙ)]

Fundamental Theorem:

Every linear transformation T: ℝⁿ → ℝᵐ can be written as T(x) = Ax for a unique m×n matrix A.

Construction: Columns of A are images of standard basis vectors.

Composition: (S ∘ T)(x) = S(T(x)) corresponds to matrix multiplication.
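The construction and the composition fact can both be demonstrated in a few lines. A sketch using a 90° rotation (helper names `standard_matrix`, `rot90`, `matmul` are ours): the standard matrix is assembled column by column from T(eⱼ), and composing the rotation with itself matches the matrix product A·A:

```python
def standard_matrix(T, n):
    """Build [T] whose columns are T(e1), ..., T(en)."""
    cols = [T([1 if i == j else 0 for i in range(n)]) for j in range(n)]
    return [[cols[j][i] for j in range(n)] for i in range(len(cols[0]))]

def rot90(v):
    """Counterclockwise rotation by 90 degrees in R^2."""
    return [-v[1], v[0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = standard_matrix(rot90, 2)
print(A)             # [[0, -1], [1, 0]]
# rot90 ∘ rot90 is rotation by 180 degrees; its matrix is the product A·A.
print(matmul(A, A))  # [[-1, 0], [0, -1]]
```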

Essential Theorems & Key Takeaways

🎯 Existence & Uniqueness

A linear system is consistent if and only if the rightmost column of the augmented matrix is not a pivot column (equivalently, the coefficient matrix and the augmented matrix have the same number of pivot columns).

Solution types: Unique (pivot in every variable column, no free variables), None (inconsistent), Infinitely many (consistent with at least one free variable)

🔗 Span & Consistency

b ∈ Span{columns of A} ⟺ the system Ax = b is consistent

The columns of A span ℝᵐ ⟺ every b has a solution ⟺ A has pivot in every row

⚖️ Linear Independence

Columns of A are linearly independent ⟺ Ax = 0 has only trivial solution ⟺ A has pivot in every column

Geometric meaning: No vector lies in the span of the others

🔄 Linear Transformations

Every linear transformation from ℝⁿ to ℝᵐ is matrix multiplication by a unique m×n matrix.

Standard matrix: Columns are T(eᵢ) where eᵢ are standard basis vectors

Quick Reference Formulas

Matrix-Vector Product:
Ax = x₁a₁ + x₂a₂ + ... + xₙaₙ
Parametric Solution:
x = xₚ + t₁v₁ + ... + tₖvₖ
Linear Independence:
c₁v₁ + ... + cₚvₚ = 0 ⟹ all cᵢ = 0
Standard Matrix:
[T] = [T(e₁) | T(e₂) | ... | T(eₙ)]

🚀 Study Strategy

Master row reduction first → understand vector equations → connect to matrix equations → explore solution structures → study linear independence → apply to transformations