Understanding Linear Algebra: Data Science Edition
Randall Pruim

Contents
Front Matter
Colophon
Our goals -- Preface to David Austin’s original edition
What’s different in the data science edition?
1 Scalars, Vectors and Matrices
1.1 Vectors
1.1.1 Three ways to think about vectors
1.1.2 Vector operations: scalar multiplication and vector addition
1.1.2.1 Scalar Multiplication
1.1.2.2 Vector addition
1.1.2.3 Mathematical properties of vector operations
1.1.3 The (Euclidean) length of a vector
1.1.4 Summary
1.2 Vectors in Python
1.2.1 Introduction to Python
1.2.2 numpy vectors
1.2.3 Vector length
1.2.4 Plotting vectors
1.3 Linear combinations of vectors
1.3.1 Summary
1.3.2 Exercises
1.4 Matrices
1.4.1 Matrices and their uses
1.4.2 Scalar multiplication and addition of matrices
1.4.3 Matrix-vector multiplication and linear combinations
1.4.4 Matrix-vector multiplication and linear systems
1.4.5 Matrices in Python
1.4.6 Matrix-matrix products
1.4.7 Some special types of matrices
1.4.8 Summary
1.4.9 Exercises
1.5 Tensors
1.5.1 Tensors in NumPy
1.5.2 Aggregation and Axes
1.5.3 Expanding an array
1.5.4 Broadcasting
1.5.5 Exercises
2 Systems of equations: Solving \(A\mathbf{x} = \mathbf{b}\)
2.1 What can we expect
2.1.1 Some simple examples
2.1.2 Systems of linear equations
2.1.3 Summary
2.1.4 Exercises
2.2 Finding solutions to linear systems
2.2.1 Gaussian elimination
2.2.2 Augmented matrices
2.2.3 Reduced row echelon form
2.2.4 Solving matrix equations
2.2.5 Summary
2.2.6 Exercises
2.3 Computational Linear Algebra
2.3.1 Reduced row echelon form in Python
2.3.2 np.linalg.solve()
2.3.3 Computational effort
2.3.4 Summary
2.3.5 Exercises
2.4 Pivots and their relationship to solution spaces
2.4.1 The existence of solutions
2.4.2 The uniqueness of solutions
2.4.3 Summary
2.4.4 Exercises
3 Linear combinations and transformations
3.1 The span of a set of vectors
3.1.1 The span of a set of vectors
3.1.2 Pivot positions and span
3.1.3 Span and linear models
3.1.4 Summary
3.1.5 Exercises
3.2 Linear independence
3.2.1 Linear dependence
3.2.2 How to recognize linear dependence
3.2.3 Homogeneous equations
3.2.4 Summary
3.2.5 Exercises
3.3 Matrix transformations
3.3.1 Matrix transformations
3.3.2 Linear transformations
3.3.3 Composing matrix transformations
3.3.4 Discrete Dynamical Systems
3.3.5 Summary
3.3.6 Exercises
3.4 The geometry of matrix transformations
3.4.1 The geometry of \(2 \times 2\) matrix transformations
3.4.2 Matrix transformations and computer animation
3.4.3 Summary
3.4.4 Exercises
4 Invertibility, bases, and coordinate systems
4.1 Invertibility
4.1.1 Invertible matrices
4.1.2 Solving equations with an inverse
4.1.3 Finding inverses
4.1.4 Summary
4.1.5 Exercises
4.2 Triangular matrices and Gaussian elimination
4.2.1 Triangular matrices
4.2.2 Elementary matrices
4.2.3 Summary
4.2.4 Exercises
4.3 Bases and coordinate systems
4.3.1 Bases
4.3.2 Coordinate systems
4.3.3 Examples of bases
4.3.4 Summary
4.3.5 Exercises
4.4 Image compression
4.4.1 Color models
4.4.2 The JPEG compression algorithm
4.4.3 Summary
4.4.4 Exercises
4.5 Determinants
4.5.1 Determinants of \(2 \times 2\) matrices
4.5.2 Determinants of larger matrices
4.5.2.1 Determinants of elementary matrices
4.5.2.2 Using RREF to compute determinants
4.5.2.3 Cofactor expansions
4.5.3 Summary
4.5.4 Exercises
4.6 Subspaces
4.6.1 Subspaces
4.6.2 The column space of \(A\)
4.6.3 The null space of \(A\)
4.6.4 Summary
4.6.5 Exercises
4.7 Partial pivoting and LU factorizations
4.7.1 Partial pivoting
4.7.2 \(LU\) factorizations
4.7.3 Summary
4.7.4 Exercises
5 Eigenvalues and eigenvectors
5.1 An introduction to eigenvalues and eigenvectors
5.1.1 A few examples
5.1.2 The usefulness of eigenvalues and eigenvectors
5.1.3 Summary
5.1.4 Exercises
5.2 Finding eigenvalues and eigenvectors
5.2.1 The characteristic polynomial
5.2.2 Finding eigenvectors
5.2.3 The characteristic polynomial and the dimension of eigenspaces
5.2.4 Using Python to find eigenvalues and eigenvectors
5.2.5 Summary
5.2.6 Exercises
5.3 Diagonalization, similarity, and powers of a matrix
5.3.1 Diagonalization of matrices
5.3.2 Powers of a diagonalizable matrix
5.3.3 Similarity and complex eigenvalues
5.3.4 Summary
5.3.5 Exercises
5.4 Dynamical systems
5.4.1 A first example
5.4.2 Classifying dynamical systems
5.4.3 A \(3 \times 3\) system
5.4.4 Summary
5.4.5 Exercises
5.5 Markov chains and Google’s PageRank algorithm
5.5.1 A first example
5.5.2 Markov chains
5.5.3 Google’s PageRank algorithm
5.5.4 Summary
5.5.5 Exercises
5.6 Finding eigenvectors numerically
5.6.1 The power method
5.6.2 Finding other eigenvalues
5.6.3 Summary
5.6.4 Exercises
6 Orthogonality and Least Squares
6.1 The dot product
6.1.1 Projections and dot products
6.1.2 Computing dot products
6.1.3 \(k\)-means clustering
6.1.4 Summary
6.1.5 Exercises
6.2 Orthogonal complements and the matrix transpose
6.2.1 Orthogonal complements
6.2.2 The matrix transpose
6.2.3 Properties of the matrix transpose
6.2.4 Summary
6.2.5 Exercises
6.3 Orthogonal bases and projections
6.3.1 Orthogonal sets
6.3.2 Orthogonal projections
6.3.3 Summary
6.3.4 Exercises
6.4 Finding orthogonal bases
6.4.1 Gram-Schmidt orthogonalization
6.4.2 \(QR\) factorizations
6.4.3 Summary
6.4.4 Exercises
6.5 Least squares methods
6.5.1 A first example
6.5.2 The linear model framework
6.5.3 Solving least squares problems
6.5.4 Using \(QR\) factorizations
6.5.5 Polynomial Regression
6.5.6 Fitting linear models with standard tools
6.5.7 Summary
6.5.8 Exercises
7 The Spectral Theorem and singular value decompositions
7.1 Sample statistics as linear algebra
7.1.1 Sample mean
7.1.2 Sample variance and covariance
7.1.3 Summary
7.1.4 Exercises
7.2 Symmetric matrices
7.2.1 Symmetric matrices and orthogonal diagonalization
7.2.2 Summary
7.2.3 Exercises
7.3 Quadratic forms
7.3.1 Quadratic forms
7.3.2 Definite symmetric matrices
7.3.3 Summary
7.3.4 Exercises
7.4 Principal Component Analysis
7.4.1 Themes of Principal Component Analysis
7.4.2 Using Principal Component Analysis
7.4.3 Summary
7.4.4 Exercises
7.5 Singular Value Decompositions
7.5.1 Finding singular value decompositions
7.5.2 The structure of singular value decompositions
7.5.3 Reduced singular value decompositions
7.5.4 Summary
7.5.5 Exercises
7.6 Using Singular Value Decompositions
7.6.1 Least squares problems
7.6.2 Rank \(k\) approximations
7.6.3 Principal component analysis
7.6.4 Image compression and denoising
7.6.5 Analyzing Supreme Court cases
7.6.6 Summary
7.6.7 Exercises
Back Matter
A Notation
B Python Reference
B.1 Accessing Python
B.2 Packages and libraries for data science
B.3 Frequently used Python commands
Index
Colophon

Colophon
This book was authored in PreTeXt.