Vectors and spaces
Let's get our feet wet by thinking in terms of vectors and spaces.
Vectors
We will begin our journey through linear algebra by defining and
conceptualizing what a vector is (rather than starting with matrices and
matrix operations like in a more basic algebra course) and defining
some basic operations (like addition, subtraction and scalar
multiplication).
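These basic operations can be sketched with NumPy (the particular vectors here are made-up examples):

```python
import numpy as np

# Two vectors in R^2 (hypothetical example values)
v = np.array([1.0, 2.0])
w = np.array([3.0, -1.0])

v_plus_w = v + w     # componentwise addition
v_minus_w = v - w    # componentwise subtraction
three_v = 3 * v      # scalar multiplication scales each component

print(v_plus_w)    # [4. 1.]
print(v_minus_w)   # [-2.  3.]
print(three_v)     # [3. 6.]
```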
Linear combinations and spans
Given a set of vectors, what other vectors can you create by
adding and/or subtracting scalar multiples of those vectors? The set of
vectors that you can create through these linear combinations of the
original set is called the "span" of the set.
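As a small sketch (example vectors chosen arbitrarily), any choice of scalar weights produces a vector in the span:

```python
import numpy as np

v1 = np.array([1.0, 0.0])
v2 = np.array([1.0, 1.0])

# Any vector c1*v1 + c2*v2 lies in the span of {v1, v2};
# here v1 and v2 actually span all of R^2.
c1, c2 = 2.0, -3.0
combo = c1 * v1 + c2 * v2
print(combo)   # [-1. -3.]
```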
Linear dependence and independence
If no vector in a set can be created from a linear combination of
the other vectors in the set, then we say that the set is linearly
independent. Linearly independent sets are great because there aren't
any extra, unnecessary vectors lying around in the set. :)
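One way to test this numerically (a sketch using NumPy's rank computation; the example matrices are made up) is to put the vectors in the columns of a matrix and compare its rank to the number of columns:

```python
import numpy as np

# Columns of each matrix are the vectors in the set
independent = np.array([[1.0, 0.0],
                        [0.0, 1.0],
                        [1.0, 1.0]])
dependent = np.array([[1.0, 2.0],
                      [2.0, 4.0],
                      [3.0, 6.0]])   # second column = 2 * first column

def is_linearly_independent(A):
    # Rank equals the number of columns exactly when no column
    # is a linear combination of the others.
    return np.linalg.matrix_rank(A) == A.shape[1]

print(is_linearly_independent(independent))  # True
print(is_linearly_independent(dependent))    # False
```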
Subspaces and the basis for a subspace
In this tutorial, we'll define what a "subspace" is -- essentially a
subset of vectors that has some special properties. We'll then think
of a set of vectors that can most efficiently be used to construct a
subspace which we will call a "basis".
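For a quick sketch of the idea (the subspace and target vector are arbitrary examples): take the plane z = 0 inside R^3, pick a basis for it, and see that any vector in the subspace has unique coordinates with respect to that basis.

```python
import numpy as np

# Columns of B form a basis for the plane z = 0 inside R^3:
# they span the plane, and neither is a multiple of the other.
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# Every vector in the subspace is a unique combination of the basis
target = np.array([4.0, -2.0, 0.0])
coords, *_ = np.linalg.lstsq(B, target, rcond=None)
print(coords)   # [ 4. -2.]
assert np.allclose(B @ coords, target)
```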
Vector dot and cross products
In this tutorial, we define two ways to "multiply" vectors -- the
dot product and the cross product. As we progress, we'll get an
intuitive feel for their meaning, how they can be used and how the two
vector products relate to each other.
- Vector Dot Product and Vector Length
- Proving Vector Dot Product Properties
- Proof of the Cauchy-Schwarz Inequality
- Vector Triangle Inequality
- Defining the angle between vectors
- Defining a plane in R3 with a point and normal vector
- Cross Product Introduction
- Proof: Relationship between cross product and sin of angle
- Dot and Cross Product Comparison/Intuition
- Vector Triple Product Expansion (very optional)
- Normal vector from plane equation
- Point distance to plane
- Distance Between Planes
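The two products, and the angle formula a · b = |a||b| cos(θ), can be sketched in NumPy (the example vectors are just the standard basis vectors e1 and e2):

```python
import numpy as np

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])

dot = np.dot(a, b)       # 0: these vectors are orthogonal
cross = np.cross(a, b)   # [0, 0, 1]: perpendicular to both a and b

# a . b = |a||b| cos(theta), so we can recover the angle between them
theta = np.arccos(dot / (np.linalg.norm(a) * np.linalg.norm(b)))
print(np.degrees(theta))   # 90.0
```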
Matrices for solving systems by elimination
This tutorial is a bit of an excursion back to your Algebra II days,
when you first solved systems of equations (and possibly used matrices
to do so). In this tutorial, we dig a bit deeper than you may have
then, with emphasis on valid row operations and getting a matrix into
reduced row echelon form.
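The row operations can be sketched directly on an augmented matrix (a minimal example with a hand-picked 2x2 system; real code should swap rows to avoid zero pivots):

```python
import numpy as np

# Augmented matrix for the system
#    x + 2y = 5
#   3x + 4y = 6
aug = np.array([[1.0, 2.0, 5.0],
                [3.0, 4.0, 6.0]])

aug[1] -= 3 * aug[0]   # R2 <- R2 - 3*R1  (clear below the first pivot)
aug[1] /= aug[1, 1]    # scale R2 so its pivot is 1
aug[0] -= 2 * aug[1]   # R1 <- R1 - 2*R2  (clear above the second pivot)

# Matrix is now in reduced row echelon form; solution in last column
print(aug)   # x = -4, y = 4.5
```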
Null space and column space
We will define matrix-vector multiplication and think about the
set of vectors that satisfy Ax=0 for a given matrix A (this is the null
space of A). We then proceed to think about the linear combinations of
the columns of a matrix (column space). Both of these ideas help us
think about the possible solutions to the matrix-vector equation Ax=b.
- Matrix Vector Products
- Introduction to the Null Space of a Matrix
- Null Space 2: Calculating the null space of a matrix
- Null Space 3: Relation to Linear Independence
- Column Space of a Matrix
- Null Space and Column Space Basis
- Visualizing a Column Space as a Plane in R3
- Proof: Any subspace basis has same number of elements
- Dimension of the Null Space or Nullity
- Dimension of the Column Space or Rank
- Showing relation between basis cols and pivot cols
- Showing that the candidate basis does span C(A)
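The ideas above can be sketched numerically. One common way to get a null-space basis (the example matrix is made up, and the SVD approach is just one technique for this):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # rank 1: row 2 = 2 * row 1

rank = np.linalg.matrix_rank(A)
# Rank-nullity theorem: rank + nullity = number of columns
nullity = A.shape[1] - rank
print(rank, nullity)   # 1 2

# Right singular vectors with (near-)zero singular value span N(A)
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[rank:]   # each row satisfies A @ v = 0
print(np.allclose(A @ null_basis.T, 0))   # True
```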
Matrix transformations
Understanding how we can map one set of vectors to another set, and how matrices can be used to define linear transformations.
Functions and linear transformations
People have been telling you forever that linear algebra and
matrices are useful for modeling, simulations and computer graphics, but
it has been a little non-obvious. This tutorial will start to draw the
lines by re-introducing you to functions (with a bit more rigor than you
may remember from high school) and linear functions/transformations in
particular.
- A more formal understanding of functions
- Vector Transformations
- Linear Transformations
- Matrix Vector Products as Linear Transformations
- Linear Transformations as Matrix Vector Products
- Image of a subset under a transformation
- im(T): Image of a Transformation
- Preimage of a set
- Preimage and Kernel Example
- Sums and Scalar Multiples of Linear Transformations
- More on Matrix Addition and Scalar Multiplication
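The defining property of a linear transformation -- it respects addition and scalar multiplication -- can be checked numerically for T(x) = Ax (the matrix and vectors here are arbitrary examples):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # any matrix defines T(x) = A @ x

x = np.array([1.0, -1.0])
y = np.array([2.0, 5.0])
c = 3.0

# Linearity: T(x + y) = T(x) + T(y) and T(c*x) = c*T(x)
additivity = np.allclose(A @ (x + y), A @ x + A @ y)
homogeneity = np.allclose(A @ (c * x), c * (A @ x))
print(additivity, homogeneity)   # True True
```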
Linear transformation examples
In this tutorial, we do several examples of actually constructing
transformation matrices. Very useful if you've got some actual
transforming to do (especially scaling, rotating and projecting) ;)
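For instance, the standard 2-D rotation matrix can be built and applied like this (a sketch; the 90-degree angle is just an example):

```python
import numpy as np

def rotation_matrix(theta):
    """2-D rotation by angle theta (radians) about the origin."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

R = rotation_matrix(np.pi / 2)   # 90-degree rotation
e1 = np.array([1.0, 0.0])
rotated = R @ e1
print(rotated)   # approximately [0, 1]: e1 rotates onto e2
```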
Transformations and matrix multiplication
You probably remember how to multiply matrices from high school,
but didn't know why or what it represented. This tutorial will address
this. You'll see that multiplying two matrices can be viewed as the
composition of linear transformations.
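Concretely: applying one transformation and then another gives the same result as applying the single product matrix (the scaling and rotation matrices below are example choices):

```python
import numpy as np

# S scales by 2, R rotates 90 degrees counterclockwise
S = np.array([[2.0, 0.0],
              [0.0, 2.0]])
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

x = np.array([1.0, 1.0])

# Applying R, then S, agrees with applying the single matrix S @ R
step_by_step = S @ (R @ x)
composed = (S @ R) @ x
print(step_by_step, composed)   # both [-2.  2.]
```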
Inverse functions and transformations
You can use a transformation/function to map from one set to
another, but can you invert it? In other words, is there a
function/transformation that given the output of the original mapping,
can output the original input (this is much clearer with diagrams).
This tutorial addresses this question in a linear algebra context.
Since matrices can represent linear transformations, we're going to
spend a lot of time thinking about matrices that represent the inverse
transformation.
- Introduction to the inverse of a function
- Proof: Invertibility implies a unique solution to f(x)=y
- Surjective (onto) and Injective (one-to-one) functions
- Relating invertibility to being onto and one-to-one
- Determining whether a transformation is onto
- Exploring the solution set of Ax=b
- Matrix condition for one-to-one transformations
- Simplifying conditions for invertibility
- Showing that Inverses are Linear
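A square matrix is invertible exactly when it has full rank (equivalently, when its transformation is both onto and one-to-one), which is easy to check numerically (example matrices made up):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # rows are dependent: not invertible

def is_invertible(M):
    # Full rank <=> columns independent <=> transformation invertible
    return np.linalg.matrix_rank(M) == M.shape[0]

print(is_invertible(A), is_invertible(B))   # True False
```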
Finding inverses and determinants
We've talked a lot about inverse transformations abstractly in the
last tutorial. Now, we're ready to actually compute inverses. We
start by "documenting" the row operations to get a matrix into reduced
row echelon form and use this to come up with the formula for the
inverse of a 2x2 matrix. After this we define a determinant for 2x2,
3x3 and nxn matrices.
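The 2x2 inverse formula -- swap the diagonal, negate the off-diagonal, divide by the determinant ad - bc -- can be written out and checked against NumPy's general inverse (the example matrix is arbitrary):

```python
import numpy as np

def inverse_2x2(M):
    """Inverse of a 2x2 matrix via the ad - bc determinant formula."""
    a, b = M[0]
    c, d = M[1]
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    return (1.0 / det) * np.array([[d, -b],
                                   [-c, a]])

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
A_inv = inverse_2x2(A)
print(A_inv)   # [[-2.   1. ]  [ 1.5 -0.5]]
```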
More determinant depth
In the last tutorial on matrix inverses, we first defined what a
determinant is and gave several examples of computing them. In this
tutorial we go deeper. We will explore what happens to the determinant
under several circumstances and conceptualize it in several ways.
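Two of those circumstances can be verified numerically (example matrices chosen at random): the determinant is multiplicative, and scaling one row by k scales the determinant by k.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])   # a row swap: determinant -1

# det is multiplicative: det(AB) = det(A) * det(B)
multiplicative = np.isclose(np.linalg.det(A @ B),
                            np.linalg.det(A) * np.linalg.det(B))

# Scaling one row by k scales the determinant by k
A_scaled = A.copy()
A_scaled[0] *= 5
row_scaling = np.isclose(np.linalg.det(A_scaled), 5 * np.linalg.det(A))

print(multiplicative, row_scaling)   # True True
```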
Transpose of a matrix
We now explore what happens when you switch the rows and columns of a matrix!
- Transpose of a Matrix
- Determinant of Transpose
- Transpose of a Matrix Product
- Transposes of sums and inverses
- Transpose of a Vector
- Rowspace and Left Nullspace
- Visualizations of Left Nullspace and Rowspace
- Rank(A) = Rank(transpose of A)
- Showing that A-transpose x A is invertible
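The last two facts in the list can be sanity-checked numerically (the 3x2 matrix below is an arbitrary example with independent columns):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])   # 3x2 with independent columns

# rank(A) = rank(A^T)
same_rank = np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T)
print(same_rank)   # True

# With independent columns, A^T A is square and invertible
AtA = A.T @ A
print(np.linalg.det(AtA))   # nonzero, so AtA is invertible
```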
Alternate coordinate systems (bases)
We explore creating and moving between various coordinate systems.
Orthogonal complements
We will now explore the set of vectors that is orthogonal to
every vector in a second set (this is the second set's orthogonal
complement).
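One useful fact here: the orthogonal complement of the row space of a matrix A is exactly the null space of A. A sketch (the single-row matrix is an arbitrary example, and SVD is just one way to compute the complement):

```python
import numpy as np

# Complement of span{(1, 1, 0)} inside R^3
A = np.array([[1.0, 1.0, 0.0]])

rank = np.linalg.matrix_rank(A)
_, s, Vt = np.linalg.svd(A)
complement = Vt[rank:]   # rows span the orthogonal complement

# Every basis vector of the complement is orthogonal to every row of A
print(np.allclose(A @ complement.T, 0))   # True
print(complement.shape[0])   # 2 = dim(R^3) - dim(row space)
```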
Orthogonal projections
This is one of those tutorials that brings many ideas we've been
building together into something applicable. Orthogonal projections
(which can sometimes be conceptualized as a "vector's shadow" on a
subspace if the light source is above it) can be used in fields ranging
from computer graphics to statistics!
If you're familiar with orthogonal complements, then you're ready for
this tutorial!
- Projections onto Subspaces
- Visualizing a projection onto a plane
- A Projection onto a Subspace is a Linear Transformation
- Subspace Projection Matrix Example
- Another Example of a Projection Matrix
- Projection is closest vector in subspace
- Least Squares Approximation
- Least Squares Examples
- Another Least Squares Example
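The projection-matrix formula P = A (AᵀA)⁻¹ Aᵀ can be tried out directly (the subspace here is the xy-plane in R^3, an example chosen so the answer is easy to see):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])   # columns span the xy-plane in R^3
b = np.array([3.0, 4.0, 5.0])

# Projection matrix onto the column space of A
P = A @ np.linalg.inv(A.T @ A) @ A.T
proj = P @ b
print(proj)   # [3. 4. 0.]: the "shadow" of b on the plane

# The residual b - proj is orthogonal to the subspace
print(np.allclose(A.T @ (b - proj), 0))   # True
```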
Change of basis
Finding your coordinate system boring? Even worse, does it make
certain transformations difficult (especially transformations that you
have to do over and over and over again)? Well, we have the tool for
you: change your coordinate system to one that you like more. Sound
strange? Watch this tutorial and it will be less so. Have fun!
- Coordinates with Respect to a Basis
- Change of Basis Matrix
- Invertible Change of Basis Matrix
- Transformation Matrix with Respect to a Basis
- Alternate Basis Transformation Matrix Example
- Alternate Basis Transformation Matrix Example Part 2
- Changing coordinate systems to help find a transformation matrix
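The mechanics can be sketched quickly (the basis below is a made-up example): if the columns of C are the new basis vectors, then C converts B-coordinates to standard coordinates, and C⁻¹ converts back.

```python
import numpy as np

# Columns of C are the basis vectors of a basis B for R^2
C = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # change-of-basis matrix: x = C @ [x]_B

x_B = np.array([2.0, 3.0])   # coordinates with respect to B
x = C @ x_B                  # the same vector in standard coordinates
print(x)                     # [5. 3.]

# Invert C to go back from standard coordinates to B-coordinates
back = np.linalg.inv(C) @ x
print(back)                  # [2. 3.]
```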
Orthonormal bases and the Gram-Schmidt Process
As we'll see in this tutorial, it is hard not to love a basis
where all the vectors are orthogonal to each other and each have length 1
(hey, this sounds pretty much like some coordinate systems you've known
for a long time!). We explore these orthonormal bases in some depth
and also give you a great tool for creating them: the Gram-Schmidt
Process (which would also be a great name for a band).
- Introduction to Orthonormal Bases
- Coordinates with respect to orthonormal bases
- Projections onto subspaces with orthonormal bases
- Finding projection onto subspace with orthonormal basis example
- Example using orthogonal change-of-basis matrix to find transformation matrix
- Orthogonal matrices preserve angles and lengths
- The Gram-Schmidt Process
- Gram-Schmidt Process Example
- Gram-Schmidt example with 3 basis vectors
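The process itself is short enough to sketch in full: subtract from each vector its projections onto the already-built orthonormal vectors, then normalize (the two input vectors are arbitrary examples).

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for u in basis:
            w = w - np.dot(w, u) * u   # remove component along earlier vectors
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

Q = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0])])

# Rows of Q are orthonormal, so Q @ Q.T is the identity
print(np.round(Q @ Q.T, 10))
```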
Eigen-everything
Eigenvectors, eigenvalues, eigenspaces! We will not stop with the
"eigens"! Seriously though, eigen-everythings have many applications
including finding "good" bases for a transformation (yes, "good" is a
technical term in this context).
- Introduction to Eigenvalues and Eigenvectors
- Proof of formula for determining Eigenvalues
- Example solving for the eigenvalues of a 2x2 matrix
- Finding Eigenvectors and Eigenspaces example
- Eigenvalues of a 3x3 matrix
- Eigenvectors and Eigenspaces for a 3x3 matrix
- Showing that an eigenbasis makes for good coordinate systems
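The defining equation Av = λv is easy to verify numerically (the symmetric 2x2 matrix below is an example whose eigenvalues work out to 1 and 3):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(sorted(eigvals))   # approximately [1.0, 3.0]

# Each column v of eigvecs satisfies A v = lambda v
for lam, v in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ v, lam * v))   # True for each pair
```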