Thursday, 25 April 2013

Linear algebra






Vectors and spaces
Let's get our feet wet by thinking in terms of vectors and spaces.

Vectors

We will begin our journey through linear algebra by defining and conceptualizing what a vector is (rather than starting with matrices and matrix operations like in a more basic algebra course) and defining some basic operations (like addition, subtraction and scalar multiplication).
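
If you like to tinker in code, here is a minimal sketch of those basic operations using Python with NumPy (the code is just an illustration, not part of the tutorial itself):

import numpy as np

v = np.array([1, 2])       # a vector in R^2
w = np.array([3, -1])

print(v + w)               # vector addition -> [4, 1]
print(v - w)               # vector subtraction -> [-2, 3]
print(3 * v)               # scalar multiplication -> [3, 6]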

Linear combinations and spans

Given a set of vectors, what other vectors can you create by adding and/or subtracting scalar multiples of those vectors? The set of vectors that you can create through these linear combinations of the original set is called the "span" of the set.
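
As a rough code sketch (Python/NumPy, purely illustrative), a linear combination is just scalars times vectors, added up:

import numpy as np

v1 = np.array([1, 0])
v2 = np.array([0, 1])

# every choice of the scalars c1 and c2 lands somewhere in the span of {v1, v2}
c1, c2 = 2.0, -3.0
print(c1 * v1 + c2 * v2)   # [2, -3]; here the span happens to be all of R^2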

Linear dependence and independence

If no vector in a set can be created from a linear combination of the other vectors in the set, then we say that the set is linearly independent. Linearly independent sets are great because there aren't any extra, unnecessary vectors lying around in the set. :)
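
One way to check independence numerically (a sketch in Python/NumPy, assuming you're happy treating the vectors as the columns of a matrix) is to compare the rank with the number of vectors:

import numpy as np

# three vectors in R^3, stacked as columns; the third is the sum of the first two
A = np.column_stack([[1, 0, 0], [0, 1, 0], [1, 1, 0]])

# the vectors are linearly independent exactly when the rank equals the number of vectors
print(np.linalg.matrix_rank(A))                 # 2
print(np.linalg.matrix_rank(A) == A.shape[1])   # False, so this set is dependent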

Subspaces and the basis for a subspace

In this tutorial, we'll define what a "subspace" is -- essentially a subset of vectors that has some special properties. We'll then think about a set of vectors that can most efficiently be used to construct a subspace, which we will call a "basis".
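
To make that concrete, here's a small sketch using SymPy (just an illustration): the pivot columns of the reduced row echelon form pick out which of the original vectors form a basis for their span.

import sympy as sp

# the same three vectors as before, as the columns of a matrix
A = sp.Matrix([[1, 0, 1],
               [0, 1, 1],
               [0, 0, 0]])

rref_form, pivots = A.rref()
print(pivots)   # (0, 1): the first two columns form a basis; the third is redundant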

Matrices for solving systems by elimination

This tutorial is a bit of an excursion back to your Algebra II days, when you first solved systems of equations (and possibly used matrices to do so). In this tutorial, we dig a bit deeper than you may have then, with emphasis on valid row operations and getting a matrix into reduced row echelon form.
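
If you'd like to check your row reduction against a computer, here's a sketch with SymPy (illustrative only):

import sympy as sp

# augmented matrix for the system  x + 2y = 5,  3x + 4y = 6
augmented = sp.Matrix([[1, 2, 5],
                       [3, 4, 6]])

rref_form, pivots = augmented.rref()
print(rref_form)   # Matrix([[1, 0, -4], [0, 1, 9/2]]), i.e. x = -4 and y = 9/2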

Null space and column space

We will define matrix-vector multiplication and think about the set of vectors that satisfy Ax=0 for a given matrix A (this is the null space of A). We then proceed to think about the linear combinations of the columns of a matrix (the column space). Both of these ideas help us think about the possible solutions to the matrix-vector equation Ax=b.
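
Here's a quick SymPy sketch of both spaces for a small singular matrix (again, just an illustration):

import sympy as sp

A = sp.Matrix([[1, 2],
               [2, 4]])

# null space: every x with A*x = 0 (here, all multiples of [-2, 1])
print(A.nullspace())    # [Matrix([[-2], [1]])]

# A*x = b has a solution exactly when b lies in the column space of A
print(A.columnspace())  # [Matrix([[1], [2]])]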






Matrix transformations
Understanding how we can map one set of vectors to another. Matrices can be used to define linear transformations.

Functions and linear transformations

People have been telling you forever that linear algebra and matrices are useful for modeling, simulations and computer graphics, but it has been a little non-obvious why. This tutorial will start to draw those connections by re-introducing you to functions (with a bit more rigor than you may remember from high school) and to linear functions/transformations in particular.

Linear transformation examples

In this tutorial, we do several examples of actually constructing transformation matrices. Very useful if you've got some actual transforming to do (especially scaling, rotating and projecting) ;)
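
For instance, here is a 2D rotation matrix in action, sketched in Python with NumPy (illustrative, not the tutorial's own code):

import numpy as np

theta = np.pi / 2                       # rotate by 90 degrees
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

x = np.array([1, 0])
print(R @ x)   # approximately [0, 1]: the x-axis unit vector ends up on the y-axis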

Transformations and matrix multiplication

You probably remember how to multiply matrices from high school, but you may not have known why or what it represented. This tutorial will address this. You'll see that multiplying two matrices can be viewed as the composition of linear transformations.
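
A tiny NumPy sketch of that idea (illustrative): applying one matrix and then another is the same as applying their product.

import numpy as np

A = np.array([[0, -1],
              [1,  0]])   # rotate 90 degrees
B = np.array([[2,  0],
              [0,  2]])   # scale by 2

x = np.array([1, 1])

print(A @ (B @ x))   # [-2, 2]: first scale, then rotate
print((A @ B) @ x)   # [-2, 2]: same answer from the single composed matrix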

Inverse functions and transformations

You can use a transformation/function to map from one set to another, but can you invert it? In other words, is there a function/transformation that, given the output of the original mapping, can output the original input? (This is much clearer with diagrams.) This tutorial addresses this question in a linear algebra context. Since matrices can represent linear transformations, we're going to spend a lot of time thinking about matrices that represent the inverse transformation.
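
In code terms (a NumPy sketch, purely illustrative), the inverse matrix simply undoes what the original matrix did:

import numpy as np

A = np.array([[2., 1.],
              [1., 1.]])
A_inv = np.linalg.inv(A)

x = np.array([3., 4.])
print(A @ x)             # the transformed vector: [10, 7]
print(A_inv @ (A @ x))   # back to the original input: [3, 4]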

Finding inverses and determinants

We've talked a lot about inverse transformations abstractly in the last tutorial. Now, we're ready to actually compute inverses. We start by "documenting" the row operations needed to get a matrix into reduced row echelon form and use this to come up with the formula for the inverse of a 2x2 matrix. After this, we define the determinant for 2x2, 3x3 and nxn matrices.
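
Here's a sketch in Python/NumPy (illustrative) of the 2x2 formula next to the library routines:

import numpy as np

A = np.array([[4., 7.],
              [2., 6.]])

# for [[a, b], [c, d]] the determinant is ad - bc and the inverse is
# (1 / (ad - bc)) * [[d, -b], [-c, a]]
det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]
inverse_by_formula = (1 / det) * np.array([[ A[1, 1], -A[0, 1]],
                                           [-A[1, 0],  A[0, 0]]])

print(det, np.linalg.det(A))                              # both are 10 (up to rounding)
print(np.allclose(inverse_by_formula, np.linalg.inv(A)))  # True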

More determinant depth

In the last tutorial on matrix inverses, we first defined what a determinant is and gave several examples of computing them. In this tutorial we go deeper. We will explore what happens to the determinant under several circumstances and conceptualize it in several ways.
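
Two of those properties, sketched quickly in NumPy (illustrative): swapping rows flips the sign of the determinant, and the determinant of a product is the product of the determinants.

import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
swap = np.array([[0., 1.],
                 [1., 0.]])   # multiplying on the left swaps the two rows

print(np.linalg.det(A))          # about -2
print(np.linalg.det(swap @ A))   # about  2: the row swap flipped the sign
print(np.linalg.det(A @ A), np.linalg.det(A) ** 2)   # both about 4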

Transpose of a matrix

We now explore what happens when you switch the rows and columns of a matrix!
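
In NumPy that's just .T (a tiny illustrative sketch):

import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])

print(A.shape, A.T.shape)   # (2, 3) becomes (3, 2): rows and columns trade places
print((A.T).T.shape)        # (2, 3): transposing twice gets you back where you started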







Alternate coordinate systems (bases)
We explore creating and moving between various coordinate systems.

Orthogonal projections

This is one of those tutorials that bring many of the ideas we've been building together into something applicable. Orthogonal projections (which can sometimes be conceptualized as a "vector's shadow" on a subspace if the light source is directly above it) can be used in fields ranging from computer graphics to statistics! If you're familiar with orthogonal complements, then you're ready for this tutorial!
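
Here's a NumPy sketch (illustrative) of projecting a vector onto the column space of a matrix using the formula proj = A (A^T A)^(-1) A^T b:

import numpy as np

A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.]])    # its columns span a plane in R^3
b = np.array([6., 0., 0.])

proj = A @ np.linalg.inv(A.T @ A) @ A.T @ b
print(proj)                               # the "shadow" of b on that plane: [5, 2, -1]
print(np.allclose(A.T @ (b - proj), 0))   # True: what's left over is orthogonal to the plane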

Change of basis

Finding your coordinate system boring? Even worse, does it make certain transformations difficult (especially transformations that you have to do over and over and over again)? Well, we have the tool for you: change your coordinate system to one that you like more. Sound strange? Watch this tutorial and it will be less so. Have fun!
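
As a quick NumPy sketch (illustrative): if the new basis vectors sit in the columns of C, then the coordinates of a vector v in that basis are the solution of C x = v.

import numpy as np

C = np.array([[1., 1.],
              [0., 1.]])    # the new basis vectors, as columns
v = np.array([3., 2.])      # a vector written in standard coordinates

coords = np.linalg.solve(C, v)
print(coords)       # [1, 2]: v = 1*[1,0] + 2*[1,1] in the new basis
print(C @ coords)   # [3, 2]: multiplying by C converts back to standard coordinates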

Orthonormal bases and the Gram-Schmidt Process

As we'll see in this tutorial, it is hard not to love a basis where all the vectors are orthogonal to each other and each have length 1 (hey, this sounds pretty much like some coordinate systems you've known for a long time!). We explore these orthonormal bases in some depth and also give you a great tool for creating them: the Gram-Schmidt Process (which would also be a great name for a band).
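
Here's a compact sketch of the process in Python/NumPy (the gram_schmidt helper below is just an illustration, not a library routine):

import numpy as np

def gram_schmidt(vectors):
    # illustrative helper: assumes the input vectors are linearly independent
    basis = []
    for v in vectors:
        for q in basis:
            v = v - np.dot(q, v) * q          # remove the component along each earlier direction
        basis.append(v / np.linalg.norm(v))   # then rescale to length 1
    return basis

q1, q2 = gram_schmidt([np.array([3., 1.]), np.array([2., 2.])])
print(np.dot(q1, q2))                           # about 0: the vectors are orthogonal
print(np.linalg.norm(q1), np.linalg.norm(q2))   # 1.0 1.0: each has length 1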

Eigen-everything

Eigenvectors, eigenvalues, eigenspaces! We will not stop with the "eigens"! Seriously though, eigen-everythings have many applications including finding "good" bases for a transformation (yes, "good" is a technical term in this context).
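
And one last NumPy sketch (illustrative): an eigenvector is a direction the matrix only stretches, never turns.

import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)             # 3 and 1 (in whatever order NumPy returns them)

v = eigenvectors[:, 0]         # the eigenvector paired with eigenvalues[0]
lam = eigenvalues[0]
print(np.allclose(A @ v, lam * v))   # True: A*v is just v scaled by its eigenvalue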








