VM270
Chapter 1: Vector Spaces
Complex Numbers
Definition: C = {a + bi : a, b ∈ R}
Properties
Commutativity
Associativity
Identities
Additive Inverse
Multiplicative Inverse
Distributivity
Subtraction
z + (-z) = 0 for z ∈ C
w - z = w + (-z) for w, z ∈ C
Division
z * (1/z) = 1 for z ∈ C with z ≠ 0.
w/z = w * (1/z) for w, z ∈ C with z ≠ 0.
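A quick sketch in Python (sample values of my own choosing) of the reciprocal formula 1/(a+bi) = (a - bi)/(a^2 + b^2) and the division rule above:

```python
import cmath

# Minimal sketch using Python's built-in complex type; z and w are arbitrary.
z = 3 + 4j
w = 1 + 2j

a, b = z.real, z.imag
reciprocal = complex(a, -b) / (a**2 + b**2)   # 1/z via the conjugate formula

print(cmath.isclose(reciprocal, 1 / z))        # True
print(cmath.isclose(w / z, w * reciprocal))    # True: w/z = w * (1/z)
```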
Vector Spaces
Definition
A set V together with an addition and a scalar multiplication on V for which the properties listed below hold.
Properties
V must have an addition function and a scalar multiplication function so that the other properties can follow:
Commutativity
u + v = v + u for all u, v ∈ V
Associativity
(u + v) + w = u + (v + w) for all u, v, w ∈ V
(ab)v = a(bv) for all a, b ∈ F and all v ∈ V
Additive Identity
There exists an element 0 ∈ V such that v + 0 = v for all v ∈ V
Additive Inverse
For every v ∈ V, there exists w ∈ V such that v + w = 0.
Multiplicative Identity
1v = v for all v ∈ V
Distributive Properties
a(u + v) = au + av for all a ∈ F and all u, v ∈ V
(a + b)u = au + bu for all a, b ∈ F and all u ∈ V
Sum
The sum of subspaces U and W of V is the set of all vectors u + w with u ∈ U and w ∈ W.
Notation/Example:
V = U + W when U and W are subspaces of V.
Direct Sum
When every vector in V can be written uniquely as u + w with u ∈ U and w ∈ W, V is said to be the direct sum of U and W, written V = U ⊕ W.
Theorem
If U and W are subspaces of V, then V is the direct sum of U and W if and only if V = U + W and U ∩ W = {0}
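A small numpy sketch of the theorem's two conditions, using made-up coordinate subspaces of R^3 (U spanned by the first two standard vectors, W by the third). The intersection check uses dimensions: dim U + dim W = dim(U + W) exactly when U ∩ W = {0}.

```python
import numpy as np

# Hypothetical subspaces of R^3: U = span{(1,0,0),(0,1,0)}, W = span{(0,0,1)}.
U_basis = np.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])
W_basis = np.array([[0.0, 0.0, 1.0]])

stacked = np.vstack([U_basis, W_basis])
rank_sum = np.linalg.matrix_rank(stacked)      # dim(U + W)

print(rank_sum == 3)                           # V = U + W (the sum spans R^3)
print(len(U_basis) + len(W_basis) == rank_sum) # U ∩ W = {0}
```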
Chapter 2:
Finite Dimensional
Vector Spaces
Linear Combination
Def: A vector of the form a1v1 + ... + amvm, where (v1, ..., vm) is a list of vectors in V and a1, ..., am ∈ F.
Span
Linear Dependence and Independence
Linear Independence
Linear Dependence
Bases
Dimension
Invertible
A linear map T ∈ L(V, W) is invertible if there is a linear map S ∈ L(W, V) such that ST equals the identity map on V and TS equals the identity map on W.
Isomorphic
Two vector spaces are isomorphic if there is an invertible linear map from one vector space onto the other.
Vocabulary
Injective
Def: One to one
For T: V -> W, T is injective if whenever u, v ∈ V and Tu = Tv, we get u = v.
Surjective
Def: Onto
For T: V -> W, T is surjective if its range equals W.
Invariant
A subspace that gets mapped into itself by an operator is said to be invariant: U is invariant under T if Tu ∈ U for every u ∈ U.
Operator
Def: A linear map from a vector space to itself.
Union (∪)
The set of elements in set A or set B.
Remember, it's the stores on street A and B, including the KFC on the corner.
Intersection (∩)
The set of elements in set A and set B.
Remember, it's the KFC.
Chapter 3: Linear Maps
Linear Map
Properties of a Linear Map
Additivity
T(u + v) = Tu + Tv for all u, v ∈ V
Homogeneity
T(av) = a(Tv) for all a ∈ F and all v ∈ V
Def: A function that "maps" a vector from one vector space to another vector space
Ex: T: V -> W maps a vector in V to W.
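A minimal numpy sketch checking additivity and homogeneity for a matrix map from F^3 to F^2 (the matrix and vectors are arbitrary choices for illustration):

```python
import numpy as np

# A matrix T gives a linear map from F^3 to F^2 via v -> T @ v.
T = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])
u = np.array([1.0, 0.0, 2.0])
v = np.array([0.0, 1.0, 1.0])
a = 5.0

print(np.allclose(T @ (u + v), T @ u + T @ v))  # additivity
print(np.allclose(T @ (a * v), a * (T @ v)))    # homogeneity
```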
Types of Linear Maps
Zero Function
0v = 0
Identity Map
Iv = v
Differentiation
Integration
Multiplication by x^2
Backward shift
From F^n to F^m
Null Space
The subset of V consisting of the vectors that T maps to 0.
Proposition: If T ∈ L(V, W), then null T is a subspace of V
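A sketch of how one might compute a basis of null T numerically, using numpy's SVD (the matrix is a made-up rank-1 example):

```python
import numpy as np

# Rows of vh whose singular value is (numerically) zero span null T.
T = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # rank 1, so null T has dimension 2

_, s, vh = np.linalg.svd(T)
tol = 1e-10
null_basis = vh[sum(s > tol):]    # basis vectors for null T, as rows

print(np.allclose(T @ null_basis.T, 0))  # every basis vector maps to 0
```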
Range
A.k.a. Image
Matrix
Chapter 7: Operators on Inner-Product Spaces
Self-Adjoint and Normal Operators
Self-Adjoint
For T ∈ L(V), T = T*
Normal
Definition
TT* = T*T
In other words, the operator commutes with its adjoint.
All self-adjoint operators are also normal.
Prop: T is normal if and only if
||Tv|| = ||T*v|| for all v ∈ V
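A small numpy check of both the definition and the proposition, using a skew-symmetric matrix as a standard example of a normal but not self-adjoint operator (my choice of example, not from the notes):

```python
import numpy as np

T = np.array([[0.0, -1.0],
              [1.0,  0.0]])
T_star = T.conj().T                          # the adjoint = conjugate transpose

print(np.allclose(T @ T_star, T_star @ T))   # TT* = T*T, so T is normal

v = np.array([2.0, 3.0])
print(np.isclose(np.linalg.norm(T @ v), np.linalg.norm(T_star @ v)))  # ||Tv|| = ||T*v||
```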
The Spectral Theorem
What is it?
This theorem characterizes the operators on V that have a diagonal matrix with respect to some orthonormal basis of V: they are exactly the normal operators when F = C and the self-adjoint operators when F = R.
Components
Complex Spectral Theorem
Suppose V is a complex inner-product space and T ∈ L(V). Then V has an orthonormal basis consisting of eigenvectors of T if and only if T is normal.
Real Spectral Theorem
Lemmas
1
Suppose T ∈ L(V) is self-adjoint. If a, b ∈ R are such that
a^2 < 4b, then
T^2 + aT + bI
is invertible.
2
Suppose T ∈ L(V) is self-adjoint. Then T has an eigenvalue.
Important mostly for real vector spaces
Suppose V is a real inner-product space and T ∈ L(V). Then V has an orthonormal basis consisting of eigenvectors of T if and only if T is self-adjoint.
Overall Result
Whether F=R or F=C, every self-adjoint operator on V has a diagonal matrix with respect to some orthonormal basis.
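A numpy sketch of this result: np.linalg.eigh diagonalizes a self-adjoint (here real symmetric, arbitrarily chosen) matrix with an orthonormal eigenbasis:

```python
import numpy as np

T = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, Q = np.linalg.eigh(T)     # columns of Q are orthonormal eigenvectors

print(np.allclose(Q.T @ Q, np.eye(2)))                 # the basis is orthonormal
print(np.allclose(Q.T @ T @ Q, np.diag(eigenvalues)))  # T is diagonal w.r.t. that basis
```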
Isometry
Def: An operator is an isometry if it preserves norms.
For S ∈ L(V), ||Sv|| = ||v|| for all v ∈ V
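For example, a rotation of R^2 is an isometry; a quick numpy check (angle and vector arbitrary):

```python
import numpy as np

theta = 0.7                      # arbitrary rotation angle
S = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([3.0, 4.0])
print(np.isclose(np.linalg.norm(S @ v), np.linalg.norm(v)))  # ||Sv|| = ||v||
```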
Chapter 6: Inner-Product Spaces
What is an Inner-Product?
An inner product basically measures how much u points in the direction of v, for u, v ∈ V. It takes two vectors and produces a number.
Formal Definition
A function that takes every ordered pair (u, v) of vectors in V to a number <u, v> ∈ F.
Properties
Positivity
<v, v> >= 0 for all v ∈ V
Definiteness
<v, v> = 0 if and only if v = 0
Additivity in first slot
<u + v, w> = <u, w> + <v, w> for all u, v, w ∈ V
Homogeneity in first slot
<av, w> = a<v, w> for all a ∈ F and all v, w ∈ V
Conjugate Symmetry
<u, v> equals the complex conjugate of <v, u> for all u, v ∈ V
Inner-Product Space
Def: A vector space V along with an inner product on V
Euclidean Inner Product
On F^n: <(w1, ..., wn), (z1, ..., zn)> = w1 conj(z1) + ... + wn conj(zn); on R^n the conjugates do nothing, giving w1 z1 + ... + wn zn.
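A small numpy sketch of this inner product on C^2 (sample vectors arbitrary). Note that np.vdot conjugates its first argument, so <u, v> with conjugation in the second slot is np.vdot(v, u):

```python
import numpy as np

u = np.array([1 + 2j, 3 + 0j])
v = np.array([2 - 1j, 1 + 1j])

inner_uv = np.vdot(v, u)          # <u, v> = u1*conj(v1) + u2*conj(v2)

print(np.isclose(inner_uv, np.sum(u * np.conj(v))))        # matches the formula above
print(np.isclose(np.vdot(v, u), np.conj(np.vdot(u, v))))   # conjugate symmetry
print(np.vdot(u, u).real >= 0)                             # positivity: <u, u> >= 0
```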
Definitions
Orthogonal
<u,v> = 0.
Basically, this is when u and v are perpendicular to each other.
Orthogonal complement of U
The set of all vectors in V that are orthogonal to every vector in U. Denoted U⊥.
Adjoint
A linear map T* ∈ L(W, V), determined by the relation below
For T ∈ L(V, W), <Tv, w> = <v, T*w> for all v ∈ V and all w ∈ W
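A numpy check of the defining relation for a made-up complex matrix, with the same second-slot-conjugation convention as above; for matrices the adjoint is the conjugate transpose:

```python
import numpy as np

def inner(x, y):
    return np.vdot(y, x)              # <x, y> with conjugation in the second slot

T = np.array([[1 + 1j, 2 + 0j],
              [0 + 3j, 4 - 1j]])
T_star = T.conj().T                   # the adjoint of T

v = np.array([1 + 0j, 2 - 1j])
w = np.array([0 + 1j, 1 + 1j])

print(np.isclose(inner(T @ v, w), inner(v, T_star @ w)))  # <Tv, w> = <v, T*w>
```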
Norm
The length of a vector
Orthogonal Projection
Linear Functional
Conjugate transpose
Theorems
Pythagorean Theorem
If u and v are orthogonal vectors in V, then ||u + v||^2 = ||u||^2 + ||v||^2
Cauchy-Schwarz Inequality
If u, v ∈ V, then |<u, v>| <= ||u|| ||v||
Triangle Inequality
||u + v|| <= ||u|| + ||v||
Parallelogram Equality
||u+v||^2 + ||u-v||^2 = 2 * (||u||^2 + ||v||^2)
This is one of the easiest theorems to realize you need during a proof or problem: if the problem involves u + v or u - v, there's a good chance you need this equality.
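A quick numpy verification of Cauchy-Schwarz, the triangle inequality, and the parallelogram equality for a pair of arbitrary real vectors:

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([3.0, 0.0, 4.0])
norm = np.linalg.norm

print(abs(np.dot(u, v)) <= norm(u) * norm(v))          # Cauchy-Schwarz
print(norm(u + v) <= norm(u) + norm(v))                # triangle inequality
print(np.isclose(norm(u + v)**2 + norm(u - v)**2,
                 2 * (norm(u)**2 + norm(v)**2)))       # parallelogram equality
```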
The Gram-Schmidt procedure
This procedure essentially allows you to transform a linearly independent list of vectors (v1, ..., vm) into an orthonormal list (e1, ..., em) of vectors in V, such that span(v1, ..., vj) = span(e1, ..., ej) for j = 1, ..., m.
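A minimal Python implementation of the procedure as described (pure numpy, assuming the input rows really are linearly independent):

```python
import numpy as np

def gram_schmidt(vectors):
    orthonormal = []
    for v in vectors:
        # Subtract the components of v along the already-built e's...
        for e in orthonormal:
            v = v - np.dot(v, e) * e
        # ...then normalize what is left.
        orthonormal.append(v / np.linalg.norm(v))
    return np.array(orthonormal)

vs = np.array([[1.0, 1.0, 0.0],
               [1.0, 0.0, 1.0],
               [0.0, 1.0, 1.0]])
es = gram_schmidt(vs)

print(np.allclose(es @ es.T, np.eye(3)))   # the e's are orthonormal
```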
If U is a subspace of V, then
V equals the direct sum of U and the orthogonal complement of U.
Orthogonal Decomposition
Goal: write u = av + w, where w is orthogonal to v (v ≠ 0).
Set w = u - av.
Require <av, u - av> = 0, because w is supposed to be orthogonal to av.
Solving this for a gives a = <u, v> / ||v||^2.
Substitute this value into u = av + (u - av).
This gives you an orthogonal decomposition.
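The steps above, carried out numerically for a pair of sample vectors (values arbitrary):

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([1.0, 1.0])

a = np.dot(u, v) / np.dot(v, v)   # a = <u, v> / ||v||^2
w = u - a * v

print(np.isclose(np.dot(v, w), 0.0))   # w is orthogonal to v
print(np.allclose(u, a * v + w))       # u = av + w
```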
Note: Originally wrote much more than this, but Mindomo deleted the branch and couldn't redo it so this is sufficient for now.
NOTE FOR PROF BARAKAT
I'm sorry if this mindmap is messy. After a while I grew tired of constantly switching between underscores and italics, and all of it quickly aggravated my carpal tunnel, so at some point much of the typing here is not necessarily the cleanest, but the fastest that was still easy enough for me to understand. After all, these maps were mainly for our own benefit in grasping the subject, weren't they?
Also, the boundaries were included to highlight what I believed to be the most fundamental or difficult topics we've gone over so I could quickly find them.
Sorry for any inconvenience
Chapter 5: Eigenvalues and Eigenvectors
Recall: An operator is a linear map from a vector space to itself.
Tu = λu
The scalar λ ∈ F is an eigenvalue of T ∈ L(V) when Tu = λu holds for some non-zero vector u.
The non-zero vector u ∈ V is called an eigenvector of T (corresponding to λ).
null(T - λI) = the set of eigenvectors of T corresponding to λ, together with the vector 0.
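A numpy sketch pairing each computed eigenvalue with its eigenvector and checking Tu = λu (example matrix arbitrary):

```python
import numpy as np

T = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(T)

for lam, u in zip(eigenvalues, eigenvectors.T):   # eigenvectors are the columns
    print(np.allclose(T @ u, lam * u))            # Tu = λu
```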
Matrices
Theorem: Every operator on a finite-dimensional, nonzero, complex vector space has an eigenvalue.
Upper Triangular
Theorem: Suppose V is a complex vector space and T ∈ L(V). Then T has an upper-triangular matrix with respect to some basis of V.
Corollary from Ch. 6: Given the above theorem, T also has an upper-triangular matrix with respect to some orthonormal basis of V.
2nd Corollary from Ch. 6: Suppose V is a complex vector space and T ∈ L(V). Then T has an upper-triangular matrix with respect to some orthonormal basis of V.
This corollary is essentially "cutting out the fat" of the previous two.
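Numerically, the Schur decomposition (computed here via scipy, a swapped-in computational tool rather than anything from the course) realizes this corollary: T = QUQ* with U upper triangular and the columns of Q an orthonormal basis:

```python
import numpy as np
from scipy.linalg import schur

T = np.array([[1.0, 2.0],
              [3.0, 4.0]])

U, Q = schur(T, output='complex')

print(np.allclose(np.tril(U, -1), 0))          # U is upper triangular
print(np.allclose(Q @ Q.conj().T, np.eye(2)))  # columns of Q are orthonormal
print(np.allclose(Q @ U @ Q.conj().T, T))      # T = Q U Q*
```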
Diagonal
A matrix with non-zero entries only along its diagonal and zeroes everywhere else