- A way of organizing data as vectors and matrices, and of performing computations on them.
- An ordered list of numbers, e.g. x = [2, 3]
- an arrow (geometrical)
- Dimensionality - 2D
- Length = $\sqrt{2^2 + 3^2} = \sqrt{13}$. Denoted as ||x|| (see the numpy sketch after this list)
- Unit vector = x / ||x||
- Zero vector = [0 0]
- Linear combination: $u = c_1 v_1 + c_2 v_2 + \dots + c_N v_N$ (the $c_i$ are the weights)
- Span of a set of vectors = set of all linear combinations of those vectors, e.g. Span{x, y} = R2 when x & y are linearly independent 2D vectors
- Linear Dependence - if one can be written as linear combination of others (if you can remove one and have the same span). Linearly Independent otherwise.
- 2D plane through 3D space != R2
- Basis - a set of linearly independent vectors that spans a space is a basis for that space. We can write any point in that space in terms of the basis. Standard basis for R2: [1, 0] & [0, 1]
- Dimensionality of the space - number of basis vectors
- Vector Space - set of vectors closed under linear combination. Any linear combination of any vectors in that space is also in that space.
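A minimal numpy sketch of the vector basics above (length/norm, unit vector, linear combination); the specific vectors and weights are just illustrative examples:

```python
import numpy as np

x = np.array([2, 3])

# Length (Euclidean norm): sqrt(2**2 + 3**2) = sqrt(13)
length = np.linalg.norm(x)

# Unit vector: same direction, length 1
x_hat = x / length

# Linear combination u = c1*v1 + c2*v2 (c1, c2 are the weights)
v1, v2 = np.array([1, 0]), np.array([0, 1])   # standard basis of R2
c1, c2 = 2, 3
u = c1 * v1 + c2 * v2                          # equals x

print(length, x_hat, u)
```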
- Dot (inner) product: $x \cdot y = \sum_{i=1}^{N} x_i y_i$
- Geometry of dot product
- $w \cdot r = \|w\| \, \|r\| \cos\theta$
- DP is largest when the vectors lie along the same direction, 0 when they are perpendicular, and lowest when the angle is 180°.
- For given lengths of vectors, DP is a measure of similarity of 2 vectors.
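A quick numpy check of the dot product and the cosine relation above (the example vectors are illustrative):

```python
import numpy as np

w = np.array([1.0, 2.0])
r = np.array([3.0, 1.0])

# Dot product: sum of element-wise products
dp = np.dot(w, r)   # same as w @ r

# Recover cos(theta) from w . r = ||w|| ||r|| cos(theta)
cos_theta = dp / (np.linalg.norm(w) * np.linalg.norm(r))

print(dp, cos_theta)
```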
- Matrices represent systems of linear equations.
- Matrix-vector multiplication: Wr = g, where W is the matrix and r & g are vectors. Each component of g is the dot product of the corresponding row of the matrix with the vector.
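A minimal numpy sketch of matrix-vector multiplication as row-wise dot products (W and r are made-up examples):

```python
import numpy as np

W = np.array([[1.0, 2.0],
              [3.0, 4.0]])
r = np.array([1.0, 1.0])

g = W @ r   # array([3., 7.])

# Each component of g is the dot product of the corresponding row of W with r
g_manual = np.array([np.dot(W[0], r), np.dot(W[1], r)])
print(g, g_manual)
```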
- Inverse: $r = W^{-1}g$
- $W^{-1} W = I$ (in code: W_inv @ W, where @ is the matrix multiplication operator)
- np.linalg.inv - inverse using numpy (see the sketch below)
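A small numpy sketch of inverting W and recovering r from g; np.linalg.solve is included as the usual numerically preferred alternative (the matrix and vector are illustrative):

```python
import numpy as np

W = np.array([[2.0, 0.0],
              [1.0, 3.0]])
g = np.array([4.0, 7.0])

W_inv = np.linalg.inv(W)
print(W_inv @ W)                  # ~ identity matrix I

r = W_inv @ g                     # recover r from g = W r
r_alt = np.linalg.solve(W, g)     # usually preferred over forming the inverse
print(r, r_alt)
```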
- Matrices transform the whole space - the origin stays the same, grid lines stay straight, and grid lines remain parallel & evenly spaced.
- All kinds of transformations - reflect around lines, expand or shrink horizontally, vertically or both, rotations.
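An illustrative sketch of a few such transformations applied to a point; the reflection, scaling, and rotation matrices are standard textbook examples, not taken from the notes:

```python
import numpy as np

p = np.array([1.0, 2.0])

reflect_x = np.array([[1.0,  0.0],
                      [0.0, -1.0]])          # reflect across the x-axis
stretch   = np.array([[2.0,  0.0],
                      [0.0,  0.5]])          # expand horizontally, shrink vertically

theta = np.pi / 4                            # 45 degree rotation
rotate = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])

for name, M in [("reflect", reflect_x), ("stretch", stretch), ("rotate", rotate)]:
    print(name, M @ p)                       # where the point lands after the transform
```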
- Range - all possible places the transformation can "get to" (e.g. all of 2D space)
- Rank - dimensionality of the range (e.g. 2)
- Null space - all points in space before the transformation that land on the origin after it. Null space dimensionality greater than zero means the transformation loses dimensions.
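A small numpy sketch contrasting a full-rank matrix with a rank-deficient one (the matrices are made up for illustration):

```python
import numpy as np

full = np.array([[1.0, 2.0],
                 [3.0, 4.0]])
flat = np.array([[1.0, 2.0],
                 [2.0, 4.0]])   # second row is a multiple of the first

print(np.linalg.matrix_rank(full))   # 2: the range is all of 2D space
print(np.linalg.matrix_rank(flat))   # 1: the plane is squashed onto a line

# Vectors along [2, -1] are sent to the origin by `flat`, so its null space
# has dimensionality 1 and the transformation loses a dimension.
print(flat @ np.array([2.0, -1.0]))  # ~ [0, 0]
```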
- When are matrices invertible?
- when you can always recover the vector before the transformation if you know where it lands after the transformation
- when dimensionality doesn't change.
- when matrix is square.
- when null space has dimensionality of zero and matrix is full rank.
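A hedged numpy sketch of the conditions above: a full-rank square matrix inverts cleanly, while a rank-deficient (singular) one raises an error (matrices are illustrative):

```python
import numpy as np

invertible = np.array([[2.0, 0.0],
                       [1.0, 3.0]])
singular   = np.array([[1.0, 2.0],
                       [2.0, 4.0]])   # rank 1, null space has dimensionality 1

print(np.linalg.inv(invertible))      # works: square, full rank, zero null space

try:
    np.linalg.inv(singular)
except np.linalg.LinAlgError as err:  # can't recover the pre-image: a dimension was lost
    print("not invertible:", err)
```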
- Eigenvectors - if a linear transformation by matrix W doesn't change the direction of a vector (except flipping it), that vector is an eigenvector of W.
- Eigenvalues tell you how much the eigenvectors change length. Eigenvalues can be negative, which causes the eigenvectors to flip.
- Matrix * eigenvector = eigenvalue * eigenvector, i.e. $Wv = \lambda v$ (see the numpy sketch after this list)
- Why do we care?
- Identify transformations from Eigen vectors
- Study discrete dynamical systems
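A minimal numpy check of $Wv = \lambda v$ using np.linalg.eig (the matrix is an arbitrary example):

```python
import numpy as np

W = np.array([[3.0, 1.0],
              [0.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(W)    # columns of eigvecs are the eigenvectors

for lam, v in zip(eigvals, eigvecs.T):
    print(W @ v, lam * v)              # the two sides match: W v = lambda v
```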
- Eigenvalues can be complex. Complex eigenvalues result in a specific type of dynamics - rotations.
- For a 3-neuron circuit:
- $|\lambda| = 1$: sustained rotation in 3D space
- $|\lambda| < 1$: rotation towards the origin
- $|\lambda| > 1$: rotation towards positive / negative $\infty$
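A rough sketch of rotational dynamics from complex eigenvalues; it uses a scaled 2D rotation matrix as a stand-in for the 3-neuron circuit above, so the setup is illustrative rather than the course's actual circuit:

```python
import numpy as np

# Discrete dynamical system x_{t+1} = W x_t with W a scaled 2D rotation matrix
# (a stand-in for the 3-neuron circuit; the qualitative behaviour is the same).
theta, scale = np.pi / 6, 0.95
W = scale * np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])

eigvals, _ = np.linalg.eig(W)
print(eigvals, np.abs(eigvals))        # complex pair with |lambda| = 0.95 < 1

# Iterating the dynamics: the state rotates while spiralling in towards the origin.
x = np.array([1.0, 0.0])
for _ in range(5):
    x = W @ x
    print(x, np.linalg.norm(x))
```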
- Matrix multiplication: G = W @ R
- G[row i, col j] = (row i of W) dot (col j of R)
- # W columns == # R rows
- G has shape (# W rows, # R columns)
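A quick numpy sketch of the shape rules above (the shapes are arbitrary examples):

```python
import numpy as np

W = np.random.rand(2, 3)     # 2 x 3
R = np.random.rand(3, 4)     # 3 x 4: # W columns must equal # R rows

G = W @ R
print(G.shape)               # (2, 4): (# W rows, # R columns)

# Entry (i, j) of G is the dot product of row i of W with column j of R
i, j = 0, 1
print(G[i, j], np.dot(W[i, :], R[:, j]))
```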