@mkmohangb
Last active May 23, 2023 03:31

Language of data

  • a way of organizing data as vectors and matrices.
  • performing computations on them.

Vectors

  1. ordered list of numbers X = [2, 3]
  2. an arrow (geometrical)
  3. Dimensionality - 2D
  4. Length = $\sqrt{2^2 + 3^2} = \sqrt{13}$. Denoted as $\|x\|$
  5. Unit vector = x / ||x||
  6. Zero vector = [0 0]
  7. Linear combination $u = c_1v_1 + c_2v_2 + \dots + c_Nv_N$ ($c_1 \dots c_N$ are the weights)
  8. Span of a set of vectors = Set of all linear combinations of those vectors e.g. Span{x,y} = R2
  9. Linear Dependence - if one can be written as linear combination of others (if you can remove one and have the same span). Linearly Independent otherwise.
  10. A 2D plane through 3D space != R2 (its points are still 3D vectors, even though 2 vectors span it)
  11. Basis - a set of linearly independent vectors that spans a space is a basis for that space. We can write any point in that space in terms of the basis. Standard basis for R2: [1, 0] & [0, 1]
  12. Dimensionality of the space - number of basis vectors
  13. Vector Space - set of vectors closed under linear combination. Any linear combination of any vectors in that space is also in that space.
  14. Dot (inner) product - $x \cdot y = \sum_{i=1}^{N} x_i y_i$
    • Geometry of dot product
      • $w \cdot r = \|w\| \|r\| \cos\theta$
      • DP is largest when the vectors lie along the same direction, 0 when they are perpendicular, and lowest (most negative) when the angle is 180°.
      • For given lengths of vectors, DP is a measure of similarity of 2 vectors.
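
A minimal NumPy sketch of the definitions above (the vectors here are illustrative choices):

```python
import numpy as np

x = np.array([2.0, 3.0])
y = np.array([-3.0, 2.0])   # chosen perpendicular to x

length = np.linalg.norm(x)          # ||x|| = sqrt(2**2 + 3**2) = sqrt(13)
unit = x / length                   # unit vector: same direction, norm 1
dot = np.dot(x, y)                  # 2*(-3) + 3*2 = 0 -> perpendicular
cos_theta = dot / (length * np.linalg.norm(y))   # cos of the angle between x and y

print(length, np.linalg.norm(unit), dot, cos_theta)
```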

Matrices

  1. Represent system of linear equations.

  2. Matrix vector multiplication - Wr = g where W is the matrix, r & g are vectors. Each component of g is dot product of corresponding row of matrix & the vector.

  3. Inverse - $r = W^{-1}g$.

  4. $W^{-1}$ @ $W = I$. (@ is the matrix multiplication operator)

  5. np.linalg.inv - inverse using numpy
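
A short sketch of recovering r from g with np.linalg.inv (W and g are made-up values; W is invertible since its determinant is nonzero):

```python
import numpy as np

W = np.array([[1.0, 2.0],
              [3.0, 5.0]])
g = np.array([7.0, 8.0])

W_inv = np.linalg.inv(W)    # inverse using numpy
r = W_inv @ g               # r = W^-1 g: the vector before the transformation

print(r)                                    # recovered vector
print(np.allclose(W @ r, g))                # W r lands back on g
print(np.allclose(W_inv @ W, np.eye(2)))    # W^-1 @ W = I
```

In practice `np.linalg.solve(W, g)` is preferred over forming the inverse explicitly; it is faster and numerically more stable.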

  6. Matrices transform the whole space - the origin stays the same; grid lines remain straight, parallel & evenly spaced.

  7. All kinds of transformations - reflections around lines, expansion or shrinking (horizontally, vertically or both), rotations.

  8. Range - all possible places transformation can "get to" (e.g. all of 2D space)

  9. Rank - dimensionality of range (e.g. 2)

  10. Null space - all possible places in space before transformation that result in origin after. Null space dimensionality greater than zero results in loss of dimensions in transformation.

  11. When are matrices invertible?

    • when you can always recover the vector before the transformation if you know where it lands after the transformation
      1. when dimensionality doesn't change.
      2. when matrix is square.
      3. when null space has dimensionality of zero and matrix is full rank.
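
A sketch of a non-invertible matrix (a made-up rank-1 example): its null space has dimension 1, so a whole line of inputs collapses onto the origin and the original vector cannot be recovered.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # second row = 2 * first row -> linearly dependent

print(np.linalg.matrix_rank(A))    # 1: not full rank for a 2x2 matrix
print(A @ np.array([2.0, -1.0]))   # [0. 0.]: a nonzero vector sent to the origin

try:
    np.linalg.inv(A)
except np.linalg.LinAlgError as e:
    print("not invertible:", e)    # numpy raises LinAlgError for singular matrices
```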
  12. Eigen vectors - vectors whose direction is not changed (except possibly flipped) by the linear transformation W are Eigen vectors of matrix W.

  13. Eigen Values tell you how much the Eigen vectors change length. Eigen values can be negative which causes Eigen vectors to flip.

  14. Matrix * Eigen vector = Eigen value * Eigen vector, i.e. $Wv = \lambda v$
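
Checking W v = lambda v numerically with np.linalg.eig (W here is a made-up diagonal example: stretch by 2 along x, flip along y):

```python
import numpy as np

W = np.array([[2.0, 0.0],
              [0.0, -1.0]])

eigvals, eigvecs = np.linalg.eig(W)   # eigvecs holds one eigenvector per column
for lam, v in zip(eigvals, eigvecs.T):
    print(lam, np.allclose(W @ v, lam * v))   # W v == lambda v for each pair
```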

  15. Why do we care?

    • Identify transformations from Eigen vectors
    • Study discrete dynamical systems
  16. Eigen values can be complex. Complex Eigen values result in a specific type of dynamics - rotations.

    • For a 3-neuron circuit,
      • | $\lambda$ | = 1, sustained rotation in 3D space
      • | $\lambda$ | < 1, rotation towards origin
      • | $\lambda$ | > 1, rotation towards +ve / -ve $\infty$
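
A 2D sketch of this idea (not the 3-neuron circuit itself): a rotation matrix scaled by 0.9 has complex eigenvalues with $|\lambda| = 0.9 < 1$, so iterating $r_{t+1} = Wr_t$ spirals toward the origin.

```python
import numpy as np

theta = np.pi / 6            # rotate 30 degrees per step
a = 0.9                      # |lambda| < 1 -> rotation toward the origin
W = a * np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

print(np.linalg.eigvals(W))           # complex pair a * e^{+-i theta}
print(np.abs(np.linalg.eigvals(W)))   # magnitudes: [0.9 0.9]

r = np.array([1.0, 0.0])
for _ in range(50):
    r = W @ r                # discrete dynamics: r_{t+1} = W r_t
print(np.linalg.norm(r))     # 0.9**50: shrunk toward the origin
```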
  17. Matrix Multiplication G = W @ R

    • G row i, col j = (W row i) dot product (R col j)
    • # W cols == # R rows
    • shape of G = (# W rows, # R cols)
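
A quick shape check of these rules (random matrices just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.random((3, 4))    # 3 rows, 4 cols
R = rng.random((4, 2))    # rows of R must match cols of W

G = W @ R
print(G.shape)            # (3, 2) = (# W rows, # R cols)

# entry (i, j) of G is the dot product of W's row i with R's col j
print(np.isclose(G[1, 0], np.dot(W[1, :], R[:, 0])))
```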