Hamees Sayed (hamees-sayed)

from karel.stanfordkarel import *

"""
Karel should fill the whole world with beepers. Solution for Stanford Code in Place 2024 Section Leader Exercise.
"""

def face_east():
    """
    Karel will face east independent of what the current direction is.
    """
    # Keep turning left until Karel faces east (covers north, west, and south).
    while not facing_east():
        turn_left()
hamees-sayed / lecture_transcript.txt
Created December 15, 2023 07:38
Hello, everyone, and welcome to this course
on Modern Application Development.
So, what we are going to do now is to have a sort
of lightning overview of the topic of JavaScript.
So I want to be very clear about one thing. This
is not meant to be a tutorial on JavaScript. It
is not sufficient to really get you to the point
where you can write code in JavaScript. I will not
even be showing you how to run a JavaScript
hamees-sayed / pca_notes.md
Created November 13, 2023 14:45
Notes for PCA

Principal Components Analysis

Principal Components Analysis (PCA) tries to identify the subspace in which the data approximately lies. PCA is computationally efficient: it requires only an eigenvector calculation. It is an unsupervised learning technique used for dimensionality reduction.

Imagine you have a dataset in three dimensions: height, weight, and age of a group of people. Each person in the dataset is a data point represented by a vector (height, weight, age). Now, PCA aims to find the directions in which the data varies the most.
In this case, let's say that height and weight have high variance compared to age. PCA would identify the directions (principal components) along which the data varies the most. These directions are orthogonal to each other.
Now, instead of representing each person in the original three-dimensional space, PCA allows you to project them onto a new subspace defined by these principal components. This new subspace retains the most important information.
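The recipe described above (center the data, eigendecompose the covariance matrix, project onto the top components) can be sketched in a few lines of NumPy. The height/weight/age dataset below is made up purely for illustration:

```python
import numpy as np

# Toy dataset: each row is a person (height cm, weight kg, age yrs).
X = np.array([
    [170.0, 70.0, 30.0],
    [160.0, 60.0, 31.0],
    [180.0, 85.0, 29.0],
    [175.0, 75.0, 30.0],
    [165.0, 62.0, 32.0],
])

# Center the data: PCA assumes zero-mean features.
Xc = X - X.mean(axis=0)

# Covariance matrix and its eigendecomposition.
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh returns eigenvalues in ascending order

# Reorder the principal components by decreasing variance explained.
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Project each person onto the top 2 principal components.
Z = Xc @ eigvecs[:, :2]
print(Z.shape)  # (5, 2)
```

The columns of `eigvecs` are the orthogonal directions mentioned above, and `Z` is the lower-dimensional representation that keeps the directions of highest variance.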