Ahmed Fasih (fasiha)

fasiha / hi.txt
Created Jul 7, 2022
gist for storing attachments
This is just a gist to let me attach images in comments for a URL.
fasiha / on-mim.md
Created Jun 25, 2022
A list of all onomatopoeic or mimetic words in JMDict (as of 2020-09-30)
  1. 「あっさり・アッサリ」| ① easily/readily/quickly/flatly (refuse) (onomatopoeic or mimetic word) ② lightly (seasoned food, applied make-up, etc.)/plainly/simply (onomatopoeic or mimetic word)
  2. 「あべこべ」| ① contrary/opposite/inverse/reverse/back-to-front (onomatopoeic or mimetic word)
  3. 「あやふや」| ① uncertain/vague/ambiguous (onomatopoeic or mimetic word)
  4. 「イジイジ・いじいじ」| ① hesitantly/timidly/diffidently (onomatopoeic or mimetic word)
  5. 「いそいそ・イソイソ」| ① cheerfully/excitedly (onomatopoeic or mimetic word)
  6. 「うじうじ・ウジウジ」| ① irresolute/hesitant (onomatopoeic or mimetic word)
  7. 「うじゃうじゃ・ウジャウジャ」| ① in swarms/in clusters (onomatopoeic or mimetic word) ② tediously/slowly (onomatopoeic or mimetic word)
  8. 「うずうず・ウズウズ」| ① itching to do something/impatient/sorely tempted/eager (onomatopoeic or mimetic word)
  9. 「うぞうぞ」| ① irrepressibly aroused (esp. sexually)/stimulated (onomatopoeic or mimetic word)
  10. 「うだうだ・ウダウダ」| ① going on and on (about inconsequential things)/talking nonsense (onomatopoeic or mimetic word) ②
fasiha / hav.py
Created Jun 17, 2022
Haversine and pseudo-Haversine (up to a constant) formulas
import numpy as np
from numpy import sqrt, sin, cos

asin = np.arcsin


def simp(rlat1, rlon1, rlat2, rlon2):
    # Pseudo-Haversine: the haversine of the central angle between two points,
    # given latitudes/longitudes in radians. Monotonic in great-circle distance,
    # so it can compare distances without the final asin/sqrt/radius step.
    dLat = rlat2 - rlat1
    dLon = rlon2 - rlon1
    return sin(dLat / 2)**2 + sin(dLon / 2)**2 * cos(rlat1) * cos(rlat2)
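The gist's title also promises the full Haversine formula, which isn't shown here, though the otherwise-unused `asin` and `sqrt` imports hint at it. A minimal sketch of what it would look like (the kilometer Earth-radius default is my assumption, not the gist's):

def haversine(rlat1, rlon1, rlat2, rlon2, radius=6371.0):
    # Great-circle distance; `radius` defaults to Earth's mean radius in km.
    # A reconstruction under stated assumptions, not the gist's original code.
    return 2 * radius * asin(sqrt(simp(rlat1, rlon1, rlat2, rlon2)))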
fasiha / filteredLengthAtLeast.ts
function filteredLengthAtLeast<T>(
  v: T[],
  filter: (x: T) => boolean,
  min: number,
): boolean {
  // Count matches, but stop scanning as soon as `min` of them are found.
  let nfound = 0;
  for (let i = 0; i < v.length && nfound < min; i++) {
    nfound += +filter(v[i]);
  }
  return nfound >= min;
}
fasiha / README.md

Recall:

  • 2.5% of the year corresponds to 9 days
  • 5% → 18 days
  • 10% → 37 days
  • 50% → 6 months
  • 90% → 47 weeks
  • 95% → 50 weeks
  • 97.5% → 51 weeks (a year has ~52 weeks)
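These landmarks are easy to regenerate; a quick Python sketch (assuming a 365-day year):

for pct in [2.5, 5, 10, 50, 90, 95, 97.5]:
    days = pct / 100 * 365  # fraction of a 365-day year, in days
    print(f"{pct}% → {days:.1f} days ≈ {days / 7:.1f} weeks")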
fasiha / pygments-jupyterlab.css
.highlight .hll {
  background-color: var(--jp-cell-editor-active-background);
}
.highlight {
  background: var(--jp-cell-editor-background);
  color: var(--jp-mirror-editor-variable-color);
}
fasiha / DavidOBowles-2020-10-05.md

@DavidOBowles

I’ll let you in on a secret. I have a doctorate in education, but the field’s basically just a 100 years old. We don’t really know what we’re doing. Our scholarly understanding of how learning happens is like astronomy 2000 years ago.

Most classroom practice is astrology.


Before the late 19th century, no human society had ever attempted to formally educate the entire populace. It was either aristocracy, meritocracy, or a blend. And always male.

fasiha / lexi_lambda.md
Created Oct 29, 2021
@lexi_lambda's Twitter thread on teaching programming using mainstream languages (I literally copy-pasted it because Twitter doesn't offer an easy viewing/export option)

https://mobile.twitter.com/lexi_lambda/status/1453866074106105864

§ Alexis King @lexi_lambda society if propositional and first-order logic were included in primary and secondary education mathematics curricula

§ Alexis King @lexi_lambda sometimes I wonder if this sort of ∀-vs-∃ confusion would be less pervasive if students were exposed to literally any formal logic in middle/high school, because first-order logic is very accessible, and it seems a lot more generally useful than, say, matmuls devoid of context
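(For context: the ∀-vs-∃ confusion above is typically about quantifier order. ∀x ∃y loves(x, y), "everyone loves someone", is very different from ∃y ∀x loves(x, y), "there is someone whom everyone loves". This example is mine, not from the thread.)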

fasiha / bivariateMonteCarloPosterior.py
Created Aug 23, 2021
Verifying that Monte Carlo posterior moments on scalar model parameters extend to the bivariate (and multivariate) parameter case. Obvious in retrospect but I am VERY VERY BAD at math.
"""
We know how to use Monte Carlo to find moments of the posterior
```
P(parameter | data) ∝ P(data | parameter) * P(parameter)
```
That is, `posterior ∝ likelihood * prior`.
We generate parameter draws from the prior, and for each draw assign a weight
equal to `P(data | parameter)`, since we have data. Then the weighted mean, weighted
variance, etc. give us the moments of the posterior.
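The code itself is cut off above; a minimal sketch of the recipe the docstring describes, using a made-up scalar Gaussian model (the model and names are illustrative, not the gist's):

import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(1.0, 0.5, size=20)        # synthetic observed data
draws = rng.normal(0.0, 2.0, size=100_000)  # parameter draws from the prior

# Weight each draw by its likelihood P(data | parameter), assuming data ~ N(parameter, 0.5):
weights = np.exp(-((data[None, :] - draws[:, None])**2 / (2 * 0.5**2)).sum(axis=1))

postMean = np.sum(weights * draws) / np.sum(weights)                 # weighted mean
postVar = np.sum(weights * (draws - postMean)**2) / np.sum(weights)  # weighted variance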