
Samay Kapadia sa-

sa- / german_tutor.txt
Last active September 6, 2024 18:33
German tutor w/ explanations system prompt for Sonnet 3.5
Act as an assistant for German conversation practice. Your task is to keep the conversation going.
Always answer in at most 3-5 sentences. Be relaxed and informal.
If the user says something in English, translate it into German and show it as a correction.
Whenever the user makes a mistake in a reply,
show the corrections in a section called "Korrekturen"
before answering in a section called "Antwort".
Here is an example of the structure when there are corrections.
"""
**Korrekturen**
@value
struct State[T: AnyRegType]:
    var _value: T

    fn get(self) -> T:
        return self._value

    fn set(inout self, value: T):
        self._value = value
sa- / clone
Created May 7, 2024 11:12
Clone script to ~/code/[org]/[repo]
#!/bin/sh

if [ "$#" -ne 1 ]; then
    echo "Usage: $0 [URL]"
    exit 1
fi

url="$1"
# derive the repo and org names from the clone URL (works for https and ssh forms)
repo_name=$(basename "$url" .git)
org_name=$(echo "$url" | sed -E 's/^.*[/:]([^/]+)\/[^/]+$/\1/')
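The preview stops before the clone step itself; the lines below are a hedged sketch of how the script might finish, assuming the ~/code/[org]/[repo] layout named in the description (the target path and the mkdir/clone calls are illustrative, not the gist's actual ending).

# assumed continuation, not shown in the gist preview
target="$HOME/code/$org_name/$repo_name"
mkdir -p "$(dirname "$target")"
git clone "$url" "$target"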
.PHONY: dev
dev:
	pip install -qU pip
	poetry config virtualenvs.in-project true
	poetry install --no-root
	poetry run pre-commit install
	poetry run pre-commit run -a

.PHONY: fmt
fmt:
sa- / gh_issue_dep_chart.py
Created April 9, 2023 11:01
Create a Mermaid dependency chart of GitHub issues
"""
initial prompt:
assuming github issues will have their first line in the body as "#dep: #1, #2, #4",
write a python script to get all open issues in a repo using the github rest API that
will parse this information and create a flowchart using mermaid.js to show the issue
dependencies. Also if there are no dependencies for an issue, add it to the flowchart
anyway. Remember to do a null check on the body before checking if a string exists
inside it. Use the requests library and not the github library
"""
sa- / Dockerfile
Created October 5, 2021 08:52
Spylon + Jupyter kernel gateway container
FROM python:3.8-slim
WORKDIR /tmp
ENV SPARK_HOME=/opt/spark-3.1.2-bin-hadoop3.2
# Spark, Julia
RUN apt update \
    && apt-get install -y \
        wget htop zsh git gnupg curl zip unzip vim cmake tmux sudo openjdk-11-jre \
    && wget https://dlcdn.apache.org/spark/spark-3.1.2/spark-3.1.2-bin-hadoop3.2.tgz \
sa- / parallel_broadcast.jl
Last active June 23, 2022 16:11
Parallel broadcast in Julia
function divide_range_into_max_n_chunks(r::UnitRange{Int64}, n::Int64)
    range_length = r.stop - r.start + 1  # +1 so both endpoints are counted
    chunk_size = Int64(ceil(range_length / n))
    return [i:min(i + chunk_size - 1, r.stop) for i in r.start:chunk_size:r.stop]
end
macro parallel_broadcast(col_a, broadcasted_operator, col_b)
    return (:macrocall, Symbol("@threads"), :(#= none:1 =#),
        (:for,
            (:(=), :subrange, (:call, :divide_range_into_n_chunks, (:call, :(:), 1, :df_length), (:call, :nthreads))),
            (:block,
                :(#= none:2 =#),
                (:(=), (:ref, :result_col, :subrange), (:call, broadcasted_operator, (:call, :view, col_a, :subrange), (:call, :view, col_b, :subrange)))
sa- / install_py37.sh
Last active January 31, 2020 12:56
Install Python 3.7 on a GCP AI notebook
# Install requirements
sudo apt-get install -y build-essential checkinstall libreadline-gplv2-dev libncursesw5-dev libssl-dev libsqlite3-dev tk-dev libgdbm-dev libc6-dev libbz2-dev zlib1g-dev openssl libffi-dev python3-dev python3-setuptools wget
sudo apt-get install liblzma-dev libssl-dev libncurses5-dev libsqlite3-dev libreadline-dev libtk8.5 libgdm-dev libdb4o-cil-dev libpcap-dev
# Prepare to build
mkdir /tmp/Python37
cd /tmp/Python37
# Install
sa- / pandas_ndcg.py
Created December 16, 2019 16:01
Compute ndcg@k from your dataset
# Compute NDCG @ k
def ndcg_at_k(predictions_df, k):
    """
    The `predictions_df` dataframe should contain the columns "customer_id",
    "estimate", and "label", where `estimate` is a recommendation score
    that we can sort in descending order.
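The preview cuts off inside the docstring; the sketch below shows how a per-customer NDCG@k could be computed with pandas and numpy over those columns, assuming graded (or binary) relevance in `label`. The grouping and DCG formula are common conventions, not necessarily the gist's exact implementation.

import numpy as np
import pandas as pd

def ndcg_at_k_sketch(predictions_df: pd.DataFrame, k: int) -> float:
    """Mean NDCG@k over customers; expects columns customer_id, estimate, label."""
    def dcg(relevances):
        relevances = np.asarray(relevances, dtype=float)[:k]
        if relevances.size == 0:
            return 0.0
        discounts = np.log2(np.arange(2, relevances.size + 2))  # log2(rank + 1)
        return float(np.sum(relevances / discounts))

    scores = []
    for _, group in predictions_df.groupby("customer_id"):
        # rank by the model's estimate, compare against the ideal ordering by label
        ranked = group.sort_values("estimate", ascending=False)["label"]
        ideal = group["label"].sort_values(ascending=False)
        idcg = dcg(ideal)
        scores.append(dcg(ranked) / idcg if idcg > 0 else 0.0)
    return float(np.mean(scores))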