Konstantinos Mouratidis (KMouratidis)
KMouratidis / Other benchmarks.txt
Created October 22, 2025 18:13
Beelink GTR9Pro (AI Max+ 395) benchmarks
https://www.reddit.com/r/LocalLLaMA/comments/1mqtnz7/comment/n8uhbp3/
https://www.reddit.com/r/LocalLLaMA/comments/1mqtnz7/comment/n8wnxzc/
https://forum.level1techs.com/t/strix-halo-ryzen-ai-max-395-llm-benchmark-results/233796
https://github.com/lhl/strix-halo-testing/tree/main/llm-bench/gpt-oss-120b-F16
If you have your own, I'll be happy to include them here! If you want me to run stuff so you can add to your own dashboards, happy to do that too.
KMouratidis / Godot-Funding.csv
Created October 2, 2024 19:40
"Godot's funding over time" code
Year,Month,Day,Income,Members,Sponsors
2023,7,11,19146,255,6
2023,7,12,19846,298,6
2023,7,16,23060,400,8
2023,7,18,23138,407,8
2023,7,19,23300,418,8
2023,7,20,25841,421,9
2023,7,23,25952,432,9
2023,9,12,25007,404,9
2023,9,13,27701,579,9
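As a minimal sketch of working with this data, the table can be read with plain awk, assuming it is saved as a comma-separated file named Godot-Funding.csv (the gist's filename); only a subset of the rows is embedded here:

```shell
# Sketch: re-create a subset of the table above (filename from the gist)
cat > Godot-Funding.csv <<'EOF'
Year,Month,Day,Income,Members,Sponsors
2023,7,11,19146,255,6
2023,9,13,27701,579,9
EOF
# Monthly income per member on the most recent row (27701 / 579)
awk -F, 'END { printf "%.2f\n", $4 / $5 }' Godot-Funding.csv
# → 47.84
```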
  1. Login as root (typically via SSH into the RPi).
  2. Get the required packages:
apt install -y libc6-dev zlib1g-dev libssl-dev libncurses5-dev \
  libsqlite3-dev libpcap-dev libjpeg-dev gcc make build-essential \
  libreadline-dev libtk8.5 libgdm-dev libdb4o-cil-dev libbz2-dev
  3. Get Python and move it to /usr/src (or wherever you like!):
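The truncated step above (fetching the Python source into /usr/src and building it) presumably looked something like the following sketch; the version (3.7.5, roughly current at the gist's date) and the configure flags are my assumptions, not from the original:

```shell
# Hedged sketch: the Python version and all flags below are assumptions
cd /usr/src
wget https://www.python.org/ftp/python/3.7.5/Python-3.7.5.tgz
tar -xzf Python-3.7.5.tgz
cd Python-3.7.5
./configure --enable-optimizations
make -j"$(nproc)"
# altinstall installs python3.7 without overwriting the system python
make altinstall
```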
KMouratidis / Spark_DietPi_Instructions.md
Created November 9, 2019 15:27
Rough list of steps to set up a Spark cluster on an array of Raspberry Pis
  1. Set up the master as a hotspot:
    1. Go to dietpi-software and install the wifi hotspot
    2. Go to dietpi-config / network options: adapters and configure it
    3. Restart
  2. Connect nodes to master's network: Go to dietpi-config / network options and set SSID and KEY to your hotspot's name and password.
  3. Install the OpenSSH client on all nodes (the master doesn't need it, and it can conflict): apt-get install openssh-client
  4. Download Spark and put it in /home/dietpi (or change the settings below): wget https://www-eu.apache.org/dist/spark/spark-2.4.4/spark-2.4.4-bin-hadoop2.7.tgz
  5. Copy it to the nodes (still compressed, to save bandwidth): scp /home/dietpi/spark-2.4.4-bin-hadoop2.7.tgz dietpi@192.168.42.10:/home/dietpi and scp /home/dietpi/spark-2.4.4-bin-hadoop2.7.tgz dietpi@192.168.42.11:/home/dietpi
  6. Get the JDK (v8, required by Spark 2.x) and Scala: sudo apt install -y openjdk-8-jdk-headless scala
  7. Extract it on all nodes: tar -zxvf spark-2.4.4-bin-hadoop2.7.tgz
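Once everything is extracted, the cluster still has to be pointed at itself and started. A hedged sketch of that wiring, assuming the master's hotspot address is 192.168.42.1 (an assumption; the worker IPs are the ones used in step 5) and that the master can SSH to the workers:

```shell
# Assumptions: SPARK_HOME matches the download location from step 4,
# the master's IP is 192.168.42.1, and passwordless SSH to the workers works.
export SPARK_HOME=/home/dietpi/spark-2.4.4-bin-hadoop2.7

# On the master: list the workers in conf/slaves (read by start-slaves.sh)
printf '192.168.42.10\n192.168.42.11\n' > "$SPARK_HOME/conf/slaves"

# Start the master, then launch a worker on every host in conf/slaves
"$SPARK_HOME/sbin/start-master.sh"
"$SPARK_HOME/sbin/start-slaves.sh"

# The web UI at http://192.168.42.1:8080 should now list both workers
```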