Sasha Syrotenko (Fly-Style)
@Fly-Style
Fly-Style / llm-wiki.md
Created April 14, 2026 08:15 — forked from karpathy/llm-wiki.md
LLM Wiki

A pattern for building personal knowledge bases using LLMs.

This is an idea file: it is designed to be copy-pasted into your own LLM agent (e.g. OpenAI Codex, Claude Code, OpenCode / Pi, etc.). Its goal is to communicate the high-level idea; your agent will build out the specifics in collaboration with you.

The core idea

Most people's experience with LLMs and documents looks like RAG: you upload a collection of files, the LLM retrieves relevant chunks at query time, and generates an answer. This works, but the LLM is rediscovering knowledge from scratch on every question. There's no accumulation. Ask a subtle question that requires synthesizing five documents, and the LLM has to find and piece together the relevant fragments every time. Nothing is built up. NotebookLM, ChatGPT file uploads, and most RAG systems work this way.

{
"name": "Nuffle Labs",
"website": "https://nuff.tech",
"description": "Nuffle Labs",
"logo": "https://nuff.tech/wp-content/uploads/2024/05/image-8.png",
"twitter": "https://twitter.com/nufflelabs"
}

Contributor Agreement

All contributors to ChakraCore must digitally sign the below agreement unless their contribution is done on behalf of or licensed to Microsoft and covered by the MIT license in LICENSE.txt

To sign the agreement, please submit a commit that adds your name and GitHub username to the bottom of this file.

Anything I contribute to the ChakraCore repository comprises one or more of:

  1. My own work
  2. Existing code from the ChakraCore repository
@Fly-Style
Fly-Style / cv.tex
Last active September 19, 2020 12:24
My CV
\documentclass[16pt,a3paper,roman]{moderncv}
% external packages
% \usepackage[12pt]{extsizes}
% modern themes
\moderncvstyle{banking}
\moderncvcolor{green}
\nopagenumbers{}
@Fly-Style
Fly-Style / latency.txt
Created January 4, 2020 18:20 — forked from jboner/latency.txt
Latency Numbers Every Programmer Should Know
Latency Comparison Numbers (~2012)
----------------------------------
L1 cache reference                           0.5 ns
Branch mispredict                            5   ns
L2 cache reference                           7   ns                      14x L1 cache
Mutex lock/unlock                           25   ns
Main memory reference                      100   ns                      20x L2 cache, 200x L1 cache
Compress 1K bytes with Zippy             3,000   ns        3 us
Send 1K bytes over 1 Gbps network       10,000   ns       10 us
Read 4K randomly from SSD*             150,000   ns      150 us          ~1GB/sec SSD
@Fly-Style
Fly-Style / CLA.md
Last active November 18, 2019 14:43

DataKernel Contributor License Agreement

Thank you for your interest in DataKernel.

In order to clarify the intellectual property license granted with Contributions from any person or entity, the Company must have a Contributor License Agreement ("CLA") on file that has been signed by each Contributor, indicating agreement to the license terms below. This license is for your protection as a Contributor as well as the protection of the Company and its users; it does not change your rights to use your own Contributions for any other purpose.

You accept and agree to the following terms and conditions for Your present and future Contributions submitted to the Company. In return, the Company shall not use Your Contributions in a way that is contrary to the public benefit or inconsistent with its bylaws in effect at the time of the Contribution. Except for the license granted herein to the Company and recipients of software distributed by the Company, You reserve all right, title, and interest in and to Your Contributions.

// Longest common prefix of a list of strings.
string longestCommonPrefix(vector<string>& strs) {
    if (strs.empty()) return "";
    string s = strs[0];
    for (size_t i = 1; i < strs.size(); ++i) {
        for (size_t j = 0; j < s.size(); ++j) {
            // Truncate the candidate prefix at the first mismatch,
            // or when the current string is shorter than the prefix.
            if (j >= strs[i].size() || strs[i][j] != s[j]) {
                s = s.substr(0, j);
                break;
            }
        }
    }
    return s;
}

// Move all zeroes to the right, preserving the order of non-zero elements.
void move_zeroes(vector<int>& arr) {
    size_t i = 0;
    for (auto el : arr) {
        if (el != 0) {
            arr[i++] = el;
        }
    }
    for (size_t k = i; k < arr.size(); ++k) {
        arr[k] = 0;
    }
}