Tim Pulver (timpulver)

@OrionReed
OrionReed / dom3d.js
Last active May 7, 2024 01:36
3D DOM viewer, copy-paste this into your console to visualise the DOM topographically.
// 3D DOM viewer, copy-paste this into your console to visualise the DOM as a stack of solid blocks.
// You can also minify and save it as a bookmarklet (https://www.freecodecamp.org/news/what-are-bookmarklets/)
(() => {
  const SHOW_SIDES = false; // color sides of DOM nodes?
  const COLOR_SURFACE = true; // color tops of DOM nodes?
  const COLOR_RANDOM = false; // randomise color?
  const COLOR_HUE = 190; // hue in HSL (https://hslpicker.com)
  const MAX_ROTATION = 180; // set to 360 to rotate all the way round
  const THICKNESS = 20; // thickness of layers
  const DISTANCE = 10000; // ¯\\_(ツ)_/¯
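The note above mentions saving the script as a bookmarklet; a minimal sketch of that wrapper is shown here, assuming the (elided) script body is minified first. The placeholder comment is not part of the gist.
javascript:(() => { /* paste the minified dom3d.js body here */ })();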
@wilsonowilson
wilsonowilson / settings.json
Created January 2, 2024 14:56
Minimal VS Code w/ APC extension
{
  "workbench.colorTheme": "Aura Dark",
  "workbench.iconTheme": "material-icon-theme",
  "editor.fontFamily": "'Geist Mono', Menlo, Monaco, 'Courier New', monospace",
  "apc.listRow": {
    "height": 24,
    "fontSize": 11
  },
  "window.titleBarStyle": "native",
  "apc.font.family": "Geist Mono",
@kamilogorek
kamilogorek / _screenshot.md
Last active May 2, 2024 13:48
Clutter-free VS Code Setup
[Screenshot: clutter-free VS Code setup]
@adrienbrault
adrienbrault / llama2-mac-gpu.sh
Last active April 22, 2024 08:47
Run Llama-2-13B-chat locally on your M1/M2 Mac with GPU inference. Uses 10GB RAM. UPDATE: see https://twitter.com/simonw/status/1691495807319674880?s=20
# Clone llama.cpp
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
# Build it
make clean
LLAMA_METAL=1 make
# Download model
export MODEL=llama-2-13b-chat.ggmlv3.q4_0.bin
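A hedged sketch of the typical next steps (the preview above is truncated; the download URL and run flags are assumptions based on llama.cpp conventions at the time, not necessarily the gist's exact commands):
# Download a quantized GGML build of the model (source URL is an assumption)
wget "https://huggingface.co/TheBloke/Llama-2-13B-chat-GGML/resolve/main/$MODEL"
# Run it with Metal GPU offload; -ngl 1 moves the layers onto the Apple Silicon GPU
./main -m "./$MODEL" -t 8 -n 256 -ngl 1 -p "Hello, how are you?"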
@josephrocca
josephrocca / e5-large-v2.js
Last active October 25, 2023 01:29
e5-large-v2 text embedding model in JavaScript using Transformers.js
// See the comments at the end for a model that does much better than e5-large-v2 while being a third of the size.
let { pipeline } = await import('https://cdn.jsdelivr.net/npm/@xenova/transformers@2.7.0');
let extractor = await pipeline('feature-extraction', 'Xenova/e5-large-v2');
// Note: If you're just comparing "passages" with one another, then just prepend "passage: " to all texts. Only use "query: " if the text is a short "search query" like in the above example.
let passage1 = await extractor(`passage: The Shawshank Redemption is a true masterpiece of cinema.`, { pooling: 'mean', normalize: true });
let passage2 = await extractor(`passage: The film should not be exposed to sunlight when removing it from the wrapper. Otherwise your movie will come out bad.`, { pooling: 'mean', normalize: true });
let query = await extractor(`query: movie review`, { pooling: 'mean', normalize: true });
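A hedged follow-up, not part of the gist preview: because the embeddings above are normalized, cosine similarity reduces to a dot product over each output's underlying data array (Transformers.js exposes it as .data).
// Dot product of two normalized embeddings = cosine similarity
const dot = (a, b) => { let s = 0; for (let i = 0; i < a.length; i++) s += a[i] * b[i]; return s; };
console.log('query vs passage1:', dot(query.data, passage1.data));
console.log('query vs passage2:', dot(query.data, passage2.data));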
@rain-1
rain-1 / LLM.md
Last active May 5, 2024 07:13
LLM Introduction: Learn Language Models

Purpose

Bootstrap knowledge of LLMs ASAP, with a bias/focus toward GPT.

Avoid being a link dump; try to provide only valuable, well-tuned information.

Prelude

Neural network links to read before starting with transformers.

@Quasimondo
Quasimondo / sd15_vae_merge.py
Created October 22, 2022 15:23
Quick script to merge finetuned StabilityAI autoencoder into RunwayML Stable Diffusion 1.5 checkpoint
import torch
#USE AT YOUR OWN RISK
#local path to runwayML SD 1.5 checkpoint (https://huggingface.co/runwayml/stable-diffusion-v1-5)
ckpt_15 = "./v1-5-pruned-emaonly.ckpt"
#local path to StabilityAI finetuned autoencoder (https://huggingface.co/stabilityai/sd-vae-ft-mse)
ckpt_vae = "./vae-ft-mse-840000-ema-pruned.ckpt"
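A hedged sketch of how the merge itself could proceed (the preview above is truncated; the checkpoint layout, loss-key filter, and output path are assumptions based on how SD 1.x checkpoints store the autoencoder under the first_stage_model prefix):
# Load both checkpoints on the CPU (assumed layout: {"state_dict": {...}})
sd = torch.load(ckpt_15, map_location="cpu")
vae = torch.load(ckpt_vae, map_location="cpu")["state_dict"]
# Copy the VAE weights into the SD checkpoint, skipping any training-loss keys
for k, v in vae.items():
    if not k.startswith("loss."):
        sd["state_dict"]["first_stage_model." + k] = v
torch.save(sd, "./v1-5-pruned-emaonly-vae-ft-mse.ckpt")  # hypothetical output filename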
# This is a sample Python script.
# Press ⌃R to execute it or replace it with your code.
# Press Double ⇧ to search everywhere for classes, files, tool windows, actions, and settings.
import asyncio
from pprint import pprint

import aiohttp

event_loop = None
@ries9112
ries9112 / CV_pull_TheGraph
Created June 29, 2021 11:53
CV pull data using TheGraph
// add script to a button inside Cryptovoxels to return a response
feature.on('click', e => {
  // Use GraphQL endpoint to pull data and make model that takes x,y Decentraland coordinates and returns prediction based on the latest 10 sales!
  fetch('https://api.thegraph.com/subgraphs/name/decentraland/marketplace', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
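A hedged sketch of how the truncated request could be completed (the query string and response handling are placeholders, not the gist's code):
    },
    body: JSON.stringify({
      query: '{ ... }' // placeholder: the actual query for the latest 10 sales is elided
    })
  })
    .then(res => res.json())
    .then(({ data }) => console.log(data)); // e.g. feed the returned prices into the prediction model
});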
@ries9112
ries9112 / gist:450f9309cbb25768298d698d31e69537
Created June 6, 2021 16:48
Use this query to find all objkts in your collection - sorted by highest sales (both primary and secondary market). Replace 'String = "tz1c...' with your own Tezos wallet.
query findSecondarySales($address: String = "tz1c2iwyckUCcicx2qxqtwLEartYFEHg1pvB") {
  hic_et_nunc_swap(where: {status: {_in: [1]}, token: {swaps: {trades: {buyer: {address: {_eq: $address}}}}}}, order_by: {price: desc}) {
    price
    status
    token {
      title
      mime
      description
      id
      artifact_uri