ivanfioravanti / gist:bcacc48ef68b02e9b7a4034161824287
Created January 25, 2024 00:01
Fine-tuning dataset generation with the Ollama Python library
import json
import os

import ollama

# Query a local Ollama model and return just the generated text.
def query_ollama(prompt, model='openhermes:7b-mistral-v2.5-q6_K', context=''):
    response = ollama.generate(
        model=model,
        prompt=context + prompt)
    return response['response'].strip()
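The part of the gist that actually assembles the fine-tuning dataset is not shown above. A minimal sketch of how query_ollama could be used for that follows; the topic list, the prompt wording, and the train.jsonl prompt/completion format are illustrative assumptions, not taken from the gist.

# Illustrative sketch only: topics, prompts, and output format are assumptions.
topics = ['data structures', 'recursion', 'unit testing']

with open('train.jsonl', 'w') as f:
    for topic in topics:
        question = query_ollama(f'Write a short programming question about {topic}.')
        answer = query_ollama(question)
        # one JSONL record per generated prompt/completion pair
        f.write(json.dumps({'prompt': question, 'completion': answer}) + '\n')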
schacon / better-git-branch.sh
Created January 13, 2024 18:41
Better Git Branch output
#!/bin/bash
# Colors
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[0;33m'
BLUE='\033[0;34m'
NO_COLOR='\033[0m'
spullara / chat
Last active September 12, 2025 17:37
Use this command to get suggestions on how to do things on the command line.
#!/bin/bash
# Paste your API key from https://platform.openai.com/account/api-keys
TOKEN=< OpenAI token from https://platform.openai.com/account/api-keys >
PROMPT="You are the best at writing shell commands. Assume the OS is Mac OS. I want you to respond with only the shell commands separated by semicolons and no commentary. Here is what I want to do: $@"
RESULT=`curl -s https://api.openai.com/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -H "Authorization: Bearer $TOKEN" \
  -d "{
    \"model\": \"gpt-4o\",
    \"messages\": [{\"role\": \"user\", \"content\": \"$PROMPT\"}]
  }" | jq '.choices[] | .message.content' -r`
echo "$RESULT"
Enichan / main.c
Last active January 19, 2025 17:27
Coroutines for C using Duff's Device based on Simon Tatham's coroutines
// coroutines for C adapted from Simon Tatham's coroutines
// https://www.chiark.greenend.org.uk/~sgtatham/coroutines.html
//
// this implementation utilizes __VA_ARGS__ and __COUNTER__
// to avoid a begin/end pair of macros.
#include <stdlib.h>
#include <stdio.h>
#include <stdbool.h>
// modify this if you don't like the `self->name` format
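The gist's own macros are cut off here. As a point of reference, below is a separate, self-contained sketch of the classic Duff's-device coroutine pattern from Simon Tatham's article that the gist adapts. It keeps state in a static local and uses __LINE__ rather than the gist's struct-held state (the `self->name` form) and __COUNTER__, so treat it as an illustration of the underlying trick, not the gist's actual implementation.

// Minimal sketch of Tatham-style coroutines (not the gist's macros):
// the switch re-enters at the case label recorded by the last crReturn.
#include <stdio.h>

#define crBegin     static int crState = 0; switch (crState) { case 0:
#define crReturn(x) do { crState = __LINE__; return (x); case __LINE__:; } while (0)
#define crFinish    }

// Yields 1, 2, 3 on successive calls, then -1 once exhausted.
int counter(void) {
    crBegin;
    crReturn(1);
    crReturn(2);
    crReturn(3);
    crFinish;
    return -1;
}

int main(void) {
    for (int i = 0; i < 4; i++)
        printf("%d\n", counter()); // prints 1 2 3 -1
    return 0;
}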