Anton Bacaj (abacaj)
💭 Writing more code
@abacaj
abacaj / long_gpt.py
Created April 30, 2023 21:16 — forked from NaxAlpha/long_gpt.py
Training script for LongGPT; fine-tunes GPT-2 (335M) on The Pile dataset with a context size of 8k tokens (requires > 16 GB RAM)
import time
from contextlib import suppress
import torch
import torch.nn as nn
import torch.optim as optim
import torch.nn.functional as F
import torch.backends.cuda as cuda
from torch.utils.data import DataLoader, IterableDataset
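The preview above shows the imports of a long-context training script that streams The Pile through a `DataLoader`/`IterableDataset`. A minimal pure-Python sketch of the chunking such a trainer needs, yielding fixed-length next-token-prediction windows from a token stream (`token_stream` and the 8k default are illustrative assumptions, not the gist's actual loader, which works with torch tensors):

```python
def token_windows(token_stream, context_size=8192):
    """Yield (input, target) windows of `context_size` tokens from a stream.

    Hypothetical stand-in for the gist's Pile loader: consumes context_size + 1
    tokens at a time so the target can be the input shifted by one position.
    """
    buffer = []
    for token in token_stream:
        buffer.append(token)
        if len(buffer) == context_size + 1:
            # Next-token prediction: input is tokens[:-1], target is tokens[1:].
            yield buffer[:-1], buffer[1:]
            buffer = []


# Toy usage: a 25-token stream with an 8-token context yields two full windows.
windows = list(token_windows(range(25), context_size=8))
print(len(windows))  # 2
```

In the gist this logic would sit inside an `IterableDataset.__iter__` and emit `torch.long` tensors instead of lists.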
@abacaj
abacaj / fused_false.jsonl
Created September 27, 2023 01:18
AutoAWQ fusing
{"completion": "# Write a python function to loop to 1000\n\ndef loop_to_1000():\n for i in range(1000):\n print(i)\n\n\nloop_to_1000()\n"}
{"completion": "# Write a python function to loop to 1000\n\ndef loop_to_1000():\n for i in range(1000):\n print(i)\n\n\nloop_to_1000()\n"}
{"completion": "# Write a python function to loop to 1000\n\ndef loop_to_1000():\n for i in range(1000):\n print(i)\n\n\nloop_to_1000()\n"}
{"completion": "# Write a python function to loop to 1000\n\ndef loop_to_1000():\n for i in range(1000):\n print(i)\n\n\nloop_to_1000()\n"}
{"completion": "# Write a python function to loop to 1000\n\ndef loop_to_1000():\n for i in range(1000):\n print(i)\n\n\nloop_to_1000()\n"}
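The five identical JSONL lines above suggest a determinism check between AutoAWQ's fused and unfused paths: repeated generations are stored one JSON object per line under a `"completion"` key. A small sketch of how such a file might be parsed and compared (the loading workflow is an assumption, not code from the gist):

```python
import json


def load_completions(lines):
    """Parse JSONL lines and return the "completion" field of each record."""
    return [json.loads(line)["completion"] for line in lines if line.strip()]


# Two records standing in for lines of fused_false.jsonl:
jsonl = [
    '{"completion": "def loop_to_1000():\\n    for i in range(1000):\\n        print(i)\\n"}',
    '{"completion": "def loop_to_1000():\\n    for i in range(1000):\\n        print(i)\\n"}',
]
completions = load_completions(jsonl)
# Generations are deterministic if every completion is byte-identical.
deterministic = len(set(completions)) == 1
print(deterministic)  # True
```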
@abacaj
abacaj / humaneval_m7x8.jsonl
Created December 9, 2023 01:04
Results from running "mistral-8x7B" on HumanEval (code benchmark)
{"task_id": "HumanEval/0", "prompt": "from typing import List\n\n\ndef has_close_elements(numbers: List[float], threshold: float) -> bool:\n \"\"\" Check if in given list of numbers, are any two numbers closer to each other than\n given threshold.\n >>> has_close_elements([1.0, 2.0, 3.0], 0.5)\n False\n >>> has_close_elements([1.0, 2.8, 3.0, 4.0, 5.0, 2.0], 0.3)\n True\n \"\"\"\n", "canonical_solution": " for idx, elem in enumerate(numbers):\n for idx2, elem2 in enumerate(numbers):\n if idx != idx2:\n distance = abs(elem - elem2)\n if distance < threshold:\n return True\n\n return False\n", "test": "\n\nMETADATA = {\n 'author': 'jt',\n 'dataset': 'test'\n}\n\n\ndef check(candidate):\n assert candidate([1.0, 2.0, 3.9, 4.0, 5.0, 2.2], 0.3) == True\n assert candidate([1.0, 2.0, 3.9, 4.0, 5.0, 2.2], 0.05) == False\n assert candidate([1.0, 2.0, 5.9, 4.0, 5.0], 0.95) == True\n assert candidate([1.0, 2.0,
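Each HumanEval record pairs a `prompt` with a `canonical_solution` and a `test` block whose `check()` asserts against the candidate function. A minimal sketch of the evaluation harness, using a shortened version of the `has_close_elements` task above (a real evaluation also needs sandboxing and timeouts, omitted here):

```python
from typing import List


def run_humaneval_task(prompt, completion, test_code, entry_point):
    """Execute prompt + completion, then run the task's check() on the result."""
    env = {}
    exec(prompt + completion, env)   # defines the candidate function
    exec(test_code, env)             # defines check()
    env["check"](env[entry_point])   # raises AssertionError if the candidate fails
    return True


# Abbreviated prompt and solution for HumanEval/0 (docstring trimmed):
prompt = (
    "from typing import List\n\n"
    "def has_close_elements(numbers: List[float], threshold: float) -> bool:\n"
)
completion = (
    "    for i, a in enumerate(numbers):\n"
    "        for j, b in enumerate(numbers):\n"
    "            if i != j and abs(a - b) < threshold:\n"
    "                return True\n"
    "    return False\n"
)
test_code = (
    "def check(candidate):\n"
    "    assert candidate([1.0, 2.0, 3.9, 4.0, 5.0, 2.2], 0.3) == True\n"
    "    assert candidate([1.0, 2.0, 3.9, 4.0, 5.0, 2.2], 0.05) == False\n"
)
print(run_humaneval_task(prompt, completion, test_code, "has_close_elements"))  # True
```

The benchmark's pass@k score is then the fraction of tasks for which at least one of k sampled completions passes its `check()`.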
@abacaj
abacaj / streaming_tokens.py
Last active April 8, 2024 14:11
Stream HF transformer token generation
from queue import Queue
from threading import Thread
import transformers
import torch
class TextIteratorStreamer:
    def __init__(self, tokenizer):
        self.tokenizer = tokenizer
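The preview cuts off after `__init__`, but the imports show the pattern the gist builds on: generation runs in a worker `Thread` and pushes tokens into a `Queue`, while the caller iterates over them as they arrive. A self-contained sketch of that producer/consumer pattern, with `fake_generate` standing in for `model.generate(..., streamer=...)` (this is an illustrative simplification, not the gist's implementation):

```python
from queue import Queue
from threading import Thread


class SimpleTokenStreamer:
    _DONE = object()  # sentinel marking end of generation

    def __init__(self):
        self.queue = Queue()

    def put(self, token):
        # Called by the generating thread for each new token.
        self.queue.put(token)

    def end(self):
        # Called once generation finishes.
        self.queue.put(self._DONE)

    def __iter__(self):
        # Consumed by the caller; blocks until the next token arrives.
        while True:
            item = self.queue.get()
            if item is self._DONE:
                return
            yield item


def fake_generate(streamer):
    # Stand-in for model.generate() pushing decoded tokens as they are produced.
    for tok in ["Hello", ",", " world"]:
        streamer.put(tok)
    streamer.end()


streamer = SimpleTokenStreamer()
Thread(target=fake_generate, args=(streamer,)).start()
result = "".join(streamer)
print(result)  # Hello, world
```

Hugging Face transformers now ships a `TextIteratorStreamer` class built on the same idea; gists like this one predate or wrap that API.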