Oliver Smith kfsone

@kfsone
kfsone / preceeding python
Created February 6, 2024 23:06
I DONT WANNA WRITE COMMENTS
Write a Rust program that counts the number of non-whitespace characters outside of C++-style comments in files. Count stdin unless file name arguments are supplied. Output only the total count, unless "--verbose" or "-v" is specified as an argument, in which case list the count per file, followed by the total.
Run with no arguments and 0 characters: output = "0"
Run with "-v", 0 files, 0 characters: output = "<stdin>: 0"
Run with 0-byte files file1, file2, file3: output = "0"
Run with 0-byte files file1, file2, file3, and "-v": output = "file1: 0
file2: 0
file3: 0
total: 0"
kfsone / gist:fd27109635ede39d50b31115f3932f04
Created February 5, 2024 21:38
Autogen exercise using autobuild to write rust
building_task = """
Generate a set of agents to design, develop, document and test a complete rust program minus version control.
Ensure there is an agent who writes any generated command/configuration/scripts to disk, and a separate, dedicated executor agent who runs commands via docker, so that the agents cannot simply pretend to be writing code.
"""
agent_list, agent_configs = builder.build(building_task, llm_config)
kfsone / research.py
Created December 1, 2023 22:48
Product-Research-Agent-Team
import autogen
from textwrap import dedent
import os
try:
    del os.environ['OPENAI_API_BASE']
except KeyError:
    pass
# Assign via os.environ, not os.putenv(): putenv() changes the C-level
# environment but does not update os.environ, so later reads miss the value.
os.environ["OPENAI_API_BASE"] = "http://localhost:5000"
(venv) root@b7e76e6d82d8:/workspace# echo $OPENAI_API_BASE
http://localhost:11434/
(venv) root@b7e76e6d82d8:/workspace# litellm --test --port ${LITELLM_PORT}
LiteLLM: Making a test ChatCompletions request to your proxy
An error occurred: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: none. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
Debug this by setting `--debug`, e.g. `litellm --model gpt-3.5-turbo --debug`
INFO: 127.0.0.1:35604 - "POST /chat/completions HTTP/1.1" 200 OK
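One likely contributor to the stale URL in the transcript above is that `os.putenv()` bypasses the `os.environ` mapping that Python code and libraries actually read (the variable names below are arbitrary demo names):

```python
import os

# os.putenv() writes to the C-level environment only; the os.environ
# mapping consulted by Python code is snapshotted at startup and not updated.
os.putenv("DEMO_PUTENV_VAR", "via-putenv")
assert os.environ.get("DEMO_PUTENV_VAR") is None

# Assigning through os.environ updates the mapping AND calls putenv()
# under the hood, so child processes inherit the value too.
os.environ["DEMO_ENVIRON_VAR"] = "via-environ"
assert os.environ["DEMO_ENVIRON_VAR"] == "via-environ"
```

Note also that neither call can change the parent shell's environment, so the `echo $OPENAI_API_BASE` showing `http://localhost:11434/` reflects whatever was exported before Python ran.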
kfsone / Dockerfile
Last active November 16, 2023 06:53
WIP Dockerfile for litellm + ollama + memgpt + autogen + jupyter
# Work in progress. V0.1
# ALL THE THINGS.
# NOTE: ARGs declared before the first FROM are global defaults only; any
# build stage that uses one must re-declare it (e.g. `ARG APT_PROXY`)
# after its FROM line.
ARG APT_PROXY #=http://apt-cacher-ng.lan:3142/
ARG PIP_INDEX_URL #=http://devpi.lan:3141/root/pypi/+simple
ARG PIP_TRUSTED_HOST #=devpi.lan
ARG JUPYTER_PORT=37799
ARG LITELLM_PORT=11111
FROM nvidia/cuda:11.8.0-devel-ubuntu22.04 AS build-llama
kfsone / morpheddircopy.py
Created November 2, 2023 18:55
MorphedDirCopy.py
# Credit: AutoGPT using OpenAI gpt4
import argparse
import json
import os
import random
import shutil
from pathlib import Path
from random import randint
kfsone / BuildFromWxFromSource.ps1
Created May 24, 2023 21:09
Build wxWidgets by fetching from source.
<#
.Synopsis
CMake-Build produces reproducible builds of wxWidgets binaries.
.Description
The CMake-Build script can fetch the latest revision, or a specific hash,
of the wxWidgets library source from the official GitHub repository, or it
can use an existing local source directory.
It will configure it for Nuo use, build the binaries, and then
install them in-tree to your Nuo checkout.
kfsone / checklist.py
Last active September 7, 2023 14:52
ChecklistTracker for rich.
from __future__ import annotations
from typing import List, Optional
from rich.console import Console, ConsoleOptions, RenderableType
from rich.live import Live
from rich.spinner import Spinner
from rich.table import Table
from rich.text import Text
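The preview above is truncated to its imports. As an illustration of the idea only (not the gist's actual API; the class and method names here are assumptions), the bookkeeping half of a checklist tracker can be sketched without rich, with `render()` standing in for the Table/Spinner output that rich's `Live` would redraw:

```python
from dataclasses import dataclass, field
from typing import List, Set

# Hypothetical sketch: tracks which checklist items are complete and
# renders a plain-text stand-in for rich's live table.
@dataclass
class ChecklistTracker:
    items: List[str]
    done: Set[str] = field(default_factory=set)

    def complete(self, item: str) -> None:
        if item not in self.items:
            raise KeyError(item)
        self.done.add(item)

    def render(self) -> str:
        # The rich version would emit checkmarks and spinners instead.
        return "\n".join(
            f"[{'x' if item in self.done else ' '}] {item}"
            for item in self.items
        )

tracker = ChecklistTracker(["fetch", "build", "test"])
tracker.complete("fetch")
```

In the rich version, a renderable like this would be handed to `rich.live.Live` so the checklist refreshes in place as items complete.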