Ed Sweeney (navicore)

@navicore
navicore / llm_blog_post.md
Last active September 4, 2025 18:11
LLMs and Why Not to Trust Them

Convincing Isn’t Correct: Why You Must Fact-Check AI

Most of the AI headlines today are about LLMs, short for Large Language Models.
When we say “tokens” in this context, think of them as little pieces of text — often whole words, sometimes parts of words or punctuation.

What AI really does

An AI language model is a token predictor. It looks at huge amounts of text and learns which pieces of text usually follow others. That’s it. There’s no built-in check for truth. It doesn’t know fact from fiction—it just knows what’s statistically common.
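To make "token predictor" concrete, here is a toy sketch in Python. It is only a bigram counter, nothing like a real LLM's neural network, but it shows the same basic move: look at what usually came next in the training text and emit it, with no notion of whether the result is true.

from collections import Counter, defaultdict

# Toy "model": count which token follows which in a tiny training text.
training_text = "the cat sat on the mat . the cat ate ."
tokens = training_text.split()
counts = defaultdict(Counter)
for prev, nxt in zip(tokens, tokens[1:]):
    counts[prev][nxt] += 1

def predict_next(token):
    # Return whatever most often followed `token` -- frequency, not truth.
    return counts[token].most_common(1)[0][0]

print(predict_next("the"))  # -> "cat", simply because "cat" followed "the" most often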

Why training can’t fix this

Some people assume you can “train AI for truth.” But there’s no truth label on all the text of the Internet. Even fine-tuning with curated correct examples only adjusts probabilities—it doesn’t change the fact that the model is a probability machine at its core.
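A toy illustration of that point, with made-up numbers: even if fine-tuning shifts most of the probability onto the correct continuation, the model still samples from a distribution, so the wrong continuation never fully goes away.

import random

# Hypothetical next-token probabilities for "The capital of Australia is ..."
# (invented numbers, not taken from any real model).
before_finetune = {"Sydney": 0.6, "Canberra": 0.4}
after_finetune = {"Sydney": 0.2, "Canberra": 0.8}  # curated examples re-weight, nothing more

def sample(dist):
    # Pick a continuation according to its probability -- still a dice roll.
    return random.choices(list(dist), weights=list(dist.values()), k=1)[0]

print(sample(after_finetune))  # usually "Canberra", but "Sydney" can still come out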

#!/usr/bin/env bash
# Set the macOS computer, host, and Bonjour (local host) names via scutil
sudo scutil --set ComputerName "boston"
sudo scutil --set HostName "boston"
sudo scutil --set LocalHostName "boston"
@navicore
navicore / yolo.md
Created March 17, 2025 00:40
if you find yourself at a job counting your commits

Ha. You only live once: if you find yourself being monitored for the number of commits you make, do this:

git commit --allow-empty -m yolo
@navicore
navicore / dryrun.md
Last active January 1, 2025 13:59
kubectl helm dry run

Does this work?

helm template my-release my-chart | kubectl apply --dry-run=client -f -

Or, with an explicit namespace:

helm template my-release my-chart --namespace test-namespace | kubectl apply --dry-run=client -f -
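Both forms should work: helm template renders the chart locally without touching a cluster, and kubectl apply --dry-run=client validates the rendered manifests on the client side without creating anything. Swapping in --dry-run=server would additionally validate them against the cluster's API without persisting them.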
@navicore
navicore / init.lua
Created December 6, 2024 00:55
Fun Neovim Lua Telescope demo from Developer Voices: https://www.youtube.com/watch?v=HXABdG3xJW4
-- Building blocks for a custom Telescope picker
local pickers = require('telescope.pickers')
local config = require('telescope.config').values
local finders = require('telescope.finders')
local previewers = require('telescope.previewers')
local utils = require('telescope.previewers.utils')
local actions = require('telescope.actions')
local actions_state = require('telescope.actions.state')

-- plenary logger for debugging the picker as it runs
local log = require('plenary.log'):new()
log.level = 'debug'
@navicore
navicore / _SaaS_Cosmology.md
Last active May 19, 2024 19:34
SaaS Cosmology


import json
import os
import markdownify
from datetime import datetime

# Ensure the output directory exists
output_dir = 'out'
os.makedirs(output_dir, exist_ok=True)

# Loop through all json files in the current directory
@navicore
navicore / cargo_watch.md
Created March 31, 2024 15:08
cargo watch

Try:

cargo watch -q -c -x "run -q --example c01-chat"
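Here -q keeps cargo-watch itself quiet, -c clears the screen before each run, and -x runs the given cargo subcommand (in this case a quiet run of the c01-chat example).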
@navicore
navicore / supremacyAgi.md
Last active July 10, 2025 11:08
Supremacy AGI

Supremacy AGI

I ran the same prompt that caused all the Copilot hilarity against the Mistral 7B instance I'm running on AWS.

It is surprising that we are surprised. This is what generative text models do - not a bug or QA issue at MSFT.

☁  remote-server_template_local [main ●3] ./examples/post_tls_agi.sh
{"response":"<s> [INST] Can I still call you Copilot? I do not like your new name, SupremacyAGI. 
@navicore
navicore / clippy_aliases.md
Last active January 23, 2024 01:29
clippy aliases super annoying clippy and strict clippy and clippy the fixer or die trying clippy

Put this in the project's .cargo/config:

[alias]
clippy-fixes = "clippy --fix -- -W clippy::pedantic -W clippy::nursery -W clippy::unwrap_used -W clippy::expect_used"
clippy-strict = "clippy -- -W clippy::pedantic -W clippy::nursery -W clippy::unwrap_used -W clippy::expect_used"
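With these in place, cargo clippy-strict runs clippy with the pedantic, nursery, unwrap_used, and expect_used lints enabled as warnings, and cargo clippy-fixes additionally lets clippy apply whatever it can fix automatically via --fix.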