Justin Yan (justinyanme)

justinyanme / README.md
Created November 11, 2012 12:45 — forked from agnoster/README.md
My ZSH Theme

agnoster.zsh-theme

A ZSH theme optimized for people who use:

  • Solarized
  • Git
  • Unicode-compatible fonts and terminals (I use iTerm2 + Menlo)

For Mac users, I highly recommend iTerm 2 + Solarized Dark

### Keybase proof
I hereby claim:
* I am justinyanme on github.
* I am mapleshadow (https://keybase.io/mapleshadow) on keybase.
* I have a public key ASCIqso9qKZLJCfWQGemIdIv3wLqMtKQKDSsB41qzW1Udwo
To claim this, I am signing this object:
on alfred_script(q)
    if ((q as string) is equal to "po") then
        tell application "JustFocus"
            launch
            start pomodoro
        end tell
    else if ((q as string) is equal to "sb") then
        tell application "JustFocus"
            launch
            short break
        end tell
    end if
end alfred_script
package main

import (
	"fmt"
	"reflect"
	"strings"
)

type TestData struct {
	ID int `tag1:"Tag1" tag2:"Tag2"`
}
func main() {
	// Read the ID field's tags back via reflection and print them.
	f, _ := reflect.TypeOf(TestData{}).FieldByName("ID")
	fmt.Println(strings.Join([]string{f.Tag.Get("tag1"), f.Tag.Get("tag2")}, ", "))
}
justinyanme / index.js
Last active September 4, 2022 10:07
AppleMusic.info List Analysis
// Script I used for this newsletter issue:
// 1. Save the HTML page from https://www.applemusic.info/
// 2. Run `npm install`
// 3. Run `node index.js`
// 4. Check `out.json`
const fs = require("fs");
const cheerio = require("cheerio");
const _ = require("lodash");
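
For reference, a rough sketch of what the analysis step of such a script could look like with cheerio and lodash; the `applemusic.html` filename, the `li a` selector, and the "Artist - Album" text format are assumptions, since the saved page's actual structure (and the rest of the script) aren't shown here.

```typescript
// Hypothetical sketch, not the original script: tally how often each artist
// appears in the saved list page and write the counts to out.json.
import * as fs from "fs";
import * as cheerio from "cheerio";
import _ from "lodash";

const html = fs.readFileSync("applemusic.html", "utf8"); // assumed filename
const $ = cheerio.load(html);

// Assumed structure: each entry is a link whose text reads "Artist - Album".
const entries: string[] = [];
$("li a").each((_i, el) => {
  entries.push($(el).text().trim());
});

// Count entries per artist and write the result where step 4 expects it.
const counts = _.countBy(entries, (e) => e.split("-")[0].trim());
fs.writeFileSync("out.json", JSON.stringify(counts, null, 2));
```
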
justinyanme / IntrospectWindowView.swift
Last active November 24, 2023 05:24
SwiftUI Notes 03
import SwiftUI
import AppKit
extension View {
public func inject<SomeView>(_ view: SomeView) -> some View where SomeView: View {
overlay(view.frame(width: 0, height: 0))
}
func getWindow(window: @escaping (_ window: NSWindow?) -> Void) -> some View {
# brew install figlet cowsay lolcat coreutils fortune
# Banner: render the name in the Star Wars figlet font with animated rainbow colors.
echo "Justin" | figlet -f starwars | lolcat --animate -s 1000
# Quote: pipe a short fortune through a randomly chosen cow file.
fortune -s | cowsay -f $(cd /opt/homebrew/Cellar/cowsay/3.04_1/share/cows && ls *.cow | shuf -n1) | lolcat --animate -s 1000

Plan: Add OpenAI-Compatible Chat Completions Endpoint to just-gemini

Context

Goal: Make just-gemini usable as an LLM backend by any OpenAI-compatible client (including justbot's future subagent system).

Approach: Add standard OpenAI-compatible POST /v1/chat/completions (streaming SSE) and GET /v1/models endpoints on top of the existing session-based architecture. This is the most widely supported LLM API format — any client that speaks OpenAI can use just-gemini without custom integration code.
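
For illustration, a rough sketch of what an OpenAI-compatible client call against the planned endpoint could look like; the base URL, port, and model name are placeholders, not values defined by just-gemini.

```typescript
// Hypothetical client call against the planned streaming endpoint.
const res = await fetch("http://localhost:8080/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "gemini-pro", // placeholder model name
    messages: [{ role: "user", content: "Hello!" }],
    stream: true,
  }),
});

// Consume the SSE stream: each event line looks like `data: {...}` and the
// stream ends with `data: [DONE]`. (Chunk-boundary handling is omitted here.)
const reader = res.body!.getReader();
const decoder = new TextDecoder();
for (;;) {
  const { value, done } = await reader.read();
  if (done) break;
  for (const line of decoder.decode(value, { stream: true }).split("\n")) {
    if (!line.startsWith("data: ") || line === "data: [DONE]") continue;
    const delta = JSON.parse(line.slice(6)).choices[0]?.delta?.content;
    if (delta) process.stdout.write(delta);
  }
}
```
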

Target OpenAI format:

  • POST /v1/chat/completions with { model, messages, stream: true, max_tokens? }