@arthrod
arthrod / pydantic-ai_summary.md
Created June 6, 2025 04:01 — forked from erikj27/pydantic-ai_summary.md
Documentation for pydantic-ai

Repository Overview

Directory Structure

pydantic-ai
├── LICENSE
├── Makefile
├── README.md
├── docs
│   ├── _worker.js
import polars as pl
from datasets import load_from_disk

# Load the dataset saved with save_to_disk and grab its underlying Arrow table.
ds = load_from_disk("/home/arthrod/finlandia/form-d-a/edgar_filings/train/")
arrow_table = ds.data.table

# Convert to Polars and switch to the lazy API before filtering.
lazy_df = pl.from_arrow(arrow_table).lazy()
filtered_lazy = lazy_df.filter(pl.col("form").eq("D/A")).limit(1000)
filtered = filtered_lazy.collect()  # materialise the first 1,000 D/A filings

Claude MCP Configuration Solutions

This gist documents two working approaches for Model Context Protocol (MCP) server configurations in Claude Desktop environments.

Solution 1: Shell Script Method

This approach wraps the MCP server launch command in a shell script and works in both Claude Desktop and Cursor.

Configuration in claude_desktop_config.json:
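
A minimal sketch of what such an entry can look like, assuming a hypothetical wrapper script at /path/to/run-mcp.sh and an arbitrary server name (both placeholders, not taken from the gist):

```json
{
  "mcpServers": {
    "my-server": {
      "command": "/bin/sh",
      "args": ["/path/to/run-mcp.sh"]
    }
  }
}
```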

#!/usr/bin/env python3
# /// script
# requires-python = ">=3.12"
# dependencies = [
# "typer",
# "rich",
# ]
# ///
"""
JSONL Processor - A tool to modify JSONL files by adding or renaming keys, and merging multiple files.
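
A minimal sketch of the kind of key-renaming pass the description refers to, assuming typer for the CLI; the command name, options, and file handling below are illustrative rather than the gist's own implementation:

```python
import json
from pathlib import Path

import typer

app = typer.Typer()


@app.command()
def rename_key(path: Path, old: str, new: str, out: Path) -> None:
    """Rename a key in every record of a JSONL file and write the result to OUT."""
    with path.open() as src, out.open("w") as dst:
        for line in src:
            record = json.loads(line)
            if old in record:
                record[new] = record.pop(old)
            dst.write(json.dumps(record) + "\n")


if __name__ == "__main__":
    app()
```
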
# /// script
# requires-python = ">=3.12"
# dependencies = [
# "typer",
# "datasets",
# "loguru",
# "google-genai",
# "rich",
# ]
# ///

LangGraph ReAct Agent – Comprehensive Guide

This document consolidates all core instructions and examples for using and extending LangGraph’s prebuilt ReAct agent. It covers the following topics, along with complete code examples (in triple-backtick blocks) and the names of the required packages; a minimal sketch of the first item appears right after this list:

  • Using the Prebuilt ReAct Agent
  • Adding Thread-Level Memory
  • Adding a Custom System Prompt
  • Returning Structured Output
  • Adding Semantic Search to Your Agent’s Memory
  • Running a Graph Asynchronously
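
As a starting point, here is a minimal sketch of the first item, assuming the langgraph and langchain-openai packages and an OPENAI_API_KEY in the environment; the weather tool and model choice are placeholders, not taken from the guide:

```python
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent


def get_weather(city: str) -> str:
    """Return a canned weather report for a city (stand-in for a real tool)."""
    return f"It is always sunny in {city}."


# Build the prebuilt ReAct agent: the model reasons, the tool node acts.
agent = create_react_agent(ChatOpenAI(model="gpt-4o-mini"), tools=[get_weather])

result = agent.invoke({"messages": [{"role": "user", "content": "What is the weather in Lisbon?"}]})
print(result["messages"][-1].content)
```
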
@arthrod
arthrod / agent.py
Created February 28, 2025 06:08 — forked from davidgasquez/agent.py
Gemini Script Builder
#!/usr/bin/env -S uv run --script
# /// script
# dependencies = [
# "pydantic-ai",
# "python-dotenv",
# "typer",
# "rich",
# ]
# ///
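
A minimal sketch of the kind of Gemini-backed pydantic-ai agent these dependencies suggest; the model string, prompt, and GEMINI_API_KEY environment variable are assumptions, not taken from the gist:

```python
from dotenv import load_dotenv
from pydantic_ai import Agent

load_dotenv()  # pick up GEMINI_API_KEY from a local .env file

# Model string format: "<provider>:<model name>"; gemini-1.5-flash is only an example.
agent = Agent("google-gla:gemini-1.5-flash")

result = agent.run_sync("Write a one-line Python script that prints today's date.")
print(result.output)  # final response; the attribute name may differ in older pydantic-ai versions
```
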
@arthrod
arthrod / config.yaml.j2
Created February 24, 2025 01:01 — forked from wsargent/config.yaml.j2
LiteLLM install with ansible playbook
# https://docs.litellm.ai/docs/proxy/configs
# https://docs.litellm.ai/docs/proxy/quickstart
model_list:
  # https://docs.lambdalabs.com/public-cloud/lambda-inference-api/#listing-models
  - model_name: hermes3-70b
    litellm_params:
      model: openai/hermes3-70b
      api_key: "os.environ/OPENAI_API_KEY"
      api_base: "os.environ/OPENAI_API_BASE"
@arthrod
arthrod / llm_model_client.py
Created February 21, 2025 23:41
Elegant OpenRouter module for API calls (async, sync, and logging)
"""
Module for interacting with the model via the OpenRouter API.
"""
import toml
import asyncio
from loguru import logger
from openai import OpenAI
import requests
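
A minimal sketch of the OpenRouter call pattern such a module builds on: OpenRouter exposes an OpenAI-compatible endpoint, so the stock OpenAI client works once base_url points at it. The model slug is only an example, and OPENROUTER_API_KEY is assumed to be set in the environment:

```python
import os

from loguru import logger
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

response = client.chat.completions.create(
    model="meta-llama/llama-3.1-70b-instruct",  # example slug, swap for any OpenRouter model
    messages=[{"role": "user", "content": "Summarize the OpenRouter API in one sentence."}],
)
logger.info(response.choices[0].message.content)
```
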
@arthrod
arthrod / hu_upload.py
Last active March 17, 2025 07:39
Easily upload all JSONL files in a folder to huggingface_hub, fully typed and error-free
# /// script
# requires-python = ">=3.12"
# dependencies = [
# "typer",
# "datasets",
# "loguru",
# "huggingface_hub",
# "google-genai",
# "aiofiles",
# "rich",