@cassc
cassc / non-trivial-tx-monitor.py
Last active August 23, 2024 05:30
script to monitor complex transactions
import csv
import argparse
from web3 import Web3
import os
from datetime import datetime
import time
WEB3_PROVIDER_URI = 'https://eth-mainnet.g.alchemy.com/v2/token'
@cassc
cassc / cargo.clippy.update.md
Created July 26, 2024 02:25
Update cargo clippy

Updating cargo clippy is straightforward as it is part of the Rust toolchain. Here are the steps to ensure you have the latest version of cargo clippy:

Steps to Update Cargo Clippy

  1. Update Rustup: Ensure that rustup, the Rust toolchain installer, is up to date. Open your terminal and run:

    rustup self update
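The preview truncates after step 1. As a hedged sketch of the remaining steps (assuming the standard rustup workflow; these commands are not from the original gist):

  2. Update the toolchain: clippy ships as a rustup component, so updating the active toolchain also updates clippy:

    rustup update

  3. Verify: confirm the installed clippy version:

    cargo clippy --version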
@cassc
cassc / similarity-chatgpt.py
Created July 22, 2024 08:28
Get embeddings and calculate similarity
from openai import OpenAI
from sklearn.metrics.pairwise import cosine_similarity
import numpy as np
client = OpenAI()
def cal_embedding(text, model="text-embedding-ada-002"):
    text = text.replace("\n", " ")
    embedding = client.embeddings.create(input=[text], model=model).data[0].embedding
    return np.reshape(embedding, (1, -1))
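The preview cuts off here; a hedged usage sketch (not part of the gist) showing how cal_embedding pairs with the imported cosine_similarity:

# Hypothetical usage; assumes OPENAI_API_KEY is set in the environment
emb_a = cal_embedding("How do I update cargo clippy?")
emb_b = cal_embedding("Updating clippy via the Rust toolchain")
print(cosine_similarity(emb_a, emb_b)[0][0])  # closer to 1.0 means more similar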
@cassc
cassc / fine-tune.model.md
Created July 18, 2024 09:54
Fine-tune a model with unsupervised learning

Fine-tuning a pre-trained model like bigcode/starencoder on a large collection of Solidity source code without any labeling can be done through unsupervised learning, specifically using masked language modeling (MLM). Here’s a step-by-step guide to fine-tuning the model for your specific needs:

Steps to Fine-Tune the Model

1. Prepare the Dataset

Ensure you have a large collection of Solidity source code files. Combine these files into a single or multiple text files.

Example: Combining Solidity Files into a Text File

cat *.sol > all_solidity_code.txt
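The preview truncates after step 1. As a hedged sketch of the MLM fine-tuning step the gist describes (the model name bigcode/starencoder comes from the text above; the hyperparameters, output path, and pad-token handling are illustrative assumptions), using the Hugging Face transformers and datasets APIs:

# Hedged MLM fine-tuning sketch; hyperparameters are illustrative assumptions
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bigcode/starencoder")
model = AutoModelForMaskedLM.from_pretrained("bigcode/starencoder")
if tokenizer.pad_token is None:  # assumption: add a pad token if the checkpoint lacks one
    tokenizer.add_special_tokens({"pad_token": "[PAD]"})
    model.resize_token_embeddings(len(tokenizer))

# Load the combined Solidity corpus produced above as a line-based text dataset
dataset = load_dataset("text", data_files={"train": "all_solidity_code.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# The collator randomly masks 15% of tokens; predicting them is the MLM objective
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15)

args = TrainingArguments(output_dir="starencoder-solidity", per_device_train_batch_size=8, num_train_epochs=1)
trainer = Trainer(model=model, args=args, train_dataset=tokenized["train"], data_collator=collator)
trainer.train()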
@cassc
cassc / zero.shot.example.md
Created July 18, 2024 07:03
zero-shot example

Zero-shot examples refer to scenarios where a model performs a task it was never explicitly trained on. Instead of relying on task-specific training data, the model leverages what it learned during pre-training on other data to make predictions or inferences about the new task. This ability is particularly valuable for handling tasks or categories that were never seen during training.

Zero-Shot Examples in Various Domains

1. Text Classification:

A model trained on general text data can classify text into categories it has never seen before by understanding the general concept of the categories.

  • Example:
    • Input Text: "The weather today is sunny with a chance of showers in the evening."
    • New Categories: "Weather Report" vs. "Financial News"
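A minimal runnable sketch of this text-classification example (not part of the original gist; the NLI model choice is an assumption), using the Hugging Face zero-shot-classification pipeline:

# Hedged sketch: classify against labels the model never trained on
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
result = classifier(
    "The weather today is sunny with a chance of showers in the evening.",
    candidate_labels=["Weather Report", "Financial News"],
)
print(result["labels"][0])  # expected to be "Weather Report"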
@cassc
cassc / langchain-no-memory.py
Created June 13, 2024 07:27
Langchain ChatGPT no memory demo
from typing import List
from pydantic import BaseModel, Field
from langchain_core.globals import set_debug
from langchain_core.output_parsers.json import JsonOutputParser
from langchain_core.prompts.prompt import PromptTemplate
from langchain_openai import ChatOpenAI
from langchain_community.callbacks import get_openai_callback
from langchain.chains import ConversationChain
from langchain_community.llms import OpenAI
@cassc
cassc / langchain-converstation-with-memory.py
Created June 13, 2024 07:15
Langchain ChatGPT conversation with memory
from langchain_core.globals import set_debug
from langchain_openai import ChatOpenAI
from langchain_community.callbacks import get_openai_callback
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory, ConversationSummaryBufferMemory
set_debug(False)
model_name = 'gpt-4o'
@cassc
cassc / tokio.md
Last active June 3, 2024 03:02
tokio blocking io

tokio vs threadpool

Tokio is an asynchronous runtime for the Rust programming language, and it is more than just thread pools. Here's a brief overview of how Tokio is implemented:

  1. Event Loop: Tokio uses an event-driven architecture, leveraging an event loop to manage tasks and I/O events. The core of Tokio's runtime is the mio library, which provides a low-level event loop backed by OS-specific mechanisms like epoll (Linux), kqueue (BSD), and IOCP (Windows).

  2. Futures and Tasks: Tokio's concurrency model is based on Rust's Future trait. Tasks are units of work that implement the Future trait, and they are polled by the runtime to make progress. When a task is not ready to make progress (e.g., waiting for I/O), it yields control back to the runtime.

  3. Thread Pool:

@cassc
cassc / selfdestruct-gas-and-stack.json
Created May 16, 2024 07:47
selfdestruct gas and stack value mismatches
{
  "00000014-mixed-4": {
    "env": {
      "currentCoinbase": "b94f5374fce5edbc8e2a8697c15331677e6ebf0b",
      "currentDifficulty": "0x200000",
      "currentRandom": "0x0000000000000000000000000000000000000000000000000000000000200000",
      "currentGasLimit": "0x26e1f476fe1e22",
      "currentNumber": "0x1",
      "currentTimestamp": "0x3e8",
      "previousHash": "0x044852b2a670ade5407e78fb2863c51de9fcb96542a07186fe3aeda6bb8a116d",
@cassc
cassc / sstore-gas.json
Created May 16, 2024 07:41
sstore gas
{
  "00000007-mixed-4": {
    "env": {
      "currentCoinbase": "b94f5374fce5edbc8e2a8697c15331677e6ebf0b",
      "currentDifficulty": "0x200000",
      "currentRandom": "0x0000000000000000000000000000000000000000000000000000000000200000",
      "currentGasLimit": "0x26e1f476fe1e22",
      "currentNumber": "0x1",
      "currentTimestamp": "0x3e8",
      "previousHash": "0x044852b2a670ade5407e78fb2863c51de9fcb96542a07186fe3aeda6bb8a116d",