GitHub Gists of Haris Rashid (harisrab)
🐝 Bee work.
// ==UserScript==
// @name intercept_attributed_orders
// @namespace http://tampermonkey.net/
// @version 2024-01-06
// @description Intercept attributed orders on the Triple Whale attribution page
// @author You
// @match https://app.triplewhale.com/attribution/*
// @icon https://www.google.com/s2/favicons?sz=64&domain=triplewhale.com
// @grant none
// ==/UserScript==
// ==UserScript==
// @name get-campaign-name
// @namespace http://tampermonkey.net/
// @version 0.1
// @description Save content of clicked table row into the global scope
// @author You
// @match https://app.triplewhale.com/attribution/*
// @icon https://www.google.com/s2/favicons?sz=64&domain=triplewhale.com
// @grant none
// ==/UserScript==
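The gist preview shows only the userscript headers. As a sketch of what the second script's description ("save content of clicked table row into the global scope") implies, the body below is an assumption: a pure helper that turns a row's cell texts into a record, plus browser-only wiring that saves the clicked row to a global (the names `rowToRecord` and `window.clickedCampaignRow` are hypothetical, not from the gist).

```javascript
// Pure helper: turn an array of cell strings into the saved record.
// Assumes the first cell holds the campaign name (an assumption, not from the gist).
function rowToRecord(cells) {
  return {
    campaignName: (cells[0] || "").trim(),
    cells: cells.map((c) => c.trim()),
  };
}

// Browser-only wiring, guarded so the helper stays usable outside a page.
if (typeof document !== "undefined") {
  document.addEventListener("click", (event) => {
    const row = event.target.closest("tr");
    if (!row) return;
    const cells = Array.from(row.querySelectorAll("td"), (td) => td.textContent);
    // Saved into the global scope, as the gist description says; the name is assumed.
    window.clickedCampaignRow = rowToRecord(cells);
  });
}
```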
harisrab / OFAgentStreamingCallback.py
Created July 13, 2023 15:28
Langchain OpenAIFunctions Streaming Callback
"""
Remember this callback just shows how to perform streaming. There may appear some useless code in here, but in our implementation, its being passed to other callbacks.
This callbacks is basically the one that performs the talking to frontend. Any new object in the event queue is sent to the frontend.
Action from agents, tools, datasources, etc, are streamed through this callback. Therefore, you may see other variables that may not be necessary for you. So keep an eye out for those.
"""
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

class OFAgentStreamingCallback(StreamingStdOutCallbackHandler):
    """Callback handler for streaming in agents."""
harisrab / serpapi_citation.txt
Last active July 9, 2023 04:55
CustomSerpAPIWrapper with Citation
import json
import os
import sys
from typing import Any, Dict, Optional, Tuple

import aiohttp
from langchain.utils import get_from_dict_or_env
from pydantic import BaseModel, Extra, Field, root_validator

from CustomCallbacks import FinalStreamingStdOutCallbackHandler
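Only the imports of `CustomSerpAPIWrapper` survive in the preview. As a sketch of the "with citation" idea, the function below (its name and exact field handling are assumptions) shows how a response parser can return the source link alongside the snippet so the agent can cite it, using the field names SerpAPI responses actually carry (`answer_box`, `organic_results`, `snippet`, `link`).

```python
from typing import Tuple

def process_response(res: dict) -> Tuple[str, str]:
    """Extract (snippet, citation_url) from a SerpAPI-style result payload."""
    # Prefer the answer box when present, falling back to the top organic result.
    if "answer_box" in res and "snippet" in res["answer_box"]:
        box = res["answer_box"]
        return box["snippet"], box.get("link", "")
    organic = res.get("organic_results", [])
    if organic:
        top = organic[0]
        return top.get("snippet", ""), top.get("link", "")
    return "No good search result found", ""
```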
harisrab / agent_streaming_callback.py
Created July 1, 2023 08:05
Callback for Streaming Langchain Agent and stopping the stream when Last LLM call finishes
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

DEFAULT_ANSWER_PREFIX_TOKENS = ['AI', ':']

class FinalStreamingStdOutCallbackHandler(StreamingStdOutCallbackHandler):
    """Callback handler for streaming in agents.
    Only works with agents using LLMs that support streaming.
    Only the final output of the agent will be streamed.
    """
    def append_to_last_tokens(self, token: str) -> None:
        # Keep a sliding window of recent tokens, as long as the answer prefix.
        self.last_tokens.append(token)
        if len(self.last_tokens) > len(self.answer_prefix_tokens):
            self.last_tokens.pop(0)
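The preview ends at `append_to_last_tokens`, but the gist title describes the full mechanism: buffer tokens until they spell out the answer prefix (here `['AI', ':']`), then stream everything after it. A standalone sketch of that logic follows; the function names are assumptions, not the gist's.

```python
from typing import List

def check_if_answer_reached(last_tokens: List[str],
                            answer_prefix_tokens: List[str]) -> bool:
    """True once the sliding window of recent tokens equals the answer prefix."""
    return last_tokens[-len(answer_prefix_tokens):] == answer_prefix_tokens

def stream_final_only(tokens: List[str],
                      answer_prefix_tokens: List[str]) -> List[str]:
    """Replay a token stream, emitting only the tokens after the answer prefix."""
    last_tokens: List[str] = []
    emitted: List[str] = []
    answer_reached = False
    for token in tokens:
        if answer_reached:
            emitted.append(token)
            continue
        # Maintain a window as long as the prefix, mirroring append_to_last_tokens.
        last_tokens.append(token)
        if len(last_tokens) > len(answer_prefix_tokens):
            last_tokens.pop(0)
        if check_if_answer_reached(last_tokens, answer_prefix_tokens):
            answer_reached = True
    return emitted
```

Note that the prefix tokens themselves are never emitted: the flag flips on the token that completes the prefix, and streaming starts with the next one.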