Vikram vikramsoni2

  • Baxter International
  • Madrid
@vikramsoni2
vikramsoni2 / uppyStore.js
Created May 12, 2024 01:16
Pinia store for Uppy.js to keep track of file upload state
import { defineStore } from 'pinia';

export const useUppyStore = defineStore({
  id: 'uppyStore',
  state: () => ({
    state: {}
  }),
  actions: {
    // Assumed completion (the preview cuts off here): merge Uppy state patches into the store.
    setState(patch) {
      this.state = { ...this.state, ...patch };
    },
  },
});
@vikramsoni2
vikramsoni2 / logging.py
Created May 8, 2024 16:39
Asynchronous JSON logging in Python using QueueHandler and dictConfig
from logging.config import ConvertingList, ConvertingDict, valid_ident
from logging.handlers import QueueHandler, QueueListener
from queue import Queue
import atexit
import logging


def _resolve_handlers(l):
    if not isinstance(l, ConvertingList):
        return l
    # Assumed completion: indexing the ConvertingList evaluates the handler references set up by dictConfig.
    return [l[i] for i in range(len(l))]
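The preview ends before the handler class itself. The usual pattern for this kind of module (a sketch under that assumption, not necessarily this gist's exact code) is a QueueHandler subclass that pushes records onto a queue and runs the real handlers on a background QueueListener thread:

class QueueListenerHandler(QueueHandler):
    # Hypothetical sketch: forward records to the configured handlers on a listener thread.
    def __init__(self, handlers, respect_handler_level=False, auto_run=True, queue=None):
        super().__init__(queue if queue is not None else Queue(-1))
        handlers = _resolve_handlers(handlers)
        self._listener = QueueListener(
            self.queue, *handlers, respect_handler_level=respect_handler_level
        )
        if auto_run:
            self._listener.start()
            atexit.register(self._listener.stop)  # flush queued records at interpreter shutdown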
@vikramsoni2
vikramsoni2 / CircleProgress.vue
Created January 31, 2024 00:14
Circular progress bar in Vue.js
<script setup>
const props = defineProps({
  progress: {
    type: Number,
    default: 0,
  },
})
</script>

<template>
  <svg viewBox="-6.25 -6.25 62.5 62.5" version="1.1" xmlns="http://www.w3.org/2000/svg" style="transform:rotate(-90deg)">
    <!-- Assumed completion (the preview cuts off here): a track circle plus a progress arc. -->
    <!-- Circumference of the r=20 circle is ~125.66; the dash offset shrinks as progress grows. -->
    <circle cx="25" cy="25" r="20" fill="none" stroke="#e6e6e6" stroke-width="6" />
    <circle cx="25" cy="25" r="20" fill="none" stroke="#41b883" stroke-width="6" stroke-linecap="round"
      stroke-dasharray="125.66" :stroke-dashoffset="125.66 * (1 - progress / 100)" />
  </svg>
</template>
@vikramsoni2
vikramsoni2 / rag_fusion.py
Created January 28, 2024 22:23
reciprocal_rank_fusion with RAG
import os
import openai
import random

# Initialize the OpenAI API key (read from the environment)
openai.api_key = os.getenv("OPENAI_API_KEY")
if openai.api_key is None:
    raise Exception("No OpenAI API key found. Please set it as an environment variable or in main.py")

# Function to generate queries using OpenAI's ChatGPT
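The preview stops before the fusion step itself. A minimal sketch of reciprocal rank fusion (an illustration, not necessarily this gist's exact code), assuming each retriever returns a {document: score} dict and using the conventional k=60 constant:

def reciprocal_rank_fusion(search_results_dict, k=60):
    # In each ranked list, the document at rank r contributes 1 / (k + r) to its fused score.
    fused_scores = {}
    for query, doc_scores in search_results_dict.items():
        ranked = sorted(doc_scores.items(), key=lambda item: item[1], reverse=True)
        for rank, (doc, _score) in enumerate(ranked):
            fused_scores[doc] = fused_scores.get(doc, 0.0) + 1.0 / (k + rank)
    # Highest fused score first
    return dict(sorted(fused_scores.items(), key=lambda item: item[1], reverse=True))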
#pip install git+https://github.com/huggingface/transformers.git
import datetime
import sys
from transformers import pipeline
from transformers.pipelines.audio_utils import ffmpeg_microphone_live
pipe = pipeline("automatic-speech-recognition", model="openai/whisper-base", device=0)
sampling_rate = pipe.feature_extractor.sampling_rate
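The snippet above (whose gist header is missing from this listing) sets up a Whisper ASR pipeline fed from the microphone. A sketch of the usual streaming loop from the Transformers documentation, continuing from the pipe and sampling_rate defined above (assumed, not necessarily the gist's exact code):

chunk_length_s = 5.0  # length of each transcription window in seconds
mic = ffmpeg_microphone_live(
    sampling_rate=sampling_rate,
    chunk_length_s=chunk_length_s,
    stream_chunk_s=1.0,  # emit interim results every second
)
print("Listening...")
for item in pipe(mic, generate_kwargs={"max_new_tokens": 128}):
    sys.stdout.write("\033[K")      # clear the current console line
    print(item["text"], end="\r")
    if not item["partial"][0]:      # the chunk is final; start a new line
        print()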
@vikramsoni2
vikramsoni2 / test.json
Created January 6, 2024 13:30
test.json
"BAX": {
"Slide9Unwanted_0": "The topics of interest about the company's outlook and prospects discussed in the question-and-answer session include:",
"Slide9Unwanted_1": "Some GPO contracts are coming up for renewal, with negotiations to reflect current costs that have increased in recent years.",
"Slide9Unwanted_2": "Temporary price increases during recent periods will be rolling off.",
"Slide9Unwanted_3": "Sequential improvement in bottom line will continue into 2024.",
"Slide9Unwanted_4": "Expectations for 2024 growth are around market expectations, with mid to long term expectations of 4% to 5%.",
"Slide9Unwanted_5": "New product launches from 2023 into 2024 in the pharmaceutical business.",
"Slide9Unwanted_6": "Significant progress in the proposed spin-off of the Kidney Care segment into its own company, Vantive.",
"Slide9Unwanted_7": "Significant operational efficiencies have been put in place, primarily in logistics and transportation an
@vikramsoni2
vikramsoni2 / memory.py
Created December 13, 2023 21:42
Memory module for LLMs to maintain conversation history during inference
import numpy as np
from pydantic import BaseModel
from utils.config import Config
from utils.models import ChatMessage
from tokenizers import Tokenizer
from typing import Dict, List, Optional, Union, Literal
DEFAULT_TOKEN_LIMIT_RATIO = 0.75
DEFAULT_TOKENIZER = Tokenizer.from_pretrained(Config.LLAMA_MODEL)
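The preview ends at the module constants. A minimal sketch of the idea (hypothetical names, not the gist's actual class): keep appending messages and trim the oldest turns once the history exceeds a fraction of the model's context window:

from typing import Callable, List, Tuple

class SlidingWindowMemory:
    # Hypothetical sketch: count_tokens could be e.g. lambda s: len(DEFAULT_TOKENIZER.encode(s).ids)
    def __init__(self, count_tokens: Callable[[str], int], context_window: int,
                 limit_ratio: float = DEFAULT_TOKEN_LIMIT_RATIO):
        self.count_tokens = count_tokens
        self.budget = int(context_window * limit_ratio)
        self.messages: List[Tuple[str, str]] = []  # (role, content) pairs

    def add(self, role: str, content: str) -> None:
        self.messages.append((role, content))
        # Drop the oldest turns until the remaining history fits within the token budget.
        while len(self.messages) > 1 and self._total_tokens() > self.budget:
            self.messages.pop(0)

    def _total_tokens(self) -> int:
        return sum(self.count_tokens(content) for _, content in self.messages)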
@vikramsoni2
vikramsoni2 / main.py
Created October 27, 2023 19:10 — forked from jvelezmagic/main.py
QA Chatbot streaming with source documents example using FastAPI, LangChain Expression Language, OpenAI, and Chroma.
"""QA Chatbot streaming using FastAPI, LangChain Expression Language , OpenAI, and Chroma.
Features
--------
- Persistent Chat Memory:
Stores chat history in a local file.
- Persistent Vector Store:
Stores document embeddings in a local vector store.
- Standalone Question Generation:
Rephrases follow-up questions to standalone questions in their original language.
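The preview ends inside the module docstring. A minimal sketch of the standalone-question-generation step in LangChain Expression Language (an illustration using current LangChain packages, not necessarily the forked gist's exact code):

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

condense_prompt = ChatPromptTemplate.from_template(
    "Given the chat history and a follow-up question, rephrase the follow-up "
    "into a standalone question in its original language.\n\n"
    "Chat history:\n{chat_history}\n\nFollow-up question: {question}\n"
    "Standalone question:"
)
# LCEL pipes the prompt into the chat model and parses the reply into a plain string.
condense_question_chain = condense_prompt | ChatOpenAI(temperature=0) | StrOutputParser()

# Usage: standalone = condense_question_chain.invoke({"chat_history": history, "question": question})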
@vikramsoni2
vikramsoni2 / wg-easy.service
Created September 17, 2023 21:41 — forked from WeslieDE/wg-easy.service
systemd service file for wg-easy
[Unit]
Description=WG-Easy Service
After=network.target

[Service]
ExecStart=/usr/bin/node /app/server.js
WorkingDirectory=/app
Restart=always
User=root
Group=nogroup

apt-get update && apt-get -y upgrade
apt-get install -y wireguard-tools dumb-init git

# Install Node.js
curl -sL https://deb.nodesource.com/setup_16.x | bash -
apt-get update && apt-get -y upgrade
apt -y install nodejs

# Install WG-Easy
git clone https://github.com/WeeJeWel/wg-easy WGEASY