Luca Cotti (LucaCtt) — GitHub Gists
import os

from langchain_core.language_models import BaseChatModel


def bedrock_llm(model: str, temperature: float) -> BaseChatModel:
    import boto3  # type: ignore[attr-defined]
    from botocore.config import Config  # type: ignore[attr-defined]
    from langchain_aws import ChatBedrockConverse  # type: ignore[import]

    # STS client built from explicit env credentials; the long read
    # timeout accommodates slow responses.
    sts_client = boto3.client(
        "sts",
        config=Config(read_timeout=300),
        aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
        aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
    )
    # The gist is truncated here; a minimal completion (an assumption,
    # not the author's original code) is to return the Converse-API model:
    return ChatBedrockConverse(model=model, temperature=temperature)
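The function above reads AWS credentials straight from `os.environ`, which raises a bare `KeyError` when a variable is unset. A minimal sketch of failing fast with a clearer message instead (the helper name `require_env` is hypothetical, not from the gist):

```python
import os


def require_env(name: str) -> str:
    """Return an environment variable's value, or raise a descriptive error."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value


# Demo with a stubbed value so the lookup succeeds.
os.environ["AWS_ACCESS_KEY_ID"] = "AKIAEXAMPLE"
print(require_env("AWS_ACCESS_KEY_ID"))
```

This keeps the credential lookup in one place, so a missing variable surfaces as a single actionable error rather than a traceback deep inside `boto3.client`.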
# Start from the official Ollama image
FROM ollama/ollama:latest

# Use root to install the models
USER root

# Preload the models (stored in /root/.ollama by default).
# `ollama pull` needs a running server, so start one in the background
# for the duration of the build step.
RUN ollama serve & sleep 5 && \
    ollama pull snowflake-arctic-embed:110m && \
    ollama pull qwen2.5:32b && \
    ollama pull qwen2.5-coder:32b
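One way to run the resulting image with the models already baked in — a sketch, not part of the gist; the image tag `ollama-preloaded` and volume name are assumptions:

```yaml
services:
  ollama:
    build: .                 # builds the Dockerfile above
    image: ollama-preloaded
    ports:
      - "11434:11434"        # default Ollama API port
    volumes:
      - ollama-data:/root/.ollama

volumes:
  ollama-data:               # named volume; seeded from the image's
                             # /root/.ollama content on first use
```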
LucaCtt / bootstrap.sh
Last active December 27, 2019 21:28
This is a WIP script for automating the installation of my customized version of Arch Linux.
#!/bin/bash
#
# Author: Luca Cotti <lucacotti@outlook.com>
REPO='https://github.com/LucaCtt/dotfiles'
USERNAME='Luca Cotti'
EMAIL='lucacotti@outlook.com'
PKG_LIST="$HOME/pkglist.txt"
# NOTE: `trap INT` alone is a no-op (bash treats "INT" as a command
# string with no signals); presumably `trap - INT`, which resets SIGINT
# to its default handler, was intended.
trap - INT