Steven Kolawole (stevenkolawole)

@stevenkolawole
stevenkolawole / output.log
Created December 30, 2023 08:04
MistralAttention's CUDA Error
python my_main.py --model mistralai/Mistral-7B-v0.1 --save out --masks_per_iter 100
torch 1.10.1
transformers 4.36.1
accelerate 0.25.0
# of gpus: 1
Namespace(model='mistralai/Mistral-7B-v0.1', seed=0, nsamples=14, sparsity_ratio=0.5, prune_frac=0.1, bsz=14, mlp_attn_ratio=1.0, prune_method='magnitude', cache_dir='llm_weights', use_variant=False, save='out', save_model=None, masks_per_iter=100, tol=0.02, sm_reg_weight='[1e2, 1e-4, 0]', sm_lr_factor='[100, 10, 1, 0.1]', sm_reg_type='l1', sm_lin_model_type='global', sm_bsz='[32, 64, 128]', sm_nepochs=50, wandb_project_name='Prune-No-Backward')
wandb: Currently logged in as: skolawol (shapelyprune). Use `wandb login --relogin` to force relogin
wandb: Tracking run with wandb version 0.16.1
2023-02-09 11:38:54,186 INFO:Run name: EUR-Lex_bert_20230209113854
2023-02-09 11:38:54,800 INFO:Created a temporary directory at /tmp/tmp5_bwr35i
2023-02-09 11:38:54,800 INFO:Writing /tmp/tmp5_bwr35i/_remote_module_non_scriptable.py
2023-02-09 11:38:55,241 INFO:Global seed set to 1337
2023-02-09 11:38:55,246 INFO:Using device: cuda
2023-02-09 11:38:56,089 INFO:Load data from data/EUR-Lex/train.txt.
2023-02-09 11:38:56,930 INFO:Load data from data/EUR-Lex/test.txt.
2023-02-09 11:38:57,126 INFO:Finish loading dataset (train: 12359 / val: 3090 / test: 3865)
2023-02-09 11:38:57,128 INFO:Initialize model from scratch.
2023-02-09 11:38:57,136 INFO:Read 3956 labels.
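
For reference, a minimal sketch of the kind of argparse setup that would produce the Namespace line printed near the top of output.log; the handful of flags and defaults below are inferred from that dump, not taken from my_main.py itself.

import argparse

# Hypothetical reconstruction of a few of the flags visible in the Namespace dump.
parser = argparse.ArgumentParser()
parser.add_argument('--model', type=str, required=True)           # e.g. mistralai/Mistral-7B-v0.1
parser.add_argument('--seed', type=int, default=0)
parser.add_argument('--sparsity_ratio', type=float, default=0.5)
parser.add_argument('--prune_frac', type=float, default=0.1)
parser.add_argument('--prune_method', type=str, default='magnitude')
parser.add_argument('--masks_per_iter', type=int, default=100)
parser.add_argument('--save', type=str, default=None)
args = parser.parse_args()
print(args)  # prints a Namespace(...) line similar to the one in the log
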
@stevenkolawole
stevenkolawole / FedBE_error.txt
Last active September 19, 2022 13:49
`run.sh` error log
Traceback (most recent call last):
  File "/usr/lib/python3.7/urllib/request.py", line 1350, in do_open
    encode_chunked=req.has_header('Transfer-encoding'))
  File "/usr/lib/python3.7/http/client.py", line 1281, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.7/http/client.py", line 1327, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.7/http/client.py", line 1276, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.7/http/client.py", line 1036, in _send_output
name: Build Docker image and deploy to Heroku
on:
  # Trigger the workflow on push,
  # but only for the main branch
  push:
    branches:
      - main
jobs:
  build:

# Grab the Python slim image
FROM python:3.8-slim
# Copy the requirements file (python and pip already ship with the base image)
COPY requirements.txt ./requirements.txt
# Install dependencies
RUN pip install --no-cache-dir -r requirements.txt
# Add our code

# Grab the Python base image
FROM python:3.8
# Copy the requirements file
COPY requirements.txt ./requirements.txt
# Install dependencies
RUN pip install -r requirements.txt
# Add our code

import numpy as np
import plotly.express as px
import streamlit as st
st.title('Distribution Tester')
st.write('Pick a distribution from the list and we will draw \
a line chart of a random sample from that distribution')
keys = ['Normal', 'Uniform']
dist_key = st.selectbox('Which distribution do you want to plot?', keys)
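
The preview stops at the selectbox. A minimal sketch of how the rest of the app might look, assuming a NumPy random sample and a Plotly line chart; the sample size and distribution parameters are illustrative, not taken from the gist.

import numpy as np
import plotly.express as px
import streamlit as st

st.title('Distribution Tester')
keys = ['Normal', 'Uniform']
dist_key = st.selectbox('Which distribution do you want to plot?', keys)

# Draw a random sample for the chosen distribution (size and parameters are arbitrary).
rng = np.random.default_rng()
if dist_key == 'Normal':
    sample = rng.normal(loc=0.0, scale=1.0, size=500)
else:
    sample = rng.uniform(low=-1.0, high=1.0, size=500)

# Render the sampled values as a line chart inside the Streamlit page.
fig = px.line(y=sample, title=f'Random sample from the {dist_key} distribution')
st.plotly_chart(fig)
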
import pyttsx3

# Initialise the default text-to-speech engine, queue one phrase, and speak it.
engine = pyttsx3.init()
engine.say("My guy!")
engine.runAndWait()  # blocks until the queued speech has been spoken
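
As a small follow-on, a hedged sketch of adjusting the voice and speaking rate through pyttsx3's property API before speaking; the voice index and rate value are arbitrary examples.

import pyttsx3

engine = pyttsx3.init()

voices = engine.getProperty('voices')       # voices installed on this machine
engine.setProperty('voice', voices[0].id)   # pick the first voice; index 0 is arbitrary
engine.setProperty('rate', 150)             # speaking rate in words per minute (illustrative)
engine.say("Same engine, different voice and rate.")
engine.runAndWait()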