Manuel de Prada manueldeprada

manueldeprada / xdg-screen-cast.py
Created November 19, 2023 02:59
Modified screen-cast script in Python, adapted to fetch the screen resolution
#!/usr/bin/python3
import re
import signal

import dbus
from dbus.mainloop.glib import DBusGMainLoop
import gi
gi.require_version('Gst', '1.0')
from gi.repository import GLib
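The script fetches the screen resolution over D-Bus; a common companion step is pulling a WIDTHxHEIGHT pair out of a mode string with `re`, which the gist imports. A minimal sketch of that parsing side only (`parse_resolution` is a hypothetical helper, not part of the gist):

```python
import re

def parse_resolution(mode: str):
    """Extract (width, height) from a string like '1920x1080@60'.

    Hypothetical helper: the gist obtains the real values over D-Bus;
    this only illustrates the regex parsing step.
    """
    m = re.search(r"(\d+)x(\d+)", mode)
    if m is None:
        raise ValueError(f"no resolution found in {mode!r}")
    return int(m.group(1)), int(m.group(2))
```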
manueldeprada / README.md
Last active November 15, 2023 17:20
Running QAFactEval in 2023

First, you need to create a conda or venv environment. This mamba.yml file contains the required packages:

name: QAFactEval
channels:
  - conda-forge
dependencies:
  - python=3.8.18=hd12c33a_0_cpython
  - spacy=2.2.4
  - spacy-model-en_core_web_sm=2.2.5
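With the YAML above saved to disk (the filename `mamba.yml` is taken from the text; adjust if you name it differently), the environment can be created with mamba or conda:

```shell
# Create the environment from the spec above (filename assumed to be mamba.yml)
mamba env create -f mamba.yml   # or: conda env create -f mamba.yml
conda activate QAFactEval
```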
manueldeprada / example.py
Created July 27, 2023 16:07
Minimal example for Decoders usage
# pip install decoders
# This demonstration uses a fake toy transformer
# (https://manueldeprada.com/blog/posts/toy-probabilistic-transformer/)
# to test the correctness of the stochastic beam search implementation.
import torch
from decoders import inject_supervitamined_decoders, StochasticBeamSearchDecoder, FakeTransformer
from transformers import T5ForConditionalGeneration, T5Tokenizer


def test_fake_transformer():
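Stochastic beam search draws beams without replacement using the Gumbel-top-k trick: perturb each log-probability with Gumbel(0, 1) noise and keep the k largest perturbed values. A self-contained sketch of that core step for a single categorical distribution (pure Python, independent of the decoders package):

```python
import math
import random

def gumbel_top_k(log_probs, k, seed=None):
    """Sample k distinct indices without replacement from the categorical
    distribution given by log_probs, via the Gumbel-top-k trick."""
    rng = random.Random(seed)
    perturbed = []
    for i, lp in enumerate(log_probs):
        g = -math.log(-math.log(rng.random()))  # Gumbel(0, 1) sample
        perturbed.append((lp + g, i))
    perturbed.sort(reverse=True)
    return [i for _, i in perturbed[:k]]
```

Over a full beam search, the same perturb-and-top-k step is applied to expanded hypotheses at every time step, which is what the `StochasticBeamSearchDecoder` above implements inside HuggingFace generation.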
manueldeprada / fake_transformer.py
Last active July 27, 2023 10:34
A Toy Transformer for Debugging Generation Algorithms in HuggingFace🤗
import torch
from transformers import PreTrainedModel, PretrainedConfig, PreTrainedTokenizer, BatchEncoding
from transformers.modeling_outputs import Seq2SeqLMOutput


class FakeTransformerConfig(PretrainedConfig):
    model_type = "FakeTransformer"

    def __init__(self, vocab_size=4, **kwargs):
        super().__init__(pad_token_id=-1, eos_token_id=3, bos_token_id=0, **kwargs)
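The idea behind the toy transformer (per the linked blog post) is a model whose next-token distributions are fixed by hand, so the exact probability of every generated sequence is known in closed form and decoding algorithms can be checked against it. A minimal, framework-free sketch of that idea (the table of distributions below is an illustrative assumption, not the gist's actual values):

```python
# Vocab of 4 tokens, matching the config above: 0 = BOS, 3 = EOS.
# NEXT maps a prefix (tuple of token ids) to a hand-fixed next-token
# distribution; unknown prefixes fall back to a uniform distribution.
UNIFORM = [0.25, 0.25, 0.25, 0.25]
NEXT = {
    (0,): [0.0, 0.7, 0.2, 0.1],      # after BOS
    (0, 1): [0.0, 0.1, 0.3, 0.6],    # after BOS, token 1
}

def sequence_prob(tokens):
    """Exact probability of a token sequence that starts at BOS."""
    p = 1.0
    for t in range(1, len(tokens)):
        dist = NEXT.get(tuple(tokens[:t]), UNIFORM)
        p *= dist[tokens[t]]
    return p
```

Because `sequence_prob` is exact, a correct sampler's empirical frequencies (or a correct beam search's top hypotheses) can be compared directly against it.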