Michaël Benesty pommedeterresautee
pommedeterresautee / dataset.py
Last active August 24, 2021 05:00
Decrease Hugging Face Transformers training times by 2 - dataset
def load_train_data(path: str, sort: bool) -> List[Example]:
    sentences = list()
    with open(path) as f:
        first = False
        for line in f:
            if not first:  # skip the TSV header line
                first = True
                continue
            text_a, text_b, label = line.rstrip().split("\t")
            lab = len(text_a) + len(text_b)
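The preview above is truncated. A minimal self-contained sketch of the uniform-length batching idea it implements — sorting examples by total text length so that each batch groups sequences of similar size — is shown below; the `Example` dataclass and its field names are assumptions, not the gist's actual types:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Example:
    text_a: str
    text_b: str
    label: str

def sort_by_length(examples: List[Example]) -> List[Example]:
    # Sort so that neighbouring examples have a similar total length;
    # batches drawn from contiguous runs then need far less padding.
    return sorted(examples, key=lambda e: len(e.text_a) + len(e.text_b))
```

With dynamic padding (see the collator below), this ordering is what makes most batches short and cheap to process.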
pommedeterresautee / collator.py
Last active May 20, 2020 07:02
Decrease Hugging Face Transformers training times by 2 - collator
# ...
def pad_seq(seq: List[int], max_batch_len: int, pad_value: int) -> List[int]:
    # IRL, use pad_sequence
    # https://pytorch.org/docs/master/generated/torch.nn.utils.rnn.pad_sequence.html
    return seq + (max_batch_len - len(seq)) * [pad_value]

@dataclass
class SmartCollator(DataCollator):
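The `SmartCollator` body is cut off in the preview. A plain-Python sketch of the dynamic-padding technique it implements — padding each batch only to the longest sequence *in that batch* rather than to a global maximum — might look like this (the `collate` function and its return keys are assumptions for illustration):

```python
from typing import Dict, List

def pad_seq(seq: List[int], max_batch_len: int, pad_value: int) -> List[int]:
    # Right-pad a single token-id sequence to the requested length.
    return seq + (max_batch_len - len(seq)) * [pad_value]

def collate(batch: List[List[int]], pad_value: int = 0) -> Dict[str, List[List[int]]]:
    # Dynamic padding: the target length is the longest sequence in
    # THIS batch, not a fixed model-wide maximum such as 512.
    max_len = max(len(seq) for seq in batch)
    input_ids = [pad_seq(seq, max_len, pad_value) for seq in batch]
    # 1 marks real tokens, 0 marks padding.
    attention_mask = [[1] * len(seq) + [0] * (max_len - len(seq)) for seq in batch]
    return {"input_ids": input_ids, "attention_mask": attention_mask}
```

Combined with the length-sorted dataset above, most batches end up much shorter than the model's maximum sequence length, which is where the training-time reduction comes from.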
pommedeterresautee / Makefile
Last active January 21, 2023 17:36
Divide Hugging Face Transformers training times by 2 or more with dynamic padding and uniform length batching
# bash is required by the multiline ( \ ... \ ) shell commands below
SHELL:=/bin/bash
VIRT_ENV_FOLDER = ~/.local/share/virtualenvs/xnli
SOURCE_VIRT_ENV = source $(VIRT_ENV_FOLDER)/bin/activate

.PHONY: train
train:
	( \
pommedeterresautee / Activity1.java
Last active April 11, 2017 15:10
Here is a simple implementation of a wrapper to execute an Observable in a dedicated Fragment. The main purpose is to manage screen rotation during the async execution of an Observable. In my application, several Activities implement the Observer interface without a Fragment, so this implementation is built with that in mind. Of course, it can be upd…
package com.pommedeterresautee.rxtest;
import android.content.Intent;
import android.os.Bundle;
import android.app.Activity;
import android.view.Menu;
import android.widget.TextView;
import rx.Observer;