pytorch-checkpoint.py (forked from halitkalayci/pytorch-checkpoint.py)
Python: training with checkpoints using PyTorch
from transformers import Seq2SeqTrainer, Seq2SeqTrainingArguments, AutoModelForSeq2SeqLM, AutoTokenizer
from datasets import load_dataset
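
# Load the pretrained t5-small model and its matching tokenizer from the Hugging Face Hub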
model_name = "t5-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
dataset = load_dataset("xsum")  # Example: XSum news summarization dataset
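
# Preprocess: prepend the T5 "summarize:" task prefix and tokenize the article text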
def preprocess(example):
    inputs = tokenizer("summarize: " + example["document"], truncation=True, padding="max_length", max_length=512)