Last active: April 13, 2024 11:46
Reduce the batch size.
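To make the suggestion concrete: a smaller batch lowers peak RAM, and gradient accumulation lets you keep the same *effective* batch size by averaging gradients over several small batches before each optimizer step. (With the Hugging Face `Trainer` this maps to the `per_device_train_batch_size` and `gradient_accumulation_steps` arguments.) A minimal toy sketch of the idea, using a single scalar weight rather than a real model:

```python
def train_with_accumulation(batch_grads, accumulation_steps, lr=0.01):
    """Toy loop: average gradients over `accumulation_steps` small
    batches, then apply one optimizer step (plain SGD on a scalar)."""
    weight = 0.0
    accum = 0.0
    for i, grad in enumerate(batch_grads, start=1):
        accum += grad
        if i % accumulation_steps == 0:
            weight -= lr * (accum / accumulation_steps)
            accum = 0.0
    return weight

# Four small batches with accumulation behave like two big batches:
# each optimizer step sees the averaged gradient of two mini-batches.
final_weight = train_with_accumulation([1.0, 1.0, 1.0, 1.0],
                                       accumulation_steps=2)
```

The numbers here are illustrative only; in practice you would shrink the batch until the session stops crashing and raise `accumulation_steps` to compensate.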
Is a full fine-tuning happening here? I mean, are all the weights and biases of the base model getting updated?
Yes, it is a full fine-tune: all of the base model's weights and biases are updated on the given dataset.
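The distinction can be sketched with a toy update step (the parameter names "embed", "block", "head" are made up for illustration): in a full fine-tune every parameter is trainable, whereas a frozen-backbone setup would exclude some parameters from the update.

```python
def sgd_step(params, grads, trainable, lr=0.1):
    """Apply one SGD update, but only to the trainable parameters,
    mimicking what a framework does with frozen layers."""
    return {
        name: (value - lr * grads[name]) if name in trainable else value
        for name, value in params.items()
    }

params = {"embed": 1.0, "block": 2.0, "head": 3.0}
grads = {"embed": 0.5, "block": 0.5, "head": 0.5}

# Full fine-tuning (what this gist does): every weight moves.
full = sgd_step(params, grads, trainable={"embed", "block", "head"})

# A partial fine-tune, for contrast: only the head moves.
partial = sgd_step(params, grads, trainable={"head"})
```

In PyTorch terms, a full fine-tune is simply every parameter having `requires_grad=True`, which is the default when you load a model and train it directly.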
Hi there and thank you for your awesome example!
While trying to run this in Colab (free edition), my session keeps crashing on the fine-tuning block because I run out of RAM.
Any ideas on how to solve this issue?
Thank you in advance, and again thank you for your great example!