@jamescalam
Last active April 13, 2024 11:46
@PiGnotus

Hi there, and thank you for your awesome example!
While trying to run this in Colab (free edition), my session keeps crashing on the fine-tuning block because it runs out of RAM.
Any ideas on how to solve this issue?
Thanks in advance, and again, great example!

@k-praveen-trellis

Reduce the batch size. Peak memory during training scales with the per-step batch, and the free Colab tier has limited RAM, so a smaller batch is usually the quickest fix.
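One caveat with shrinking the batch is that it changes the effective batch size; gradient accumulation lets you keep it constant. The toy example below (plain Python, a hypothetical one-weight linear model, not the gist's actual code) shows why: averaging the gradients of equal-sized micro-batches gives exactly the full-batch gradient, so the update direction is unchanged while peak memory per step drops.

```python
# Toy check that gradient accumulation reproduces the full-batch gradient.
# Model: y_hat = w * x, loss = mean squared error over the batch.

def grad_w(w, batch):
    # d/dw of mean((w*x - y)^2) over the batch
    return sum(2 * (w * x - y) * x for x, y in batch) / len(batch)

data = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9), (4.0, 8.2)]
w = 0.5

# One step over the full batch of 4 (the setting that exhausts RAM).
full_grad = grad_w(w, data)

# Two micro-batches of 2, gradients averaged -> same update direction,
# but each step only needs activations for 2 samples in memory.
micro_batches = [data[:2], data[2:]]
accum_grad = sum(grad_w(w, mb) for mb in micro_batches) / len(micro_batches)

print(full_grad, accum_grad)  # identical up to float rounding
```

In a Hugging Face-style setup this corresponds to lowering the per-device batch size while raising the accumulation steps by the same factor, keeping their product (the effective batch) fixed.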

@ash-rulz

ash-rulz commented Dec 5, 2023

Is full fine-tuning happening here? I mean, are all the weights and biases of the base model being updated?

@dibyendubiswas1998

Yes, all of the base model's weights are updated according to the given dataset.
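In frameworks like PyTorch, what distinguishes full fine-tuning from partial fine-tuning is simply whether every parameter is left trainable (`requires_grad=True`) or some are frozen. The sketch below uses a minimal stand-in `Param` class and hypothetical layer names, not the gist's real model, to illustrate the two cases:

```python
# Stand-in for a framework parameter: just a name and a trainable flag.
class Param:
    def __init__(self, name, requires_grad=True):
        self.name = name
        self.requires_grad = requires_grad

# Hypothetical model: base encoder layers plus a task-specific head.
params = [
    Param("encoder.layer0.weight"),
    Param("encoder.layer1.weight"),
    Param("head.weight"),
]

def trainable(params):
    # The optimiser only updates parameters with requires_grad=True.
    return [p.name for p in params if p.requires_grad]

# Full fine-tuning: every parameter is trainable by default.
print(len(trainable(params)))  # 3

# Partial fine-tuning would instead freeze the base encoder:
for p in params:
    if p.name.startswith("encoder."):
        p.requires_grad = False
print(trainable(params))  # ['head.weight']
```

A quick way to confirm which case a real script is in is to count the parameters with `requires_grad=True` before training starts.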
