Created January 28, 2023 01:34
2.1 FineTune Kohya
set -e

# Positional arguments: $1 = training image dir, $2 = model output dir, $3 = metadata dir
mkdir -p "$3"

# Merge DeepDanbooru tags into a caption metadata JSON, then clean the captions and tags.
python finetune/merge_dd_tags_to_metadata.py "$1" "$3/meta_cap_dd.json"
python finetune/clean_captions_and_tags.py "$1" "$3/meta_cap_dd.json" "$3/meta_clean.json"

# Bucket images by aspect ratio and precompute VAE latents for SD 2.1 (768x768 base).
python finetune/prepare_buckets_latents.py \
    "$1" "$3/meta_clean.json" "$3/meta_lat.json" \
    stabilityai/stable-diffusion-2-1 \
    --batch_size=12 \
    --max_resolution=768,768 \
    --mixed_precision=fp16 \
    --min_bucket_reso=64 \
    --max_bucket_reso=1536 \
    --v2

# Fine-tune Stable Diffusion 2.1 on the precomputed latents.
accelerate launch --num_cpu_threads_per_process 8 fine_tune.py \
    --pretrained_model_name_or_path=stabilityai/stable-diffusion-2-1 \
    --train_data_dir="$1" \
    --output_dir="$2" \
    --shuffle_caption \
    --train_batch_size=28 \
    --in_json="$3/meta_lat.json" \
    --learning_rate=5e-6 \
    --max_train_epochs=2 \
    --mixed_precision=fp16 \
    --gradient_checkpointing \
    --use_8bit_adam \
    --save_every_n_epochs=1 \
    --xformers \
    --v2 \
    --lr_scheduler=cosine_with_restarts
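Since the script fails obscurely if the three positional arguments are missing, a small guard at the top can help. This is a minimal sketch, not part of the original gist; the script name and argument roles (`train_data_dir`, `output_dir`, `meta_dir`) are assumptions inferred from how `$1`, `$2`, and `$3` are used above.

```shell
#!/bin/bash
# check_args: verify that exactly three arguments were supplied before
# running the pipeline. Argument roles (assumed):
#   $1 = training image dir, $2 = model output dir, $3 = metadata dir
check_args() {
  if [ "$#" -ne 3 ]; then
    echo "usage: finetune_kohya.sh <train_data_dir> <output_dir> <meta_dir>" >&2
    return 1
  fi
}

# Hypothetical invocation (paths are placeholders):
#   check_args ./train_images ./output ./meta
```

Calling `check_args "$@"` right after `set -e` would abort the script with a usage message instead of failing partway through the metadata steps.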