v0.9.8.3
Release date: 2024-08-19 06:36:11
Latest bghira/SimpleTuner release: v1.1.1 (2024-10-05 08:37:33)
What's Changed
General
- Non-BF16 capable optimisers removed in favour of a series of new Optimi options
- new `crop_aspect` option `closest` that uses `crop_aspect_buckets` as a list of options
- fewer images are discarded, and the minimum image size isn't set by default for you any longer
- better behaviour with mixed datasets, more equally sampling large and small sets
  - caveat: dreambooth training now probably wants `--data_backend_sampling=uniform` instead of auto-weighting
- multi-caption fixes, it was always using the first caption before (whoops)
- TF32 now enabled by default for users with `configure.py`
- New argument `--custom_transformer_model_name_or_path` to use a flat repository or local dir containing just the transformer model
- InternVL captioning script contributed by @frankchieng
- ability to change constant learning rate on resume
- fix SDXL controlnet training, allowing it to work with quanto
- DeepSpeed fixes (caveat: validations are broken)
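To illustrate the new `crop_aspect` behaviour, a dataloader entry might look like the following sketch. Key names and values are assumptions based on the flags above; check the SimpleTuner dataloader documentation for the exact schema:

```json
[
  {
    "id": "my-dataset",
    "type": "local",
    "instance_data_dir": "/data/images",
    "crop": true,
    "crop_aspect": "closest",
    "crop_aspect_buckets": [0.75, 1.0, 1.33, 1.77]
  }
]
```

With `closest`, each image is assigned to whichever bucket in `crop_aspect_buckets` is nearest to its own aspect ratio, which is why fewer images end up discarded.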
Flux
- New LoRA targets `ai-toolkit` and `context-ffs`, with `context-ffs` behaving more like text encoder training
- New LoRA training resumption support via `--init_lora`
- LyCORIS support
- Novel attention masking implementation via `--flux_attention_masked_training`, thanks to @AmericanPresidentJimmyCarter (#806)
- Schnell `--flux_fast_schedule` fixed (still not great)
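Taken together, the new Flux LoRA options might appear in a training invocation like this sketch. The script name and paths are placeholders; only the flags themselves come from this release:

```shell
# Sketch only: script name and paths are hypothetical placeholders.
bash train.sh \
  --flux_lora_target context-ffs \
  --init_lora /path/to/previous_lora.safetensors \
  --flux_attention_masked_training
```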
Pull Requests
- Fix --input_perturbation_steps so that it actually has an effect by @mhirki in https://github.com/bghira/SimpleTuner/pull/772
- add `ai-toolkit` option in --flux_lora_target choices by @benihime91 in https://github.com/bghira/SimpleTuner/pull/773
- Create caption_with_internvl.py by @frankchieng in https://github.com/bghira/SimpleTuner/pull/778
- Add LyCORIS training to SimpleTuner by @AmericanPresidentJimmyCarter in https://github.com/bghira/SimpleTuner/pull/776
- (#782) fix type comparison in configure script by @bghira in https://github.com/bghira/SimpleTuner/pull/783
- update path in documentation by @yggdrasil75 in https://github.com/bghira/SimpleTuner/pull/784
- Add Standard as default LoRA type by @AmericanPresidentJimmyCarter in https://github.com/bghira/SimpleTuner/pull/787
- Lora init from file by @kaibioinfo in https://github.com/bghira/SimpleTuner/pull/789
- wip: optimi by @bghira in https://github.com/bghira/SimpleTuner/pull/785
- add new lora option for context+ffs by @kaibioinfo in https://github.com/bghira/SimpleTuner/pull/795
- add auto-weighting for dataset selection with user probability modulation by @bghira in https://github.com/bghira/SimpleTuner/pull/797
- fix for sampling population smaller than request by @bghira in https://github.com/bghira/SimpleTuner/pull/802
- fix instance prompt sampling multiple prompts always taking the first… by @bghira in https://github.com/bghira/SimpleTuner/pull/801
- Fixes for --flux_fast_schedule by @mhirki in https://github.com/bghira/SimpleTuner/pull/803
- tf32, custom transformer paths, and error log for short batch, add fixed length calc by @bghira in https://github.com/bghira/SimpleTuner/pull/805
- Add attention masking to the custom helpers/flux/transformer.py by @AmericanPresidentJimmyCarter in https://github.com/bghira/SimpleTuner/pull/806
New Contributors
- @benihime91 made their first contribution in https://github.com/bghira/SimpleTuner/pull/773
- @frankchieng made their first contribution in https://github.com/bghira/SimpleTuner/pull/778
- @yggdrasil75 made their first contribution in https://github.com/bghira/SimpleTuner/pull/784
- @kaibioinfo made their first contribution in https://github.com/bghira/SimpleTuner/pull/789
Full Changelog: https://github.com/bghira/SimpleTuner/compare/v0.9.8.2...v0.9.8.3