v1.0.0
Release date: 2024-08-10 23:47:41
Blog post: https://adapterhub.ml/blog/2024/08/adapters-update-reft-qlora-merging-models
This version is built for Hugging Face Transformers v4.43.x.
New Adapter Methods & Model Support
- Add Representation Fine-Tuning (ReFT) implementation (LoReFT, NoReFT, DiReFT) (@calpt via #705); see the ReFT sketch after this list
- Add LoRA weight merging with Task Arithmetics (@lenglaender via #698); see the merging sketch after this list
- Add Whisper model support + notebook (@TimoImhof via #693; @julian-fong via #717); see the Whisper sketch after this list
- Add Mistral model support (@KorventennFR via #609)
- Add PLBart model support (@FahadEbrahim via #709)
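For reference, a minimal sketch of adding a ReFT adapter, assuming the three variants are exposed as LoReftConfig, NoReftConfig, and DiReftConfig config classes; the checkpoint and adapter name here are placeholders:

```python
import adapters
from adapters import LoReftConfig  # assumed export; NoReftConfig / DiReftConfig analogous
from transformers import AutoModelForCausalLM

# Placeholder checkpoint; any model class supported by adapters should work.
model = AutoModelForCausalLM.from_pretrained("gpt2")
adapters.init(model)

config = LoReftConfig()
model.add_adapter("reft_adapter", config=config)
model.train_adapter("reft_adapter")  # freeze base weights, train only the ReFT parameters
```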
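A sketch of merging LoRA adapters via task arithmetics, assuming an average_adapter() method that takes a combine_strategy argument as described in the blog post; the adapter names and weights are illustrative:

```python
import adapters
from adapters import LoRAConfig
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("gpt2")
adapters.init(model)

# Two illustrative task-specific LoRA adapters (normally loaded after training).
model.add_adapter("lora_task_a", config=LoRAConfig(r=8))
model.add_adapter("lora_task_b", config=LoRAConfig(r=8))

# Weighted merge of the LoRA weights into a new adapter; a negative weight
# would subtract a task vector in the task-arithmetics sense.
model.average_adapter(
    adapter_name="lora_merged",
    adapter_list=["lora_task_a", "lora_task_b"],
    weights=[1.0, 0.5],
    combine_strategy="linear",
)
model.set_active_adapters("lora_merged")
```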
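Whisper models plug into the usual init-and-add workflow; a sketch, assuming the seq_bn bottleneck config shorthand, with Mistral and PLBart following the same pattern:

```python
import adapters
from transformers import WhisperForConditionalGeneration

model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny")
adapters.init(model)  # enable adapter support on the Transformers model

# Standard sequential bottleneck adapter via its config shorthand.
model.add_adapter("whisper_asr", config="seq_bn")
model.train_adapter("whisper_asr")
# Mistral (e.g. via AutoModelForCausalLM) and PLBart follow the same pattern.
```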
Breaking Changes & Deprecations
- Remove support for loading from the archived Hub repository (@calpt via #724)
- Remove deprecated add_fusion() & train_fusion() methods (@calpt via #714)
- Remove deprecated arguments in push_adapter_to_hub() method (@calpt via #724)
- Deprecate support for passing Python lists to adapter activation (@calpt via #714); see the migration sketch after this list
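A sketch of the suggested migration, assuming composition blocks such as Stack from adapters.composition are the supported replacement for plain lists:

```python
import adapters
from adapters.composition import Stack
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("gpt2")
adapters.init(model)
model.add_adapter("a")
model.add_adapter("b")

# Deprecated: model.set_active_adapters(["a", "b"])
# Preferred: an explicit composition block.
model.set_active_adapters(Stack("a", "b"))
```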
Minor Fixes & Changes
- Upgrade supported Transformers version (@calpt & @lenglaender via #712, #719, #727)
- Fix SDPA/Flash Attention support for Llama (@calpt via #722)
- Fix gradient checkpointing for Llama and for Bottleneck adapters (@calpt via #730); see the sketch below
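A sketch combining both fixes on a Llama-family model, assuming the standard Transformers attn_implementation argument and gradient_checkpointing_enable() method; the checkpoint is a placeholder:

```python
import adapters
from transformers import AutoModelForCausalLM

# Placeholder Llama-family checkpoint; SDPA is requested explicitly.
model = AutoModelForCausalLM.from_pretrained(
    "TinyLlama/TinyLlama-1.1B-Chat-v1.0", attn_implementation="sdpa"
)
adapters.init(model)
model.add_adapter("bn_adapter", config="seq_bn")
model.train_adapter("bn_adapter")

# Gradient checkpointing now composes with bottleneck adapters.
model.gradient_checkpointing_enable()
```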