v0.2.0
Release date: 2024-04-25 21:54:25
This version is built for Hugging Face Transformers v4.39.x.
New
- Add support for QLoRA/QAdapter training via bitsandbytes (@calpt via #663): Notebook Tutorial (see the sketch below this list)
- Add dropout to bottleneck adapters (@calpt via #667)
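As a rough illustration of the new quantized training support, the sketch below loads a base model in 4-bit via bitsandbytes and attaches a trainable LoRA adapter on top; the model name, adapter name, and hyperparameters are placeholder assumptions, not values taken from this release.

```python
# Minimal sketch, assuming a CUDA setup with bitsandbytes installed.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
import adapters
from adapters import LoRAConfig

# Load the base model in 4-bit precision (QLoRA-style quantization).
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",      # placeholder model
    quantization_config=quant_config,
    device_map="auto",
)

# Attach and activate a trainable LoRA adapter; only adapter weights are updated.
adapters.init(model)
model.add_adapter("qlora_adapter", config=LoRAConfig(r=8, alpha=16))
model.train_adapter("qlora_adapter")
```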
Changed
- Upgrade supported Transformers version (@lenglaender via #654; @calpt via #686)
- Deprecate Hub repo in docs (@calpt via #668)
- Switch the resolving order when no source is specified in load_adapter() (@calpt via #681); see the sketch below this list
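A small sketch of the load_adapter() behavior this change touches; the adapter identifier below is only an example from the AdapterHub namespace on the Hugging Face Hub and stands in for any adapter you might load.

```python
# Minimal sketch, assuming a roberta-base checkpoint and an example adapter id.
from transformers import AutoModel
import adapters

model = AutoModel.from_pretrained("roberta-base")
adapters.init(model)

# With an explicit source, only the Hugging Face Hub is queried:
model.load_adapter("AdapterHub/roberta-base-pf-imdb", source="hf", load_as="imdb_hf")

# With no source, the library walks its default resolving order
# (the order that #681 changes):
model.load_adapter("AdapterHub/roberta-base-pf-imdb", load_as="imdb_auto")
```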
Fixed
- Fix DataParallel training with adapters (@calpt via #658)
- Fix embedding training bug (@hSterz via #655)
- Fix fp16/bf16 for Prefix Tuning (@calpt via #659)
- Fix training error with AdapterDrop and Prefix Tuning (@TimoImhof via #673)
- Fix default cache path for adapters loaded from the AdapterHub repo (@calpt via #676)
- Fix skipping of composition blocks in layers where they are not applicable (@calpt via #665)
- Fix UniPELT LoRA default config (@calpt via #682)
- Fix compatibility of adapters with HF Accelerate auto device-mapping (@calpt via #678); see the sketch after this list
- Use the default head dropout probability if none is provided by the model (@calpt via #685)
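Related to the Accelerate fix above, here is a minimal sketch of the auto device-mapping path with an adapter attached; the model and adapter names are placeholders, not part of the fix itself.

```python
# Minimal sketch, assuming accelerate is installed so device_map="auto" works.
from transformers import AutoModelForCausalLM
import adapters
from adapters import SeqBnConfig

model = AutoModelForCausalLM.from_pretrained(
    "gpt2",             # placeholder model
    device_map="auto",  # Accelerate distributes layers across available devices
)

# Add and activate a bottleneck adapter on the device-mapped model.
adapters.init(model)
model.add_adapter("bottleneck_adapter", config=SeqBnConfig())
model.set_active_adapters("bottleneck_adapter")
```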