adapters 3.1.0
Release date: 2022-09-15 17:39:42
Based on transformers v4.21.3
New
New adapter methods
- Add LoRA implementation (@calpt via #334, #399): Documentation
- Add (IA)^3 implementation (@calpt via #396): Documentation
- Add UniPELT implementation (@calpt via #407): Documentation (see the combined usage sketch after this list)
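A rough combined usage sketch for the three new methods, assuming the transformers.adapters API shipped with this release (config class names LoRAConfig, IA3Config and UniPELTConfig as given in the linked documentation):

```python
from transformers.adapters import AutoAdapterModel, LoRAConfig, IA3Config, UniPELTConfig

model = AutoAdapterModel.from_pretrained("roberta-base")

# LoRA: low-rank decomposition of weight updates
model.add_adapter("lora_adapter", config=LoRAConfig(r=8, alpha=16))

# (IA)^3: learned rescaling vectors for keys, values and FFN activations
model.add_adapter("ia3_adapter", config=IA3Config())

# UniPELT: gated combination of LoRA, prefix tuning and bottleneck adapters
model.add_adapter("unipelt_adapter", config=UniPELTConfig())

# Activate one of them and freeze the base model weights for training
model.train_adapter("lora_adapter")
```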
New model integrations
- Add Deberta and DebertaV2 integration (@hSterz via #340)
- Add Vision Transformer integration (@calpt via #363) (see the sketch below)
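A minimal sketch of the new model integrations. Checkpoint names are illustrative, and the head-adding methods follow the flex-head convention of the other model classes; exact method names should be checked against the documentation:

```python
from transformers.adapters import AutoAdapterModel

# DebertaV2 (e.g. DeBERTaV3 checkpoints) with a task adapter and classification head
deberta = AutoAdapterModel.from_pretrained("microsoft/deberta-v3-base")
deberta.add_adapter("sentiment")
deberta.add_classification_head("sentiment", num_labels=2)
deberta.train_adapter("sentiment")

# Vision Transformer with an adapter for image classification
vit = AutoAdapterModel.from_pretrained("google/vit-base-patch16-224-in21k")
vit.add_adapter("img_task")
vit.add_image_classification_head("img_task", num_labels=10)  # assumed head method for ViT
vit.train_adapter("img_task")
```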
Misc
- Add adapter_summary() method (@calpt via #371): More info
- Return AdapterFusion attentions using output_adapter_fusion_attentions argument (@calpt via #417): Documentation (see the sketch below)
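A short sketch of both additions, assuming an AdapterFusion setup; the adapter_fusion_attentions output attribute is taken from the linked documentation:

```python
from transformers import AutoTokenizer
from transformers.adapters import AutoAdapterModel
from transformers.adapters.composition import Fuse

model = AutoAdapterModel.from_pretrained("bert-base-uncased")
model.add_adapter("adapter_a")
model.add_adapter("adapter_b")
model.add_adapter_fusion(["adapter_a", "adapter_b"])
model.set_active_adapters(Fuse("adapter_a", "adapter_b"))

# Tabular overview of all loaded adapters (name, architecture, #params, ...)
print(model.adapter_summary())

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
inputs = tokenizer("Hello adapters!", return_tensors="pt")
outputs = model(**inputs, output_adapter_fusion_attentions=True)

# Per-layer AdapterFusion attention scores (attribute name per the documentation)
print(outputs.adapter_fusion_attentions)
```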
Changed
- Upgrade of underlying transformers version (@calpt via #344, #368, #404)
Fixed
- Infer label names for training for flex head models (@calpt via #367)
- Ensure root dir exists when saving all adapters/heads/fusions (@calpt via #375)
- Avoid attempting to set prediction head if non-existent (@calpt via #377)
- Fix T5EncoderModel adapter integration (@calpt via #376)
- Fix loading adapters together with full model (@calpt via #378)
- Multi-gpu support for prefix-tuning (@alexanderhanboli via #359)
- Fix issues with embedding training (@calpt via #386)
- Fix initialization of added embeddings (@calpt via #402)
- Fix model serialization using torch.save() & torch.load() (@calpt via #406) (see the sketch below)
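For reference, the serialization round trip targeted by the torch.save()/torch.load() fix looks roughly like this (file name and adapter name are illustrative):

```python
import torch
from transformers.adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("bert-base-uncased")
model.add_adapter("my_adapter")
model.set_active_adapters("my_adapter")

# Serialize and restore the full model object, adapter weights included
torch.save(model, "model_with_adapter.pt")
restored = torch.load("model_with_adapter.pt")
print(restored.adapter_summary())
```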