v0.5.3
Release date: 2024-02-29 01:01:31
Latest release of hiyouga/LLaMA-Factory: v0.6.3 (2024-04-21 23:43:07)
New features
- Support DoRA (Weight-Decomposed LoRA)
- Support QLoRA for the AWQ/AQLM quantized models, now 2-bit QLoRA is feasible
- Provide example scripts in https://github.com/hiyouga/LLaMA-Factory/tree/main/examples
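The DoRA feature above decomposes the fine-tuned weight into a magnitude vector and a normalized direction matrix on top of a standard LoRA update. A minimal NumPy sketch of that decomposition (toy sizes and random data are illustrative assumptions, not from this release):

```python
import numpy as np

# DoRA (Weight-Decomposed LoRA) sketch: the merged weight V = W0 + B @ A
# is split into a learnable magnitude m and a unit-norm direction V/||V||.
# Norms are taken per column here, an assumption about the decomposition axis.
rng = np.random.default_rng(0)
d_out, d_in, r = 4, 6, 2

W0 = rng.standard_normal((d_out, d_in))        # frozen pretrained weight
B = np.zeros((d_out, r))                       # LoRA factor B, zero-initialized
A = rng.standard_normal((r, d_in))             # LoRA factor A
m = np.linalg.norm(W0, axis=0, keepdims=True)  # magnitude, initialized from W0

V = W0 + B @ A                                           # LoRA-updated direction
W = m * V / np.linalg.norm(V, axis=0, keepdims=True)     # DoRA-composed weight

print(np.allclose(W, W0))  # True: with B == 0, DoRA reproduces W0 at init
```

Since B starts at zero and m is initialized to the column norms of W0, the composed weight equals the pretrained weight before any training step, so fine-tuning starts from the base model's behavior.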
New models
- Base models
  - Gemma (2B/7B)
- Instruct/Chat models
  - Gemma-it (2B/7B)
Bug fixes
- Add the flash-attn package for Windows users by @codemayq in #2514
- Fix the PPO trainer (#1163) by @stephen-nju in #2525
- Support Atom models by @Rayrtfr in #2531
- Support role in webui by @lungothrin in #2575
- Bump accelerate to 0.27.2 and fix #2552 by @Katehuuh in #2608
- Fix #2512 #2516 #2532 #2533 #2629