v0.1.16
Release date: 2024-03-29 18:32:38
What's Changed
- set dev version by @LZHgrla in https://github.com/InternLM/xtuner/pull/487
- Fix type error when the visual encoder is not CLIP by @hhaAndroid in https://github.com/InternLM/xtuner/pull/496
- [Feature] Support Sequence parallel by @HIT-cwh in https://github.com/InternLM/xtuner/pull/456
- [Bug] Fix bugs in flash_attn1_pytorch by @HIT-cwh in https://github.com/InternLM/xtuner/pull/513
- [Fix] delete cat in varlen attn by @HIT-cwh in https://github.com/InternLM/xtuner/pull/508
- bump version to 0.1.16 by @HIT-cwh in https://github.com/InternLM/xtuner/pull/520
- [Improve] Add `generation_kwargs` for `EvaluateChatHook` by @LZHgrla in https://github.com/InternLM/xtuner/pull/501
- [Bugs] Fix bugs when training in non-distributed env by @HIT-cwh in https://github.com/InternLM/xtuner/pull/522
- [Fix] Support transformers>=4.38 and require transformers>=4.36.0 by @HIT-cwh in https://github.com/InternLM/xtuner/pull/494
- [Fix] Fix throughput hook by @HIT-cwh in https://github.com/InternLM/xtuner/pull/527
- Update README.md by @JianxinDong in https://github.com/InternLM/xtuner/pull/528
- [Fix] dispatch InternLM RoPE by @HIT-cwh in https://github.com/InternLM/xtuner/pull/530
- Limit transformers != 4.38 by @HIT-cwh in https://github.com/InternLM/xtuner/pull/531
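Taken together, the two transformers-related entries above (require `transformers>=4.36.0` and exclude 4.38) amount to the combined version constraint `transformers>=4.36.0,!=4.38.*`. A minimal sketch of that check, assuming the exclusion covers every 4.38.x patch release and plain numeric version strings (the helper name `satisfies_xtuner_constraint` is hypothetical, not part of xtuner):

```python
def satisfies_xtuner_constraint(version: str) -> bool:
    """Return True if a transformers version string meets the
    constraint implied by this release: >=4.36.0 and not 4.38.x.

    Assumes a plain dotted numeric version like "4.37.2"; real
    dependency resolution would use PEP 440 specifiers instead.
    """
    parts = tuple(int(p) for p in version.split("."))
    # At least 4.36.0, but no 4.38.x release.
    return parts >= (4, 36, 0) and parts[:2] != (4, 38)
```

The same constraint could be expressed directly on install, e.g. `pip install "transformers>=4.36.0,!=4.38.0"`.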
New Contributors
- @hhaAndroid made their first contribution in https://github.com/InternLM/xtuner/pull/496
- @JianxinDong made their first contribution in https://github.com/InternLM/xtuner/pull/528
Full Changelog: https://github.com/InternLM/xtuner/compare/v0.1.15...v0.1.16