v0.2.33
Released: 2023-11-22 17:26:31
Latest lm-sys/FastChat release: v0.2.36 (2024-02-11 23:40:27)
What's Changed
- fix: OpenOrcaAdapter now returns the correct conversation template by @vjsrinath in https://github.com/lm-sys/FastChat/pull/2613
- Make fastchat.serve.model_worker take a debug argument by @uinone in https://github.com/lm-sys/FastChat/pull/2628
- Add OpenChat 3.5 model support by @imoneoi in https://github.com/lm-sys/FastChat/pull/2638
- Add xFastTransformer framework support by @a3213105 in https://github.com/lm-sys/FastChat/pull/2615
- feat: support custom models for vLLM serving by @congchan in https://github.com/lm-sys/FastChat/pull/2635
- Kill only the FastChat process by @scenaristeur in https://github.com/lm-sys/FastChat/pull/2641
- Use conv.update_last_message api in mt-bench answer generation by @merrymercy in https://github.com/lm-sys/FastChat/pull/2647
- Improve Azure OpenAI interface by @infwinston in https://github.com/lm-sys/FastChat/pull/2651
- Add required_temp support in the JSONL format for flexible temperature settings in gen_api_answer by @CodingWithTim in https://github.com/lm-sys/FastChat/pull/2653
- Pin openai version < 1 by @infwinston in https://github.com/lm-sys/FastChat/pull/2658
- Remove exclude_unset parameter by @snapshotpl in https://github.com/lm-sys/FastChat/pull/2654
- Revert "Remove exclude_unset parameter" by @merrymercy in https://github.com/lm-sys/FastChat/pull/2666
- Add support for CodeGeeX(2) by @peterwilli in https://github.com/lm-sys/FastChat/pull/2645
- Add ChatGLM3 conv template support in conversation.py by @ZeyuTeng96 in https://github.com/lm-sys/FastChat/pull/2622
- UI and model change by @infwinston in https://github.com/lm-sys/FastChat/pull/2672
- train_flant5: fix typo by @Force1ess in https://github.com/lm-sys/FastChat/pull/2673
- Fix gpt template by @infwinston in https://github.com/lm-sys/FastChat/pull/2674
- Update README.md by @merrymercy in https://github.com/lm-sys/FastChat/pull/2679
- feat: support template's stop_str as list by @congchan in https://github.com/lm-sys/FastChat/pull/2678
- Update exllama_v2.md by @jm23jeffmorgan in https://github.com/lm-sys/FastChat/pull/2680
- Save model under DeepSpeed by @MrZhengXin in https://github.com/lm-sys/FastChat/pull/2689
- Add SSL support for model workers and the Hugging Face worker by @lnguyen in https://github.com/lm-sys/FastChat/pull/2687
- Check for max_new_tokens <= 0 in the OpenAI API server by @zeyugao in https://github.com/lm-sys/FastChat/pull/2688
- Add Microsoft/Orca-2-7b and update model support docs by @BabyChouSr in https://github.com/lm-sys/FastChat/pull/2714
- Fix tokenizer of ChatGLM2 by @wangshuai09 in https://github.com/lm-sys/FastChat/pull/2711
- Add template for using DeepSeek code models by @AmaleshV in https://github.com/lm-sys/FastChat/pull/2705
- Add support for Chinese-LLaMA-Alpaca by @zollty in https://github.com/lm-sys/FastChat/pull/2700
- Make --load-8bit flag work with weights in safetensors format by @xuguodong1999 in https://github.com/lm-sys/FastChat/pull/2698
- Format code and minor bug fix by @merrymercy in https://github.com/lm-sys/FastChat/pull/2716
- Bump version to v0.2.33 by @merrymercy in https://github.com/lm-sys/FastChat/pull/2717
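One change above (https://github.com/lm-sys/FastChat/pull/2678) lets a conversation template's stop_str be either a single string or a list of strings. A minimal sketch of what handling both forms looks like; `contains_stop_str` is a hypothetical helper for illustration, not FastChat's actual code:

```python
from typing import List, Optional, Union


def contains_stop_str(output: str, stop_str: Optional[Union[str, List[str]]]) -> bool:
    """Return True if the generated output contains any stop string.

    stop_str may be None, a single string, or a list of strings,
    mirroring the flexible template field introduced in this release.
    """
    if stop_str is None:
        return False
    if isinstance(stop_str, str):
        stop_str = [stop_str]
    return any(s in output for s in stop_str)
```

For example, `contains_stop_str("Hello</s>", ["</s>", "<|end|>"])` is `True`, so generation would stop; a template with a plain string `stop_str` keeps working unchanged.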
New Contributors
- @vjsrinath made their first contribution in https://github.com/lm-sys/FastChat/pull/2613
- @uinone made their first contribution in https://github.com/lm-sys/FastChat/pull/2628
- @a3213105 made their first contribution in https://github.com/lm-sys/FastChat/pull/2615
- @scenaristeur made their first contribution in https://github.com/lm-sys/FastChat/pull/2641
- @snapshotpl made their first contribution in https://github.com/lm-sys/FastChat/pull/2654
- @peterwilli made their first contribution in https://github.com/lm-sys/FastChat/pull/2645
- @ZeyuTeng96 made their first contribution in https://github.com/lm-sys/FastChat/pull/2622
- @Force1ess made their first contribution in https://github.com/lm-sys/FastChat/pull/2673
- @jm23jeffmorgan made their first contribution in https://github.com/lm-sys/FastChat/pull/2680
- @MrZhengXin made their first contribution in https://github.com/lm-sys/FastChat/pull/2689
- @lnguyen made their first contribution in https://github.com/lm-sys/FastChat/pull/2687
- @wangshuai09 made their first contribution in https://github.com/lm-sys/FastChat/pull/2711
- @AmaleshV made their first contribution in https://github.com/lm-sys/FastChat/pull/2705
- @zollty made their first contribution in https://github.com/lm-sys/FastChat/pull/2700
- @xuguodong1999 made their first contribution in https://github.com/lm-sys/FastChat/pull/2698
Full Changelog: https://github.com/lm-sys/FastChat/compare/v0.2.32...v0.2.33