v2.12.1
Release date: 2024-04-09 21:46:52
I'm happy to announce the v2.12.1 LocalAI release is out!
🌠 Landing page and Swagger
Ever wondered what to do after LocalAI is up and running? Work on a simple web interface has started, and you will now see a landing page when you open the LocalAI front page:
You can now also use the Swagger UI to try out API calls directly:
🌈 AIO images changes
The default model for CPU images is now https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B-GGUF, pre-configured for the functions and tools API! If you own an Intel GPU, the Intel profile for AIO images is now available too!
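To give a rough idea of how the preconfigured functions and tools API can be exercised, here is a minimal sketch using only the Python standard library. It assumes a LocalAI instance listening on `localhost:8080` and the `gpt-4` model alias that the AIO images preconfigure; the `get_weather` tool is purely hypothetical and exists only for illustration:

```python
import json
import urllib.request

# Assumed endpoint of a locally running LocalAI AIO container.
LOCALAI_URL = "http://localhost:8080/v1/chat/completions"

def build_tools_request(question: str) -> dict:
    """Build an OpenAI-style chat request advertising one callable tool."""
    return {
        "model": "gpt-4",  # alias assumed to be preconfigured by the AIO image
        "messages": [{"role": "user", "content": question}],
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool for illustration
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }],
    }

def call_localai(payload: dict) -> dict:
    """POST the payload to LocalAI and return the parsed JSON response."""
    req = urllib.request.Request(
        LOCALAI_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example (requires a running LocalAI instance):
# response = call_localai(build_tools_request("What is the weather in Rome?"))
# print(response["choices"][0]["message"].get("tool_calls"))
```

Because the endpoint is OpenAI-compatible, the same payload shape works with any OpenAI client pointed at your LocalAI base URL.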
:rocket: OpenVINO and transformers enhancements
The transformers backend now supports OpenVINO, and it also gained token streaming support, thanks to @fakezeta!
To try OpenVINO, you can use the example available in the documentation: https://localai.io/features/text-generation/#examples
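As a rough sketch of what such a setup looks like, here is a minimal model definition. The model name is a placeholder, and the exact field names and values should be verified against the linked documentation for your LocalAI version:

```yaml
# Hypothetical model file, e.g. models/openvino-example.yaml
name: openvino-example
backend: transformers
type: OVModelForCausalLM            # assumed to select the OpenVINO runtime
parameters:
  model: your-org/your-openvino-model   # placeholder: an OpenVINO-exported HF repo
```

With a file like this in your models directory, the model should be selectable by its `name` through the usual API endpoints.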
🎈 Lot of small improvements behind the scenes!
Thanks to our outstanding community, we have enhanced several areas:
- The build time of LocalAI was sped up significantly, thanks to @cryptk's work on the build system
- @thiner worked hard to bring Vision support to AutoGPTQ
- ... and much more! See below for the full list, and be sure to star LocalAI and give it a try!
📣 Spread the word!
First off, a massive thank you (again!) to each and every one of you who've chipped in to squash bugs and suggest cool new features for LocalAI. Your help, kind words, and brilliant ideas are truly appreciated - more than words can say!
And to those of you who've been heroes, giving up your own time to help out fellow users on Discord and in our repo, you're absolutely amazing. We couldn't have asked for a better community.
Just so you know, LocalAI doesn't have the luxury of big corporate sponsors behind it. It's all us, folks. So, if you've found value in what we're building together and want to keep the momentum going, consider showing your support. A little shoutout on your favorite social platforms using @LocalAI_OSS and @mudler_it or joining our sponsors can make a big difference.
Also, if you haven't yet joined our Discord, come on over! Here's the link: https://discord.gg/uJAeKSAGDy
Every bit of support, every mention, and every star adds up and helps us keep this ship sailing. Let's keep making LocalAI awesome together!
Thanks a ton, and here's to more exciting times ahead with LocalAI!
What's Changed
Bug fixes :bug:
- fix: downgrade torch by @mudler in https://github.com/mudler/LocalAI/pull/1902
- fix(aio): correctly detect intel systems by @mudler in https://github.com/mudler/LocalAI/pull/1931
- fix(swagger): do not specify a host by @mudler in https://github.com/mudler/LocalAI/pull/1930
- fix(tools): correctly render tools response in templates by @mudler in https://github.com/mudler/LocalAI/pull/1932
- fix(grammar): respect JSONmode and grammar from user input by @mudler in https://github.com/mudler/LocalAI/pull/1935
- fix(hermes-2-pro-mistral): add stopword for toolcall by @mudler in https://github.com/mudler/LocalAI/pull/1939
- fix(functions): respect when selected from string by @mudler in https://github.com/mudler/LocalAI/pull/1940
- fix: use exec in entrypoint scripts to fix signal handling by @cryptk in https://github.com/mudler/LocalAI/pull/1943
- fix(hermes-2-pro-mistral): correct stopwords by @mudler in https://github.com/mudler/LocalAI/pull/1947
- fix(welcome): stable model list by @mudler in https://github.com/mudler/LocalAI/pull/1949
- fix(ci): manually tag latest images by @mudler in https://github.com/mudler/LocalAI/pull/1948
- fix(seed): generate random seed per-request if -1 is set by @mudler in https://github.com/mudler/LocalAI/pull/1952
- fix regression #1971 by @fakezeta in https://github.com/mudler/LocalAI/pull/1972
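The per-request seed fix in the list above can be illustrated with a small sketch. This is not the actual LocalAI code; the function name and seed range are assumptions chosen for clarity:

```python
import random

def resolve_seed(requested: int) -> int:
    """Resolve the seed for a single request.

    -1 means "no fixed seed": draw a fresh random seed per request,
    so repeated requests are not accidentally deterministic.
    Any other value is honored as-is for reproducible sampling.
    """
    if requested == -1:
        return random.randint(0, 2**32 - 1)
    return requested
```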
Exciting New Features 🎉
- feat(aio): add intel profile by @mudler in https://github.com/mudler/LocalAI/pull/1901
- Enhance autogptq backend to support VL models by @thiner in https://github.com/mudler/LocalAI/pull/1860
- feat(assistant): Assistant and AssistantFiles api by @christ66 in https://github.com/mudler/LocalAI/pull/1803
- feat: Openvino runtime for transformer backend and streaming support for Openvino and CUDA by @fakezeta in https://github.com/mudler/LocalAI/pull/1892
- feat: Token Stream support for Transformer, fix: missing package for OpenVINO by @fakezeta in https://github.com/mudler/LocalAI/pull/1908
- feat(welcome): add simple welcome page by @mudler in https://github.com/mudler/LocalAI/pull/1912
- fix(build): better CI logging and correct some build failure modes in Makefile by @cryptk in https://github.com/mudler/LocalAI/pull/1899
- feat(webui): add partials, show backends associated to models by @mudler in https://github.com/mudler/LocalAI/pull/1922
- feat(swagger): Add swagger API doc by @mudler in https://github.com/mudler/LocalAI/pull/1926
- feat(build): adjust number of parallel make jobs by @cryptk in https://github.com/mudler/LocalAI/pull/1915
- feat(swagger): update by @mudler in https://github.com/mudler/LocalAI/pull/1929
- feat: first pass at improving logging by @cryptk in https://github.com/mudler/LocalAI/pull/1956
- fix(llama.cpp): set better defaults for llama.cpp by @mudler in https://github.com/mudler/LocalAI/pull/1961
📖 Documentation and examples
- docs(aio-usage): update docs to show examples by @mudler in https://github.com/mudler/LocalAI/pull/1921
👒 Dependencies
- :arrow_up: Update docs version mudler/LocalAI by @localai-bot in https://github.com/mudler/LocalAI/pull/1903
- :arrow_up: Update ggerganov/llama.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1904
- :arrow_up: Update M0Rf30/go-tiny-dream by @M0Rf30 in https://github.com/mudler/LocalAI/pull/1911
- :arrow_up: Update ggerganov/llama.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1913
- :arrow_up: Update ggerganov/whisper.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1914
- :arrow_up: Update ggerganov/llama.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1923
- :arrow_up: Update ggerganov/whisper.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1924
- :arrow_up: Update ggerganov/llama.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1928
- :arrow_up: Update ggerganov/whisper.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1933
- :arrow_up: Update ggerganov/llama.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1934
- :arrow_up: Update ggerganov/llama.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1937
- :arrow_up: Update ggerganov/llama.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1941
- :arrow_up: Update ggerganov/llama.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1953
- :arrow_up: Update ggerganov/whisper.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1958
- :arrow_up: Update ggerganov/llama.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1959
- :arrow_up: Update ggerganov/llama.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1964
Other Changes
- :arrow_up: Update ggerganov/whisper.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1927
- :arrow_up: Update ggerganov/llama.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1960
- fix(hermes-2-pro-mistral): correct dashes in template to suppress newlines by @mudler in https://github.com/mudler/LocalAI/pull/1966
- :arrow_up: Update ggerganov/whisper.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1969
- :arrow_up: Update ggerganov/llama.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1970
- :arrow_up: Update ggerganov/llama.cpp by @localai-bot in https://github.com/mudler/LocalAI/pull/1973
New Contributors
- @thiner made their first contribution in https://github.com/mudler/LocalAI/pull/1860
Full Changelog: https://github.com/mudler/LocalAI/compare/v2.11.0...v2.12.1
1. local-ai-avx-Darwin-x86_64 169.91MB
2. local-ai-avx-Linux-x86_64 199.32MB
3. local-ai-avx2-Darwin-x86_64 169.91MB
4. local-ai-avx2-Linux-x86_64 199.33MB
5. local-ai-avx512-Darwin-x86_64 169.98MB
6. local-ai-avx512-Linux-x86_64 199.36MB
7. local-ai-cuda11-Linux-x86_64 225.39MB
8. local-ai-cuda12-Linux-x86_64 225.68MB
9. stablediffusion 12.68MB