v0.0.14
Release date: 2024-01-16 22:21:23
:sparkles: SuperAGI v0.0.14 :sparkles:
:rocket: Enhanced Local LLM Support with Multi-GPU :tada:
New Feature Highlights :star2:
⚙️ Local Large Language Model (LLM) Integration:
- SuperAGI now supports the use of local large language models, allowing users to leverage their own models seamlessly within the SuperAGI framework.
- Easily configure and integrate your preferred LLMs for enhanced customization and control over your AI agents.
⚡️ Multi-GPU Support:
- SuperAGI now provides multi-GPU support for local model inference, improving performance and scalability (see the compose sketch below).
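The release notes do not show the contents of docker-compose-gpu.yml, but a GPU-enabled Compose file typically grants the containers GPU access through device reservations. The sketch below is an assumption of what that could look like for the backend and celery services named in the steps further down; the actual file may differ.

```yaml
# Hypothetical sketch: exposing GPUs to the backend and celery services
# via standard Compose device reservations (not the actual file contents).
services:
  backend:
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all          # expose all available GPUs for multi-GPU inference
              capabilities: [gpu]
  celery:
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```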
How to Use
To enable a local Large Language Model (LLM) with multi-GPU support, follow these steps:
LLM Integration:
- Add your model path to the celery and backend volumes in the `docker-compose-gpu.yml` file (a volume sketch follows this list).
- Run the command: `docker compose -f docker-compose-gpu.yml up --build`
- Open `localhost:3000` in your browser.
- Add a local LLM model from the model section.
- Use the added model for running your agents.
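As a rough sketch of the first step, the volume entries added to the celery and backend services could look like the following. The host path `/path/to/your/models` and the container mount point `/app/local_model_path` are placeholders, not values from the release notes; use the directory where your model weights live and the path your SuperAGI setup expects.

```yaml
# Hypothetical volume mapping in docker-compose-gpu.yml; paths are placeholders.
services:
  backend:
    volumes:
      - /path/to/your/models:/app/local_model_path   # mount local model weights into the backend
  celery:
    volumes:
      - /path/to/your/models:/app/local_model_path   # same mount for the celery worker
```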