v0.2.0
Release date: 2024-06-02 13:03:15
Latest release of open-webui/open-webui: v0.3.28 (2024-09-25 00:52:38)
Release Note
📢 We are thrilled to announce that our project has been selected by the GitHub Accelerator program as one of the 11 projects shaping open-source AI! 🎉
This incredible milestone would not have been possible without the relentless support and contributions from our amazing community. Each bug report, feature suggestion, code contribution, and word of encouragement has played a vital role in our journey.
We are honored and revitalized by this recognition and are more committed than ever to pushing the boundaries of what open-source AI can achieve.
We can't wait to see what the future holds and continue this journey together. Thank you all for believing in and supporting us!
🚀 Onwards and upwards!
Sincerely, Open WebUI Team
[0.2.0] - 2024-06-01
Added
- 🔧 Pipelines Support: Open WebUI now includes a plugin framework for enhanced customization and functionality (https://github.com/open-webui/pipelines). Easily add custom logic and integrate Python libraries, from AI agents to home automation APIs; a minimal pipeline sketch follows this list.
- 🔗 Function Calling via Pipelines: Integrate function calling seamlessly through Pipelines.
- ⚖️ User Rate Limiting via Pipelines: Implement user-specific rate limits to manage API usage efficiently.
- 📊 Usage Monitoring with Langfuse: Track and analyze usage statistics with Langfuse integration through Pipelines.
- 🕒 Conversation Turn Limits: Set limits on conversation turns to manage interactions better through Pipelines.
- 🛡️ Toxic Message Filtering: Automatically filter out toxic messages to maintain a safe environment using Pipelines.
- 🔍 Web Search Support: Introducing built-in web search capabilities via the RAG API, allowing users to search with SearXNG, Google Programmable Search Engine, Brave Search, serpstack, and serper. Activate it effortlessly by adding the necessary variables under Document settings > Web Params.
- 🗂️ Models Workspace: Create and manage model presets for both the Ollama and OpenAI APIs. Note: The old Modelfiles workspace is deprecated.
- 🛠️ Model Builder Feature: Build and edit all models with a persistent builder mode.
- 🏷️ Model Tagging Support: Organize models with tagging features in the models workspace.
- 📋 Model Ordering Support: Effortlessly organize models by dragging and dropping them into the desired positions within the models workspace.
- 📈 OpenAI Generation Stats: Access detailed generation statistics for OpenAI models.
- 📅 System Prompt Variables: New variables added: '{{CURRENT_DATE}}' and '{{USER_NAME}}' for dynamic prompts.
- 📢 Global Banner Support: Manage global banners from admin settings > banners.
- 🗃️ Enhanced Archived Chats Modal: Search and export archived chats easily.
- 📂 Archive All Button: Quickly archive all chats from settings > chats.
- 🌐 Improved Translations: Added and improved translations for French, Croatian, Cebuano, and Vietnamese.
Fixed
- 🔍 Archived Chats Visibility: Resolved issue with archived chats not showing in the admin panel.
- 💬 Message Styling: Fixed styling issues affecting message appearance.
- 🔗 Shared Chat Responses: Corrected the issue where shared chat response messages were not read-only.
- 🖥️ UI Enhancement: Fixed the scrollbar overlapping issue with the message box in the user interface.
Changed
- 💾 User Settings Storage: User settings are now saved on the backend, ensuring consistency across all devices.
- 📡 Unified API Requests: The API request for retrieving models is now unified under '/api/models' for easier usage (see the request sketch after this list).
- 🔄 Versioning Update: Our versioning will now follow the format 0.x for major updates and 0.x.y for patches.
- 📦 Export All Chats (All Users): Moved this functionality to the Admin Panel settings for better organization and accessibility.
Removed
- 🚫 Bundled LiteLLM Support Deprecated: Migrate your LiteLLM config.yaml to a self-hosted LiteLLM instance. LiteLLM can still be added via OpenAI Connections. Download the LiteLLM config.yaml from admin settings > database > export LiteLLM config.yaml.
👏 Massive thanks to our incredible contributors for their hard work and dedication to making this release possible: @jasinliu, @sime2408, @cheahjs, @KodeurKubik, @que-nguyen, @aguvener, @silentoplayz, @arkohut, @reiebrole30, @not-nullptr
🚀 We'd like to extend a heartfelt thank you to our amazing sponsors for their generous support (Note: We've excluded private sponsors from this list. If you'd like to get featured here, feel free to reach out to us!): @github, @lukepiette, @roosi-gmbh, @rfernandez760, @kroonen, @GenieDev101, @Lance1101, @awaliuddin