openchatai/OpenChat
Fork: 642 · Star: 5191 (updated 2024-11-05 10:35:04)
License: MIT
Language: JavaScript
LLMs custom-chatbots console ⚡
Latest release: 0.3.7-beta (2024-02-18 17:36:03)
🔥 OpenChat
OpenChat is an everyday-user chatbot console that simplifies working with large language models. As AI advances, installing and using these models can feel overwhelming. OpenChat addresses this with a two-step setup that gives you a comprehensive chatbot console, serving as a central hub for managing multiple customized chatbots.
Currently, OpenChat supports GPT models, and we are actively working on incorporating various open-source drivers that can be activated with a single click.
Try it out:
You can try it out on openchat.so
https://github.com/openchatai/OpenChat/assets/32633162/112a72a7-4314-474b-b7b5-91228558370c
Chinese video tutorial: https://www.bilibili.com/video/BV1YX4y1H7oN
🏁 Current Features
- Create unlimited local chatbots based on GPT-3 (and GPT-4 if available).
- Customize your chatbots by providing PDF files, websites, and soon, integrations with platforms like Notion, Confluence, and Office 365.
- Each chatbot has unlimited memory capacity, enabling seamless interaction with large files such as a 400-page PDF.
- Embed chatbots as widgets on your website or internal company tools.
- Use your entire codebase as a data source for your chatbots (pair programming mode).
- And much more!
🛣️ Roadmap:
- Create unlimited chatbots
- Share chatbots via URL
- Integrate chatbots on any website using JS (as a widget on the bottom right corner)
- Support GPT-3 models
- Support vector database to provide chatbots with larger memory
- Accept websites as a data source
- Accept PDF files as a data source
- Support multiple data sources per chatbot
- Support ingesting an entire codebase using GitHub API and use it as a data source with pair programming mode
- Support pre-defined messages with a single click
- Support offline vector DB
- Rewrite the backend in Python Django
- In progress: rewrite the frontend in Next.js & TypeScript
- Support Slack integration (allow users to connect chatbots with their Slack workspaces)
- Support Intercom integration (enable users to sync chat conversations with Intercom)
- Support offline open-source models (e.g., Alpaca, LLM drivers)
- Support Vertex AI and PaLM as LLMs
- Support Confluence, Notion, Office 365, and Google Workspace
- Refactor the codebase to be API ready
- Create a new UI designer for website-embedded chatbots
- Support custom input fields for chatbots
- Support offline usage: this is a major feature, OpenChat will operate fully offline with no internet connection at this stage (offline LLMs, offline Vector DBs)
We love hearing from you! Got any cool ideas or requests? We're all ears! So, if you have something in mind, give us a shout!
🚀 Getting Started
- Make sure you have Docker installed.
- Clone this Git repository:
git clone git@github.com:openchatai/OpenChat.git
Setting Up Your Environment
Note: As of July, Qdrant is our preferred open-source vector store 🚀. No Pinecone registration is required; see the comprehensive Using Qdrant guide in the section below.
Before you begin, make sure to update the common.env file with the necessary keys:
OPENAI_API_KEY=             # Retrieve from your openai.com account
PINECONE_API_KEY=           # Obtain from the "API Keys" tab in Pinecone (pinecone.io)
PINECONE_ENVIRONMENT=       # Obtain after creating your index in Pinecone
VECTOR_STORE_INDEX_NAME=    # Obtain after creating your index in Pinecone
STORE=pinecone
Using Azure OpenAI
- USE_AZURE_OPENAI=true: Whether to use the Azure OpenAI API.
- AZURE_OPENAI_API_KEY: Your Azure OpenAI API key.
- AZURE_OPENAI_API_INSTANCE_NAME: The name of your Azure OpenAI API instance.
- AZURE_OPENAI_API_COMPLETIONS_DEPLOYMENT_NAME: The name of the Azure OpenAI API deployment for completions.
- AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME: The name of the Azure OpenAI API deployment for embeddings.
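For reference, here is a minimal sketch of what the Azure-related block in common.env could look like; the instance and deployment names below are placeholders, not values taken from the repository:
USE_AZURE_OPENAI=true
AZURE_OPENAI_API_KEY=                                                     # your Azure OpenAI key
AZURE_OPENAI_API_INSTANCE_NAME=my-openai-instance                         # placeholder instance name
AZURE_OPENAI_API_COMPLETIONS_DEPLOYMENT_NAME=my-completions-deployment    # placeholder deployment name
AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME=my-embeddings-deployment      # placeholder deployment name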
Using Qdrant
If you want to switch from Pinecone to Qdrant, you can set the following environment variables:
- OPENAI_API_KEY: Your OpenAI API key.
- QDRANT_URL: The URL of the Qdrant server.
- STORE: The store used for embeddings. Can be qdrant or pinecone.
Optional [To modify the chat behaviour]
- CHAIN_TYPE: The type of chain to use: conversation_retrieval | retrieval_qa
  - retrieval_qa -> Learn more
  - conversation_retrieval -> Learn more
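For reference, a minimal sketch of a common.env configured for Qdrant; the URL below is a placeholder and assumes a Qdrant instance reachable on its default port 6333:
OPENAI_API_KEY=                       # your OpenAI API key
STORE=qdrant
QDRANT_URL=http://localhost:6333      # placeholder; point this at your Qdrant server
CHAIN_TYPE=conversation_retrieval     # optional; conversation_retrieval or retrieval_qa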
Using Prebuilt Images
If you're experiencing slow internet speeds or if Docker builds are taking a long time, consider using the prebuilt images for your architecture. Simply comment out the unnecessary image line in the docker-compose.yml file and uncomment the appropriate prebuilt image line.
Example:
# Mac environment
image: codebanesr/openchat_llm_server:edge_amd64
# Or, for Linux environment
image: codebanesr/openchat_llm_server:edge
Note: for the Pinecone index, make sure that the dimension is set to 1536 (the dimension of OpenAI's text-embedding-ada-002 embeddings).
- Navigate to the repository folder and run the following command (for MacOS or Linux):
make install
or, if you are using Windows:
make.bat
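Putting the steps above together, a typical first run on macOS or Linux looks roughly like this (fill in common.env with your own keys before installing):
git clone git@github.com:openchatai/OpenChat.git
cd OpenChat
$EDITOR common.env    # set the keys described above
make install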
Getting Started with the OpenChat Django App
Start contributing to and using OpenChat, now remade in Python, by following the instructions in the OpenChat Python Guide.
Please be aware that the transition to the Python backend includes a breaking change related to the Qdrant vector store.
Once the installation is complete, you can access the OpenChat console at: http://localhost:8000
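Once the containers are up, you can quickly confirm the console is reachable from a terminal:
curl -I http://localhost:8000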
🚀 Unleash the Power of Native LLM
Discover the latest addition: llama2 support. Dive into this Guide to Harness LLAMA2 by Meta 📖🔮
Full documentation available here
🚀 Upgrade guide:
We do our best not to introduce breaking changes. So far, you only need to git pull and run make install whenever there is a new update.
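In other words, an upgrade from the repository folder is just:
git pull
make install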
❤️ Thanks:
- To @mayooear for his work and tutorial on chatting with PDF files; we reused a lot of his code in the LLM server.
License
This project is licensed under the MIT License.
Contributors ✨
Thanks goes to these wonderful people (emoji key):
- Ikko Eltociear Ashimine 🤔 💻
- Joshua Sindy 🐛
- Erjan Kalybek 📖
- WoahAI 🐛 💻
- Tommy in Tongji 📖
- codebane 💻 📖
- lvalics 💻 📖
This project follows the all-contributors specification. Contributions of any kind welcome!