MyGit
🚩 Update notification received for a GitHub repository

nilsherzig/LLocalSearch

Forks: 285 · Stars: 4814 (updated 2024-04-25 10:41:33)

License: Apache-2.0

Language: Svelte


Latest release: v0.2 (2024-04-13 20:52:31)

GitHub URL


LLocalSearch

[!IMPORTANT] Discuss configurations and setups with other users at: https://discord.gg/Cm77Eav5mX. Help / Support is handled exclusively on GitHub to allow people with similar issues to find solutions more easily.

What it is

LLocalSearch is a completely locally running search aggregator using LLM Agents. The user can ask a question and the system will use a chain of LLMs to find the answer. The user can see the progress of the agents and the final answer. No OpenAI or Google API keys are needed.

Demo

Demo video: Screencast from 2024-04-21 22-16-23.webm

Features

  • 🕵️ Completely local (no need for API keys)
  • 💸 Runs on "low end" LLM Hardware (demo video uses a 7b model)
  • 🤓 Progress logs, allowing for a better understanding of the search process
  • 🤔 Follow-up questions
  • 📱 Mobile friendly interface
  • 🚀 Fast and easy to deploy with Docker Compose
  • 🌐 Web interface, allowing for easy access from any device
  • 💮 Handcrafted UI with light and dark mode

Status

This project is still in its very early days. Expect some bugs.

How it works

Please read the infra documentation for the most up-to-date picture of the architecture.

Install

Requirements

  • A running Ollama server, reachable from the container
    • GPU is not needed, but recommended
  • Docker Compose
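The requirement that the Ollama server be reachable *from the container* is a common stumbling block. Before bringing up the stack, you can sanity-check connectivity with a quick probe. The `/api/tags` endpoint is Ollama's standard list-models route; the `OLLAMA_HOST` variable below is just a convenient name for this snippet — adjust the address to wherever your server actually runs:

```shell
# Sanity-check that an Ollama server answers before bringing up the stack.
# Note: use the address the *container* will use (e.g. the Docker host's
# LAN IP rather than localhost, if Ollama runs outside Docker).
OLLAMA_HOST="${OLLAMA_HOST:-http://localhost:11434}"

if curl -sf --max-time 3 "$OLLAMA_HOST/api/tags" > /dev/null 2>&1; then
    STATUS="reachable"
else
    STATUS="unreachable"
fi
echo "Ollama at $OLLAMA_HOST is $STATUS"
```

If this reports "unreachable", fix the address (or firewall) before starting the containers — the agents cannot work without a model server.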

[!WARNING] Please read Ollama Setup Guide to get Ollama working with LLocalSearch.

Run the latest release

Recommended if you don't intend to develop on this project.

git clone https://github.com/nilsherzig/LLocalSearch.git
cd ./LLocalSearch
# 🔴 check the env vars inside the compose file (and `env-example` file) and change them if needed
docker-compose up 

🎉 You should now be able to open the web interface on http://localhost:3000. Nothing else is exposed by default.
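The most common edit to those env vars is pointing the backend at an Ollama server on a different machine. The real variable names come from the `env-example` file in the repository; the name used below is a hypothetical placeholder, not a confirmed setting:

```shell
# Hypothetical .env fragment -- take the real variable names from `env-example`.
# Points the backend at an Ollama server running on another host:
OLLAMA_HOST=http://192.168.0.20:11434
```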

Run the development version

Only recommended if you want to contribute to this project.

git clone https://github.com/nilsherzig/LLocalSearch.git
cd ./LLocalSearch
# 1. Make sure to check the env vars inside `docker-compose.dev.yaml`.
# 2. Double-check that you edited the dev compose file, not the regular one.

# 3. build the containers and start the services
make dev 
# Both front and backend will hot reload on code changes. 

If you don't have make installed, you can run the commands inside the Makefile manually.
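As a rough sketch, a `dev` target like this typically wraps a compose invocation against the dev file; the authoritative commands are in the repository's Makefile and may differ:

```shell
# Illustrative approximation of `make dev` -- the real commands live in
# the repository's Makefile; run those if this diverges.
docker compose -f docker-compose.dev.yaml up --build
```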

Now you should be able to access the frontend on http://localhost:3000.

Recent releases (data updated 2024-04-25 10:41:17):

2024-04-13 20:52:31 v0.2

2024-04-04 06:44:59 v0.1

Topics:

llm, search-engine

Recently updated Svelte repositories (same language as nilsherzig/LLocalSearch):

2024-05-03 12:44:14 saadeghi/daisyui

2024-05-03 01:47:19 windmill-labs/windmill

2024-04-26 20:20:58 goniszewski/grimoire

2024-04-26 14:42:04 hcengineering/platform

2024-04-21 08:41:14 open-webui/open-webui

2024-04-11 06:55:33 huntabyte/shadcn-svelte