langfuse/langfuse
Fork: 300 · Star: 3344 (updated 2024-04-29 01:46:04)
License: NOASSERTION
Language: TypeScript
🪢 Open source LLM engineering platform: Observability, metrics, evals, prompt management, playground, datasets. 🍊YC W23
Latest release: v2.36.0 (2024-04-28 18:10:07)
Overview
Unmute video for voice-over
https://github.com/langfuse/langfuse/assets/2834609/a94062e9-c782-4ee9-af59-dee6370149a8
Develop
- Observability: Instrument your app and start ingesting traces to Langfuse (Quickstart, Integrations, Tracing)
- Langfuse UI: Inspect and debug complex logs (Demo, Tracing)
- Prompt Management: Manage, version and deploy prompts from within Langfuse (Prompt Management)
- Prompt Engineering: Test and iterate on your prompts with the LLM Playground
Monitor
- Analytics: Track metrics (cost, latency, quality) and gain insights from dashboards & data exports (Analytics)
- Evals: Collect and calculate scores for your LLM completions (Scores & Evaluations)
- Run model-based evaluations (Model-based evaluations) within Langfuse
- Collect user feedback (User Feedback)
- Manually score observations in Langfuse (Manual Scores)
Test
- Experiments: Track and test app behaviour before deploying a new version
- Datasets let you test expected input and output pairs and benchmark performance before deploying (Datasets)
- Track versions and releases in your application (Experimentation, Prompt Management)
Get started
Langfuse Cloud
Managed deployment by the Langfuse team, generous free-tier (hobby plan), no credit card required.
Localhost (docker)
```bash
# Clone repository
git clone https://github.com/langfuse/langfuse.git
cd langfuse

# Run server and database
docker compose up -d
```
→ Learn more about deploying locally
Self-host (docker)
Langfuse is simple to self-host and keep updated. It currently requires only a single docker container. → Self Hosting Instructions
Templated deployments: Railway, GCP Cloud Run, AWS Fargate, Kubernetes and others
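At its core, a self-hosted instance is the single application container pointed at a Postgres database. A minimal `docker run` sketch (all values are placeholders; the exact set of required environment variables is listed in the self-hosting docs):

```bash
# Sketch of a single-container deployment; assumes an existing Postgres
# database. All values are placeholders -- consult the self-hosting docs.
docker run --name langfuse \
  -e DATABASE_URL=postgresql://user:password@db-host:5432/langfuse \
  -e NEXTAUTH_URL=http://localhost:3000 \
  -e NEXTAUTH_SECRET=changeme \
  -e SALT=changeme \
  -p 3000:3000 \
  langfuse/langfuse
```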
Get Started
API Keys
You need a Langfuse public and secret key to get started. Sign up here and find them in your project settings.
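Once you have the keys, requests to the public API are authorized with HTTP Basic auth: the public key is the username and the secret key is the password. A minimal sketch with placeholder keys:

```python
import base64

# Placeholder keys -- substitute the real values from your project settings.
public_key = "pk-lf-1234567890"
secret_key = "sk-lf-1234567890"

# The public API uses HTTP Basic auth: public key as username,
# secret key as password, base64-encoded as "public:secret".
token = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()
print(f"Authorization: Basic {token}")
```

The SDKs build this header for you; constructing it manually is only needed when calling the API directly.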
Ingesting Data · Instrumenting Your Application
Note: We recommend using our fully async, typed SDKs that allow you to instrument any LLM application with any underlying model. They are available in Python (Decorators) & JS/TS. The SDKs will always be the most fully featured and stable way to ingest data into Langfuse.
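Conceptually, decorator-based instrumentation wraps each function, records timing plus inputs and outputs as an observation, and ships it asynchronously. The stdlib-only sketch below illustrates that pattern; it is not the actual Langfuse SDK API, where the decorator and background batching are provided for you:

```python
import functools
import time
import uuid

spans = []  # stand-in for the SDK's internal buffer


def observe(fn):
    """Sketch of decorator-based tracing: wrap a function, record a span."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        span = {"id": str(uuid.uuid4()), "name": fn.__name__, "start": time.time()}
        try:
            span["output"] = fn(*args, **kwargs)
            return span["output"]
        finally:
            span["end"] = time.time()
            spans.append(span)  # a real SDK would batch and send this asynchronously
    return wrapper


@observe
def generate_story(topic):
    # Stand-in for an LLM call.
    return f"A story about {topic}"


generate_story("observability")
print(spans[0]["name"])  # prints generate_story
```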
You may want to use another integration to get started quickly or to cover a use case that we do not yet support. However, we recommend migrating to the Langfuse SDKs over time to ensure performance and stability.
See the → Quickstart to integrate Langfuse.
Integrations
Integration | Supports | Description |
---|---|---|
SDK - recommended | Python, JS/TS | Manual instrumentation using the SDKs for full flexibility. |
OpenAI SDK | Python, JS/TS | Automated instrumentation of OpenAI SDK. |
Langchain | Python, JS/TS | Instrumentation via Langchain callbacks. |
LlamaIndex | Python | Automated instrumentation via LlamaIndex callback system. |
API | | Directly call the public API. OpenAPI spec available. |
External projects/packages that integrate with Langfuse:
Name | Description |
---|---|
LiteLLM | Use any LLM as a drop-in replacement for GPT. Use Azure, OpenAI, Cohere, Anthropic, Ollama, VLLM, Sagemaker, HuggingFace, Replicate (100+ LLMs). |
Flowise | JS/TS no-code builder for customized LLM flows. |
Langflow | Python-based UI for LangChain, designed with react-flow to provide an effortless way to experiment and prototype flows. |
Superagent | Open Source AI Assistant Framework & API for prototyping and deployment of agents. |
Questions and feedback
Ideas and roadmap
Support and feedback
In order of preference, the best ways to communicate with us are:
- GitHub Discussions: Contribute ideas, ask for support, and report bugs (preferred, as this creates a permanent, indexed artifact for other community members)
- Discord: community support
- Privately: contact at langfuse dot com
Contributing to Langfuse
- Vote on Ideas
- Raise and comment on Issues
- Open a PR - see CONTRIBUTING.md for details on how to set up a development environment.
License
This repository is MIT licensed, except for the ee folders. See LICENSE and docs for more details.
Misc
GET API to export your data
GET routes to use data in downstream applications (e.g. embedded analytics).
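For example, a trace export call is a plain authenticated GET. The sketch below builds (but does not send) such a request with Python's stdlib; the host, keys, and query parameters are placeholders, and the exact routes and parameters are defined in the OpenAPI spec:

```python
import base64
import urllib.request

# Placeholders -- substitute your host and project keys.
host = "https://cloud.langfuse.com"
public_key = "pk-lf-1234567890"
secret_key = "sk-lf-1234567890"

# GET routes use the same HTTP Basic auth as the rest of the public API.
token = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()
req = urllib.request.Request(
    f"{host}/api/public/traces?page=1&limit=50",  # route/params: see OpenAPI spec
    headers={"Authorization": f"Basic {token}"},
)
# urllib.request.urlopen(req) would perform the call; omitted here.
print(req.get_full_url())
```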
Security & Privacy
We take data security and privacy seriously. Please refer to our Security and Privacy page for more information.
Telemetry
By default, Langfuse automatically reports basic usage statistics of self-hosted instances to a centralized server (PostHog).
This helps us to:
- Understand how Langfuse is used and improve the most relevant features.
- Track overall usage for internal and external (e.g. fundraising) reporting.
None of the data is shared with third parties, and it does not include any sensitive information. We want to be fully transparent about this; you can find the exact data we collect here.
You can opt out by setting TELEMETRY_ENABLED=false.
Recent releases (data updated 2024-04-29 01:45:48):
2024-04-28 18:10:07 v2.36.0
2024-04-27 20:27:15 v2.35.0
2024-04-26 16:37:03 v2.34.0
2024-04-24 22:41:40 v2.32.0
2024-04-24 22:39:41 v2.33.0
2024-04-23 16:15:59 v2.31.0
2024-04-20 00:05:35 v2.30.2
2024-04-19 07:07:26 v2.30.1
2024-04-19 00:26:25 v2.30.0
2024-04-17 23:08:54 v2.29.2
Topics:
analytics, evals, evaluation, gpt, langchain, large-language-models, llama-index, llm, llm-evaluation, llmops, monitoring, observability, open-source, openai, playground, prompt-engineering, prompt-management, self-hosted, ycombinator