v1.35.5
Release date: 2024-04-14 11:03:47
Latest BerriAI/litellm release: v1.44.15-stable (2024-09-04 00:07:25)
Full Changelog: https://github.com/BerriAI/litellm/compare/1.35.5.dev2...v1.35.5
Call 100+ LLMs, run /health checks on the Admin UI
👉 Edit + Test @langfuse and @SlackHQ configurations on the LiteLLM UI
🛠️ UI - fix for adding Azure OpenAI on the Admin UI
⚡️ [Fix] Load proxy models when proxy starts up
✅ [LiteLLM UI] Show error messages for 10-20s (h/t Graham Neubig for this request)
😇 QA - added tests for the /health endpoints on the Proxy
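The Langfuse and Slack settings edited on the UI correspond to the proxy's `config.yaml`. A minimal sketch of such a config (the model name, deployment, and environment-variable names here are placeholders, not values from this release; check the LiteLLM docs for the exact fields):

```yaml
model_list:
  - model_name: gpt-3.5-turbo           # alias exposed by the proxy
    litellm_params:
      model: azure/my-gpt35-deployment  # hypothetical Azure OpenAI deployment
      api_base: https://my-endpoint.openai.azure.com/
      api_key: os.environ/AZURE_API_KEY # read the key from the environment

litellm_settings:
  success_callback: ["langfuse"]        # log successful calls to Langfuse

general_settings:
  alerting: ["slack"]                   # send proxy alerts to Slack
```

With a config like this, `litellm --config config.yaml` starts the proxy, and the `/health`, `/health/liveliness`, and `/health/readiness` endpoints exercised in the load test below can be queried, e.g. `curl http://0.0.0.0:4000/health/liveliness`.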
Load Test LiteLLM Proxy Results
| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 92 | 97.29 | 1.50 | 0.0 | 450 | 0 | 86.18 | 926.68 |
| /health/liveliness | Passed ✅ | 78 | 80.18 | 15.30 | 0.0033 | 4580 | 1 | 74.22 | 1033.39 |
| /health/readiness | Passed ✅ | 78 | 80.95 | 15.34 | 0.0 | 4593 | 0 | 74.08 | 1307.74 |
| Aggregated | Passed ✅ | 78 | 81.35 | 32.15 | 0.0033 | 9623 | 1 | 74.08 | 1307.74 |
1. load_test.html (1.59 MB)
2. load_test_stats.csv (885 B)