leptonai/leptonai
Fork: 172 · Star: 2657 (updated 2024-11-26 15:25:05)
License: Apache-2.0
Language: Python
Latest release: 0.21.6 (2024-08-14 03:19:48)
Lepton AI
A Pythonic framework to simplify AI service building
Homepage • API Playground • Examples • Documentation • CLI References • Twitter • Blog
The LeptonAI Python library allows you to build an AI service from Python code with ease. Key features include:
- A Pythonic abstraction, Photon, allowing you to convert research and modeling code into a service with a few lines of code.
- Simple abstractions to launch models like those on HuggingFace in a few lines of code.
- Prebuilt examples for common models such as Llama, SDXL, Whisper, and others.
- AI-tailored batteries included, such as autobatching and background jobs.
- A client to automatically call your service like native Python functions.
- Pythonic configuration specs that are readily shipped in a cloud environment.
Getting started with a one-liner
Install the library with:
pip install -U leptonai
This installs the leptonai Python library, as well as the command-line interface lep. You can then launch a HuggingFace model, say gpt2, in one line of code:
lep photon runlocal --name gpt2 --model hf:gpt2
If you have access to the Llama2 model (apply for access here) and you have a reasonably sized GPU, you can launch it with:
# hint: you can also write `-n` and `-m` for short
lep photon runlocal -n llama2 -m hf:meta-llama/Llama-2-7b-chat-hf
(Be sure to use the -hf version for Llama2, which is compatible with HuggingFace pipelines.)
You can then access the service with:
from leptonai.client import Client, local
c = Client(local(port=8080))
# Use the following to print the doc
print(c.run.__doc__)
print(c.run(inputs="I enjoy walking with my cute dog"))
Fully managed Llama2 models and CodeLlama models can be found in the playground.
Many standard HuggingFace pipelines are supported; find more details in the documentation. Not all HuggingFace models are supported, though, as many of them contain custom code and are not standard pipelines. If you find a popular model you would like supported, please open an issue or a PR.
Checking out more examples
You can find more examples in the examples repository. For example, launch the Stable Diffusion XL model with:
git clone git@github.com:leptonai/examples.git
cd examples
lep photon runlocal -n sdxl -m advanced/sdxl/sdxl.py
Once the service is running, you can access it with:
from leptonai.client import Client, local
c = Client(local(port=8080))
img_content = c.run(prompt="a cat launching rocket", seed=1234)
with open("cat.png", "wb") as fid:
fid.write(img_content)
or access the mounted Gradio UI at http://localhost:8080/ui. Check the README file for more details.
A fully managed SDXL is hosted at https://dashboard.lepton.ai/playground/sdxl with API access.
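The client call above returns the generated image as raw bytes, which the example then writes to cat.png. Before writing to disk, you could sanity-check the payload. The sketch below is a generic helper using only the standard PNG file signature; it assumes the service returns PNG-encoded bytes, which is how the example above treats the response, but the helper name is ours, not part of the leptonai API:

```python
# Sanity-check helper (hypothetical, not part of leptonai): verify that a
# byte string returned by the service starts with the PNG file signature.
PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"


def looks_like_png(content: bytes) -> bool:
    """Return True if content begins with the 8-byte PNG signature."""
    return content[:8] == PNG_SIGNATURE
```

With this in place, you might guard the file write with `if looks_like_png(img_content):` to fail early on an unexpected response.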
Writing your own photons
Writing your own photon is simple: write a Python Photon class and decorate functions with @Photon.handler. As long as your input and output are JSON-serializable, you are good to go. For example, the following code launches a simple echo service:
# my_photon.py
from leptonai.photon import Photon

class Echo(Photon):
    @Photon.handler
    def echo(self, inputs: str) -> str:
        """
        A simple example to return the original input.
        """
        return inputs
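The JSON-serializability requirement above is easy to check up front: a value is a valid handler input or output if it survives a round trip through the standard json module. This sketch is independent of leptonai and only illustrates that constraint:

```python
import json


def is_json_serializable(value) -> bool:
    """Return True if value can be encoded by json.dumps (and hence is a
    valid payload for a Photon handler's input or output)."""
    try:
        json.dumps(value)
        return True
    except (TypeError, ValueError):
        return False
```

Strings, numbers, lists, and plain dicts pass this check; sets, raw bytes, and arbitrary objects do not and would need conversion before crossing the service boundary.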
You can then launch the service with:
lep photon runlocal -n echo -m my_photon.py
Then, you can use your service as follows:
from leptonai.client import Client, local
c = Client(local(port=8080))
# will print available paths
print(c.paths())
# will print the doc for c.echo. You can also use `c.echo?` in Jupyter.
print(c.echo.__doc__)
# will actually call echo.
c.echo(inputs="hello world")
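Since handler inputs and outputs are plain JSON, the service can also be reached without the leptonai client, by POSTing JSON to the handler's path. The sketch below assumes the echo handler is exposed as an HTTP POST endpoint at /echo on the local port used above; treat the URL and payload shape as assumptions drawn from the example, not a definitive API reference:

```python
import json
from urllib import request


def build_request(text: str,
                  url: str = "http://localhost:8080/echo") -> request.Request:
    """Build a JSON POST request carrying the echo handler's `inputs` field."""
    payload = json.dumps({"inputs": text}).encode("utf-8")
    return request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )


def call_echo(text: str):
    """Send the request and decode the JSON response (needs a running service)."""
    with request.urlopen(build_request(text)) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

This is essentially what the Client wrapper does for you, with the added convenience of path discovery and docstrings.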
For more details, check out the documentation and the examples.
Contributing
Contributions and collaborations are welcome and highly appreciated. Please check out the contributor guide for how to get involved.
License
The Lepton AI Python library is released under the Apache 2.0 license.
Developer Note: early development of LeptonAI was in a separate mono-repo, which is why you may see commits from the leptonai/lepton repo. We intend to use this open source repo as the source of truth going forward.
Recent releases (data updated 2024-09-10 16:45:52):
2024-08-14 03:19:48 0.21.6
2024-08-10 02:36:56 0.21.5
2024-07-31 04:44:19 0.21.4
2024-07-16 10:12:34 0.21.3
2024-07-16 05:17:28 0.21.2
2024-06-18 04:23:37 0.21.1
2024-06-12 22:36:13 0.21.0
2024-06-05 10:07:13 0.20.4
2024-06-03 17:05:57 0.20.3
2024-05-31 08:46:20 0.20.2
Topics:
artificial-intelligence, cloud, deep-learning, gpu, machine-learning, python