MyGit

intel-analytics/ipex-llm

Fork: 1203 Star: 6013 (updated 2024-05-08 21:23:26)

License: Apache-2.0

Language: Python

Accelerate local LLM inference and finetuning (LLaMA, Mistral, ChatGLM, Qwen, Baichuan, Mixtral, Gemma, etc.) on Intel CPU and GPU (e.g., local PC with iGPU, discrete GPU such as Arc, Flex and Max). A PyTorch LLM library that seamlessly integrates with llama.cpp, Ollama, HuggingFace, LangChain, LlamaIndex, DeepSpeed, vLLM, FastChat, etc.

Latest release: v2.4.0 (2023-11-13 10:02:20)

Official website · GitHub page


No README.md

Recent releases (data updated 2024-05-14 22:33:02):

2023-11-13 10:02:20 v2.4.0

2023-04-24 10:17:43 v2.3.0

2023-01-19 13:18:37 v2.2.0

2022-09-28 11:06:27 v2.1.0

2022-03-09 15:47:13 v2.0.0

2021-07-09 20:20:26 v0.13.0

2021-04-21 09:53:25 v0.12.2

2021-01-05 13:55:32 v0.12.1

2021-01-05 13:52:01 v0.11.1

2019-11-05 16:50:46 v0.10.0

Topics:

gpu, llm, pytorch, transformers

Recently updated Python repositories (same language as intel-analytics/ipex-llm):

2024-05-18 22:55:03 xtekky/gpt4free

2024-05-18 19:13:53 MetaCubeX/mihomo

2024-05-18 12:28:29 VikParuchuri/marker

2024-05-18 07:02:12 bridgecrewio/checkov

2024-05-18 00:28:45 huggingface/transformers

2024-05-17 15:17:13 xorbitsai/inference