xFormers - Toolbox to Accelerate Research on Transformers
Hackable and optimized Transformers building blocks, supporting a composable construction.
xFormers is:
- Customizable building blocks: Independent/customizable building blocks that can be used without boilerplate code. The components are domain-agnostic and xFormers is used by researchers in vision, NLP and more.
- Research first: xFormers contains bleeding-edge components that are not yet available in mainstream libraries like PyTorch.
- Built with efficiency in mind: Because speed of iteration matters, components are as fast and memory-efficient as possible. xFormers contains its own CUDA kernels, but dispatches to other libraries when relevant.
Installing xFormers
- (RECOMMENDED, linux) Install latest stable with conda: Requires PyTorch 2.5.1 installed with conda
# (python 3.10/3.11 only)
conda install xformers -c xformers
- (RECOMMENDED, linux & win) Install latest stable with pip: Requires PyTorch 2.5.1
# [linux only] cuda 11.8 version
pip3 install -U xformers --index-url https://download.pytorch.org/whl/cu118
# [linux only] cuda 12.1 version
pip3 install -U xformers --index-url https://download.pytorch.org/whl/cu121
# [linux & win] cuda 12.4 version
pip3 install -U xformers --index-url https://download.pytorch.org/whl/cu124
# [linux only] (EXPERIMENTAL) rocm 6.1 version
pip3 install -U xformers --index-url https://download.pytorch.org/whl/rocm6.1
- Development binaries:
# Use either conda or pip, same requirements as for the stable version above
conda install xformers -c xformers/label/dev
pip install --pre -U xformers
- Install from source: if you want to use it with another version of PyTorch, for instance (including nightly releases)
# (Optional) Makes the build much faster
pip install ninja
# Set TORCH_CUDA_ARCH_LIST if running and building on different GPU types
pip install -v -U git+https://github.com/facebookresearch/xformers.git@main#egg=xformers
# (this can take dozens of minutes)
Benchmarks
Memory-efficient MHA (setup: A100 on f16, measured total time for a forward+backward pass)
Note that this is exact attention, not an approximation, just by calling xformers.ops.memory_efficient_attention
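For illustration, here is a minimal sketch of calling that op directly (the shapes and the q/k/v names below are illustrative; inputs follow the [batch, sequence, heads, head_dim] layout and should sit on a CUDA device in half precision to hit the fast kernels):
# Minimal sketch: exact attention without materializing the full attention matrix.
import torch
import xformers.ops as xops

# [batch, seq_len, num_heads, head_dim], fp16 on GPU
q = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
v = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)

out = xops.memory_efficient_attention(q, k, v)  # same shape as q
print(out.shape)  # torch.Size([2, 1024, 8, 64])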
More benchmarks
xFormers provides many components, and more benchmarks are available in BENCHMARKS.md.
(Optional) Testing the installation
This command will provide information on an xFormers installation, and what kernels are built/available:
python -m xformers.info
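A quick check can also be done from Python; the snippet below is a minimal sketch that only verifies the package imports and prints version strings, and does not replace xformers.info for kernel availability:
# Sanity check: does xFormers import, and against which PyTorch/CUDA build?
import torch
import xformers

print("torch:", torch.__version__, "built with CUDA", torch.version.cuda)
print("xformers:", xformers.__version__)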
Using xFormers
Key Features
- Optimized building blocks, beyond PyTorch primitives
- Memory-efficient exact attention - up to 10x faster (a causal-masking sketch follows this list)
- sparse attention
- block-sparse attention
- fused softmax
- fused linear layer
- fused layer norm
- fused dropout(activation(x+bias))
- fused SwiGLU
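The memory-efficient attention above also composes with attention biases; the sketch below (shapes and dropout value are illustrative) uses xformers.ops.LowerTriangularMask for causal masking together with attention dropout:
# Sketch: causal (autoregressive) memory-efficient attention with dropout.
import torch
import xformers.ops as xops

q = torch.randn(1, 512, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

out = xops.memory_efficient_attention(
    q, k, v,
    attn_bias=xops.LowerTriangularMask(),  # causal mask, never materialized densely
    p=0.1,                                 # attention dropout probability
)
print(out.shape)  # same shape as q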
Install troubleshooting
Make sure that:
- NVCC and the current CUDA runtime match. Depending on your setup, you may be able to change the CUDA runtime with module unload cuda; module load cuda/xx.x, and possibly also nvcc
- the version of GCC that you're using matches the current NVCC capabilities
- the TORCH_CUDA_ARCH_LIST env variable is set to the architectures that you want to support. A suggested setup (slow to build but comprehensive) is export TORCH_CUDA_ARCH_LIST="6.0;6.1;6.2;7.0;7.2;7.5;8.0;8.6"
- If the build from source OOMs, it's possible to reduce the parallelism of ninja with MAX_JOBS (e.g. MAX_JOBS=2)
- If you encounter UnsatisfiableError when installing with conda, make sure you have PyTorch installed in your conda environment, and that your setup (PyTorch version, CUDA version, Python version, OS) matches an existing binary for xFormers
The sketch after this list prints the local versions relevant to these checks.
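As a small helper for the checks above (a sketch using only standard PyTorch calls), the snippet prints the CUDA runtime PyTorch was built with and the local GPU's compute capability, which maps directly to a TORCH_CUDA_ARCH_LIST entry (e.g. capability (8, 0) corresponds to "8.0"); the nvcc version still has to be checked separately with nvcc --version:
# Print the version information referenced in the checklist above.
import torch

print("PyTorch:", torch.__version__)
print("CUDA runtime PyTorch was built with:", torch.version.cuda)
if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability()
    print(f"GPU compute capability: {major}.{minor}")  # candidate TORCH_CUDA_ARCH_LIST entry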
License
xFormers has a BSD-style license, as found in the LICENSE file.
Citing xFormers
If you use xFormers in your publication, please cite it by using the following BibTeX entry.
@Misc{xFormers2022,
author = {Benjamin Lefaudeux and Francisco Massa and Diana Liskovich and Wenhan Xiong and Vittorio Caggiano and Sean Naren and Min Xu and Jieru Hu and Marta Tintore and Susan Zhang and Patrick Labatut and Daniel Haziza and Luca Wehrstedt and Jeremy Reizenstein and Grigory Sizov},
title = {xFormers: A modular and hackable Transformer modelling library},
howpublished = {\url{https://github.com/facebookresearch/xformers}},
year = {2022}
}
Credits
The following repositories are used in xFormers, either in close to original form or as an inspiration: