From efdca026381a13319be082f079c60275cc871301 Mon Sep 17 00:00:00 2001
From: dan_the_3rd <43445237+danthe3rd@users.noreply.github.com>
Date: Wed, 16 Nov 2022 07:22:43 -0500
Subject: [PATCH] Improve installation doc with conda (#529)

Closes #390.

Co-authored-by: danthe3rd
---
 README.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 320eb4e36f..c5d12401f5 100644
--- a/README.md
+++ b/README.md
@@ -28,7 +28,7 @@ xFormers is:
 
 ## Installing xFormers
 
-* **(RECOMMENDED) Using binaries**: We provide binaries for Linux and recent PyTorch versions. Install xFormers with conda:
+* **(RECOMMENDED) Using binaries**: We provide binaries for Linux and recent PyTorch versions. After you have [installed PyTorch in conda](https://pytorch.org/get-started/locally/), install xFormers with conda:
 
 ```bash
 conda install xformers -c xformers/label/dev
@@ -219,6 +219,7 @@ Patrick et al., 2021](https://arxiv.org/abs/2106.05392)*
 * the version of GCC that you're using matches the current NVCC capabilities
 * the `TORCH_CUDA_ARCH_LIST` env variable is set to the architures that you want to support. A suggested setup (slow to build but comprehensive) is `export TORCH_CUDA_ARCH_LIST="6.0;6.1;6.2;7.0;7.2;7.5;8.0;8.6"`
 * If the build from source OOMs, it's possible to reduce the parallelism of ninja with `MAX_JOBS` (eg `MAX_JOBS=2`)
+* If you encounter [`UnsatisfiableError`](https://github.com/facebookresearch/xformers/issues/390#issuecomment-1315020700) when installing with conda, make sure PyTorch is installed in your conda environment, and that your setup (PyTorch version, CUDA version, Python version, OS) matches [an existing binary for xFormers](https://anaconda.org/xformers/xformers/files)
 
 ### License
 
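For reference, a minimal sketch of the install flow the patched README describes: PyTorch first, then xFormers from the `xformers/label/dev` channel. The environment name and the `pytorch-cuda=11.7` pin below are illustrative, not part of the patch; take the actual PyTorch install command from https://pytorch.org/get-started/locally/ so that the PyTorch/CUDA/Python combination matches an existing xFormers binary.

```bash
# Illustrative environment name and version pins; adjust to your setup.
conda create -n xformers-env python=3.10 -y
conda activate xformers-env

# Install PyTorch in the conda environment first (pick the command for your
# CUDA version from https://pytorch.org/get-started/locally/).
conda install pytorch pytorch-cuda=11.7 -c pytorch -c nvidia -y

# Then install xFormers, as in the patched README.
conda install xformers -c xformers/label/dev -y
```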
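Similarly, a sketch of the source-build troubleshooting settings shown in the second hunk, assuming a local checkout built with `pip install -e .` (the build command itself is an assumption; only the environment variables come from the README):

```bash
# Cover a broad set of GPU architectures (slow to build but comprehensive).
export TORCH_CUDA_ARCH_LIST="6.0;6.1;6.2;7.0;7.2;7.5;8.0;8.6"

# Limit ninja parallelism if the build runs out of memory.
MAX_JOBS=2 pip install -v -e .
```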