Replace FlashAttention with xformers (#70)
@@ -3,11 +3,7 @@
 ## Installation
 
 ```bash
-pip install psutil numpy ray torch
-pip install git+https://github.com/huggingface/transformers # Required for LLaMA.
-pip install sentencepiece # Required for LlamaTokenizer.
-pip install ninja # To parallelize the compilation of flash-attn.
-pip install flash-attn # This may take up to 10 mins.
+pip install ninja psutil numpy sentencepiece ray torch transformers xformers
 pip install -e .
 ```
 
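For context, the `xformers` package installed above exposes a memory-efficient attention kernel that takes the place of `flash-attn` in this change. Below is a minimal, illustrative sketch of calling that kernel; `xformers.ops.memory_efficient_attention` and `LowerTriangularMask` are real xformers APIs, but the tensor shapes and values are assumptions for demonstration and are not taken from this commit.

```python
# Illustrative sketch (not from this commit): memory-efficient attention
# via xformers, the package that replaces flash-attn in the install step.
import torch
import xformers.ops as xops

# Assumed shapes: (batch, seq_len, num_heads, head_dim).
q = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)
v = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)

# Causal (lower-triangular) bias, as used for autoregressive decoding.
out = xops.memory_efficient_attention(
    q, k, v, attn_bias=xops.LowerTriangularMask()
)
print(out.shape)  # torch.Size([1, 128, 8, 64])
```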