biondizzle/vllm
vllm/vllm/lora @ b4fe16c75b437794900afcc3e1aa53df34e5ea38
Latest commit: a26f59ccbc — [Misc] Raise error for V1 not supporting Long LoRA. (#16415)
Author: Jee Jee Li (Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>), 2025-04-11 01:51:20 -07:00
ops/                      [Bugfix] LoRA : Fix the order in which the kernels process LoRAs (#16040)       2025-04-06 14:04:50 +00:00
punica_wrapper/           [Misc] Remove LoRA log (#15388)                                                  2025-03-24 20:43:48 -07:00
__init__.py               [Experimental] Add multi-LoRA support (#1804)                                    2024-01-23 15:26:37 -08:00
fully_sharded_layers.py   [Misc] Improve LoRA spelling (#13831)                                            2025-02-25 23:43:01 -08:00
layers.py                 [Core][LoRA][1/N] Add LoRA for EncoderDecoderModelRunner (#15990)                2025-04-11 15:32:37 +08:00
lora.py                   [Misc] Add SPDX-License-Identifier headers to python source files (#12628)       2025-02-02 11:58:18 -08:00
models.py                 [Misc] Raise error for V1 not supporting Long LoRA. (#16415)                     2025-04-11 01:51:20 -07:00
peft_helper.py            [Misc] Improve LoRA spelling (#13831)                                            2025-02-25 23:43:01 -08:00
request.py                [Misc] Add SPDX-License-Identifier headers to python source files (#12628)       2025-02-02 11:58:18 -08:00
utils.py                  [LoRA] Remove linear hack outside transformers backend (#14177)                  2025-03-05 15:06:28 +00:00
worker_manager.py         [Misc] Reduce LoRA-related static variable (#13166)                              2025-02-22 00:21:30 -08:00