biondizzle/vllm
Files in vllm/vllm/distributed at commit 8322d4e47f89f7985b9b3b808fc4ba8549d6afcd
Latest commit: 8322d4e47f by liranschour
Enable Cross layers KV cache layout at NIXL Connector V2 (#33339)
Signed-off-by: Liran Schour <lirans@il.ibm.com>
Signed-off-by: liranschour <liranschour@users.noreply.github.com>
Co-authored-by: Or Ozeri <or@ozery.com>
Co-authored-by: Nicolò Lucchesi <nicolo.lucchesi@gmail.com>
Co-authored-by: Nicolò Lucchesi <nlucches@redhat.com>
2026-02-05 02:17:02 -08:00
device_communicators/   [Bugfix][ROCm] Include float8_e4m3fnuz in NCCL Dtype Dispatching (#33713)   2026-02-04 05:36:29 -08:00
ec_transfer/            [EC Connector] Optimize remote cache check in scheduler (#32585)   2026-01-22 03:30:59 +00:00
eplb/                   Change the type signature of MixtureOfExperts.expert_weights to MutableSequence[Sequence[Tensor]] (#33573)   2026-02-04 17:02:46 -05:00
kv_transfer/            Enable Cross layers KV cache layout at NIXL Connector V2 (#33339)   2026-02-05 02:17:02 -08:00
__init__.py             [Misc] Add SPDX-FileCopyrightText (#19100)   2025-06-03 11:20:17 -07:00
communication_op.py     Update Optional[x] -> x | None and Union[x, y] to x | y (#26633)   2025-10-12 09:51:31 -07:00
kv_events.py            [Prefix Cache] Include lora_name in BlockStored event for deterministic KV-cache reconstruction (#27577)   2025-12-30 00:17:16 +00:00
parallel_state.py       [Misc] Replace Optional[X] with X | None syntax (#33332)   2026-01-30 01:56:59 -08:00
utils.py                [ez] Remove checks for torch version <= 2.8 (#33209)   2026-01-28 16:03:56 -05:00