biondizzle/vllm
vllm/tests/models at commit 8aada19dfc7247f2fdf3935663d851c6d6b13039

Latest commit: Isotr0py 67ef8f666a [Model] Enable quantization support for transformers backend (#12960), 2025-02-17 19:52:47 -08:00
decoder_only             | [Model] Support Mamba2 (Codestral Mamba) (#9292)                                     | 2025-02-17 20:17:50 +08:00
embedding                | [VLM] Update compatibility with transformers 4.49                                    | 2025-02-05 19:09:45 -08:00
encoder_decoder          | [VLM] Implement merged multimodal processor for Mllama (#11427)                      | 2025-02-12 20:26:21 -08:00
fixtures                 | [CI/Build] Update pixtral tests to use JSON (#8436)                                  | 2024-09-13 03:47:52 +00:00
multimodal               | [VLM] Merged multi-modal processor for Molmo (#12966)                                | 2025-02-13 04:34:00 -08:00
__init__.py              | [CI/Build] Move test_utils.py to tests/utils.py (#4425)                              | 2024-05-13 23:50:09 +09:00
registry.py              | [Model] Support Mamba2 (Codestral Mamba) (#9292)                                     | 2025-02-17 20:17:50 +08:00
test_initialization.py   | [VLM] Separate text-only and vision variants of the same model architecture (#13157) | 2025-02-13 06:19:15 -08:00
test_oot_registration.py | [Model]: Add transformers backend support (#11330)                                   | 2025-02-03 21:30:38 +08:00
test_registry.py         | [Misc] Add SPDX-License-Identifier headers to python source files (#12628)           | 2025-02-02 11:58:18 -08:00
test_transformers.py     | [Model] Enable quantization support for transformers backend (#12960)                | 2025-02-17 19:52:47 -08:00
utils.py                 | [Misc] Add SPDX-License-Identifier headers to python source files (#12628)           | 2025-02-02 11:58:18 -08:00