biondizzle / vllm
vllm / tests / v1 / sample (at commit 6a854c7a2bb5b8a2015bbd83d94d311b991ac45d)
Latest commit a12934d3ec: [V1][Core] min_p sampling support (#13191)
Signed-off-by: Aoyu <aoyuzhan@amazon.com>
Co-authored-by: Aoyu <aoyuzhan@amazon.com>
2025-02-14 15:50:05 -08:00
..
__init__.py           [V1] Adding min tokens/repetition/presence/frequency penalties to V1 sampler (#10681)   2024-12-26 19:02:58 +09:00
test_logprobs_e2e.py  Consolidate Llama model usage in tests (#13094)                                          2025-02-13 22:18:03 -08:00
test_logprobs.py      Consolidate Llama model usage in tests (#13094)                                          2025-02-13 22:18:03 -08:00
test_sampler.py       [V1][Core] min_p sampling support (#13191)                                               2025-02-14 15:50:05 -08:00
utils.py              [V1] Logprobs and prompt logprobs support (#9880)                                        2025-02-07 07:26:20 -08:00
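The latest commit in this directory adds min_p sampling to the V1 sampler. As a rough illustration of the technique being tested: min_p keeps only tokens whose probability is at least `min_p` times the probability of the most likely token, masking the rest before sampling. The sketch below is a minimal standalone version over a plain Python list (the function name and scalar-vector shape are illustrative assumptions; vLLM's actual sampler operates on batched GPU tensors).

```python
import math

def min_p_filter(logits, min_p):
    """Illustrative sketch of min_p filtering, not vLLM's actual code.

    Tokens whose softmax probability falls below
    min_p * (probability of the most likely token)
    are masked to -inf so they cannot be sampled.
    """
    # Numerically stable softmax over the raw logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]

    # The cutoff scales with the top token's probability.
    threshold = min_p * max(probs)

    # Mask sub-threshold tokens; keep the rest unchanged.
    return [x if p >= threshold else float("-inf")
            for x, p in zip(logits, probs)]

# The low-probability third token is masked; the first two survive.
filtered = min_p_filter([2.0, 1.0, -3.0], min_p=0.1)
```

Because the threshold is relative to the top token, min_p adapts to the shape of the distribution: a confident (peaked) distribution prunes aggressively, while a flat one keeps more candidates.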