biondizzle/vllm
vllm/tests/entrypoints/openai/test_serving_chat.py @ d2bc4510a42c3a1f00a68c4387d28fb1991f7dcb
Latest commit 3c10591ef2 by zifeitong: [Bugfix] Set SamplingParams.max_tokens for OpenAI requests if not provided by user (#6954), 2024-07-31 21:13:34 -07:00
File size: 2.8 KiB
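The commit message describes defaulting `SamplingParams.max_tokens` when an OpenAI-style request omits it. A minimal sketch of that idea, assuming the default is the remaining context window (model max length minus prompt length); the function and parameter names here are illustrative, not vLLM's actual API:

```python
# Hypothetical sketch of the behavior described in the commit message.
# If the request omits max_tokens, fall back to the space left in the
# model's context window; otherwise clamp the requested value to it.

def effective_max_tokens(requested, prompt_len, model_max_len):
    """Return the max_tokens value to use for a chat completion request.

    requested:     max_tokens from the request, or None if not provided
    prompt_len:    number of tokens in the prompt
    model_max_len: the model's maximum context length
    """
    remaining = model_max_len - prompt_len
    if requested is None:
        # Default: allow generation to fill the rest of the context.
        return remaining
    return min(requested, remaining)
```

For example, with a 4096-token context and a 100-token prompt, an omitted `max_tokens` would default to 3996 under this sketch.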