biondizzle/vllm
Path: vllm/tests/v1/entrypoints/openai/serving_responses
Commit: 384dc7f77b61ba98555df11c122fae759d6ef97e
History

Latest commit: 9fe404ed04 by Chauncey
[Frontend] OpenAI Responses API supports Tool/Function calling with streaming (#29947)
Signed-off-by: chaunceyjiang <chaunceyjiang@gmail.com>
2026-03-12 15:03:50 +08:00
File                       Last commit                                                                                              Date
__init__.py                …
conftest.py                [Frontend][responsesAPI][1/n] convert responses API tool input to chat completions tool format (#28231)  2025-11-13 04:47:22 +00:00
test_basic.py              …
test_function_call.py      [Frontend] OpenAI Responses API supports Tool/Function calling with streaming (#29947)                   2026-03-12 15:03:50 +08:00
test_image.py              [Misc] Introduce encode_*_url utility function (#31208)                                                  2025-12-23 13:45:21 +00:00
test_stateful.py           [Tests] Replace flaky sleep with polling in test_background_cancel (#32986)                              2026-01-24 16:39:07 +00:00
test_structured_output.py  …