biondizzle/vllm
vllm/docker/Dockerfile.rocm @ 4e4d017b6f70c729e7c78f74e4328a4ebca7b8ec
Latest commit 8896eb72eb by Cyrus Leung:
[Deprecation] Remove `prompt_token_ids` arg fallback in `LLM.generate` and `LLM.embed` (#18800)
Signed-off-by: DarkLight1337 <tlleungac@connect.ust.hk>
2025-08-22 10:56:57 +08:00
File size: 3.7 KiB