vllm / examples / offline_inference / basic / embed.py
(at commit 4e4d017b6f70c729e7c78f74e4328a4ebca7b8ec)
Latest commit: 65f311ce59 by wang.yuqi
[Frontend] Add LLM.reward specific to reward models (#21720) ...
Signed-off-by: wang.yuqi <noooop@126.com>
2025-07-29 20:56:03 -07:00
File size: 1.4 KiB