Repository: biondizzle/vllm
File: examples/offline_inference_vision_language_multi_image.py (at commit a95354a36ee65523a499b3eb42f70a4a0ea4322d)
Latest commit: 2ae25f79cf by Isotr0py, "[Model] Expose InternVL2 max_dynamic_patch as a mm_processor_kwarg (#8946)", 2024-09-30 13:01:20 +08:00
Size: 10 KiB