May 30, 2025
Alert
NA - CVE-2025-48944 - vLLM is an inference and serving engine for...
vLLM is an inference and serving engine for large language models (LLMs). In version 0.8.0 up to but excluding 0.9.0, the vLLM backend used with the /v1/chat/completions OpenAPI endpoint fails to...