May 30, 2025
Alert
NA - CVE-2025-48942 - vLLM is an inference and serving engine for...
vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, hitting the /v1/completions API with an invalid json_schema as a Guided Param...
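As a hedged illustration of the request shape involved (the `guided_json` parameter name and the payload below are assumptions based on vLLM's OpenAI-compatible API; the client-side sanity check is a mitigation sketch only, not the upstream fix, which is upgrading to vLLM 0.9.0 or later):

```python
import json

# Hypothetical request body for vLLM's OpenAI-compatible /v1/completions
# endpoint; "guided_json" carries the guided-decoding JSON schema. Per the
# advisory, an invalid schema could crash affected servers (>=0.8.0, <0.9.0).
payload = {
    "model": "my-model",  # placeholder model name
    "prompt": "List three colors as JSON.",
    "guided_json": {
        "type": "object",
        "properties": {"colors": {"type": "array"}},
    },
}

def schema_is_plausible(schema):
    """Cheap client-side sanity check: the schema must be a dict that
    serializes cleanly to JSON. A sketch of defensive validation before
    sending the request; it does not replace upgrading the server."""
    if not isinstance(schema, dict):
        return False
    try:
        json.dumps(schema)
        return True
    except (TypeError, ValueError):
        return False

assert schema_is_plausible(payload["guided_json"])
```

Checking the schema on the client does not protect a server that other clients can reach; the advisory's remedy is the version upgrade.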