CVE-2025-62426
BaseFortify
Publication date: 2025-11-21
Last updated on: 2025-12-04
Assigner: GitHub, Inc.
Description
CVSS Scores
EPSS Scores
Meta Information
Affected Vendors & Products
| Vendor | Product | Version / Range |
|---|---|---|
| vllm | vllm | From 0.5.5 (inc) to 0.11.1 (exc) |
| vllm | vllm | 0.11.1 |
Helpful Resources
Exploitability
| CWE ID | Description |
|---|---|
| CWE-770 | The product allocates a reusable resource or group of resources on behalf of an actor without imposing any intended restrictions on the size or number of resources that can be allocated. |
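The CWE-770 pattern above can be illustrated with a minimal sketch: the weakness is the *absence* of a bound on an actor-supplied allocation size, and the fix is to check that size before allocating. The cap value and function name here are illustrative, not taken from vLLM's code.

```python
MAX_ITEMS = 1_000  # illustrative limit, not a vLLM value


def allocate(n: int) -> list:
    # CWE-770 is precisely the absence of a check like this one:
    # the caller-supplied size must be bounded before any resource
    # is allocated on the caller's behalf.
    if n < 0 or n > MAX_ITEMS:
        raise ValueError(f"requested {n} items exceeds limit {MAX_ITEMS}")
    return [0] * n
```

Without the `if` guard, a single request asking for an arbitrarily large `n` could exhaust memory or block processing, which is the class of behavior this CVE describes.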
Attack-Flow Graph
AI Powered Q&A
Can you explain this vulnerability to me?
This vulnerability in vLLM versions 0.5.5 to before 0.11.1 involves the /v1/chat/completions and /tokenize endpoints accepting a chat_template_kwargs request parameter that is used before proper validation against the chat template. An attacker can craft specific chat_template_kwargs parameters to block the API server's processing for long periods, causing delays to all other requests. The issue was fixed in version 0.11.1.
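To make the attack surface concrete, the sketch below shows the shape of a request body carrying the `chat_template_kwargs` parameter described above. The `model` name and the `custom_var` key are placeholders (the actual template variables an attacker could abuse depend on the deployed chat template); only `chat_template_kwargs`, `messages`, and the `/v1/chat/completions` endpoint come from the advisory. No request is actually sent here.

```python
import json

# Illustrative request body for vLLM's /v1/chat/completions endpoint.
payload = {
    "model": "my-model",  # placeholder model name
    "messages": [{"role": "user", "content": "hi"}],
    # Attacker-controlled kwargs that, in vLLM >= 0.5.5 and < 0.11.1,
    # were used in chat-template processing before proper validation:
    "chat_template_kwargs": {"custom_var": "x"},  # placeholder key
}

# A client would POST this JSON to the server; serialized form:
body = json.dumps(payload)
```

Because the server consumed these kwargs before validating them against the chat template, a crafted value could tie up the API server's processing loop and stall all other requests.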
How can this vulnerability impact me?
The vulnerability can cause denial of service by blocking the API server's processing for extended periods, which delays or prevents other legitimate requests from being handled. This can disrupt availability and reliability of services relying on the vLLM API.
What immediate steps should I take to mitigate this vulnerability?
Upgrade vLLM to version 0.11.1 or later, where the vulnerability has been patched.
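If upgrading cannot happen immediately, one stopgap (a defense-in-depth sketch, not a substitute for the patch) is to strip the vulnerable parameter at a reverse proxy or middleware layer before requests reach the vLLM server. The function below is a hypothetical helper, not part of vLLM:

```python
def sanitize_request(body: dict) -> dict:
    """Drop the attacker-controllable parameter before forwarding.

    Stopgap only: removing chat_template_kwargs also disables any
    legitimate use of it, and upgrading to vLLM 0.11.1+ remains
    the actual fix.
    """
    cleaned = dict(body)  # avoid mutating the caller's dict
    cleaned.pop("chat_template_kwargs", None)
    return cleaned
```

Deploying such a filter in front of a vulnerable instance removes the input vector described in this CVE while the upgrade is scheduled.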