CVE-2025-25183
vLLM using built-in hash() from Python 3.12 leads to predictable hash collisions in vLLM prefix cache
- Public Exploits: 0
- Exploited in Wild: No
- Decision: Track
Descriptions
vLLM is a high-throughput and memory-efficient inference and serving engine for LLMs. Maliciously constructed prompts can lead to hash collisions, resulting in cache reuse, which can interfere with subsequent responses and cause unintended behavior. Prefix caching makes use of Python's built-in hash() function. As of Python 3.12, the behavior of hash(None) has changed to be a predictable constant value, which makes it more feasible for someone to attempt to exploit hash collisions. The impact of a collision would be the reuse of cache entries that were generated from different content. Given knowledge of the prompts in use and the predictable hashing behavior, an attacker could intentionally populate the cache using a prompt known to collide with another prompt in use. This issue has been addressed in version 0.7.2 and all users are advised to upgrade. There are no known workarounds for this vulnerability.
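As background on the mechanism, the sketch below is a simplified illustration of a prefix cache keyed with Python's built-in hash(); it is not vLLM's actual implementation, and the names BLOCK_SIZE, block_key, and lookup_or_fill are hypothetical. It shows that hash(None) is a fixed constant on CPython 3.12+, so any cache key that folds in None becomes predictable across processes, unlike salted str/bytes hashes.

```python
# Simplified sketch of a hash()-keyed prefix cache (illustrative only; the
# names below are hypothetical and do not come from vLLM's code base).
import sys

BLOCK_SIZE = 4  # hypothetical number of tokens per cached block


def block_key(parent_key, tokens, extra=None):
    # Keys are chained: each block's key folds in the previous block's key.
    # `extra` models optional metadata that is often None; on CPython 3.12+
    # hash(None) is a fixed constant, so this part of the key is predictable
    # in every process, unlike salted str/bytes hashes.
    return hash((parent_key, tuple(tokens), extra))


cache = {}  # block key -> cached data (stand-in for reusable KV-cache blocks)


def lookup_or_fill(parent_key, tokens, data):
    key = block_key(parent_key, tokens)
    if key in cache:
        # Hit: content cached for *some* earlier prompt is reused as-is.
        return key, cache[key], True
    cache[key] = data
    return key, data, False


if __name__ == "__main__":
    print("Python", sys.version.split()[0], "hash(None) =", hash(None))
    key, data, hit = lookup_or_fill(None, [1, 2, 3, 4], "victim block")
    print("first lookup hit:", hit)
    # If an attacker can predict `key` for a victim prompt, they can try to
    # pre-populate the cache with a colliding block so the victim's request
    # reuses attacker-chosen cached content instead of freshly computed data.
```

As a general design observation, deriving such keys from a process-random seed or a keyed/cryptographic hash removes the cross-process predictability that the constant hash(None) introduces; see the referenced fix (vllm-project/vllm PR #12621) and the GHSA advisory for how the project addressed this in 0.7.2.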
CVSS Scores
SSVC
- Decision: Track
Timeline
- 2025-02-03 CVE Reserved
- 2025-02-07 CVE Published
- 2025-02-12 CVE Updated
- 2025-03-30 EPSS Updated
- Exploited in Wild: no date recorded
- KEV Due Date: none
- First Exploit: no date recorded
CWE
- CWE-354: Improper Validation of Integrity Check Value
CAPEC
References (3)
URL | Tag | Source
---|---|---
https://github.com/python/cpython/commit/432117cd1f59c76d97da2eaff55a7d758301dbc7 | X_refsource_misc |
https://github.com/vllm-project/vllm/pull/12621 | X_refsource_misc |
https://github.com/vllm-project/vllm/security/advisories/GHSA-rm76-4mrf-v9r8 | X_refsource_confirm |
Affected Vendors, Products, and Versions
Vendor | Product | Version | Status
---|---|---|---
Vllm-project | Vllm | < 0.7.2 | Affected