CVE-2025-32444
CVSS Score
10.0 (Critical), CVSS v3.1
Description
vLLM is a high-throughput, memory-efficient inference and serving engine for LLMs. Versions from 0.6.5 up to (but not including) 0.8.5 are vulnerable to remote code execution when the vLLM integration with Mooncake is used, because pickle-based serialization is performed over unsecured ZeroMQ sockets. The vulnerable sockets were set to listen on all network interfaces, increasing the likelihood that an attacker can reach them and carry out an attack. vLLM instances that do not use the Mooncake integration are not affected. This issue has been patched in version 0.8.5.
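The root cause, unauthenticated pickle deserialization on a network-exposed ZeroMQ socket, is a well-known remote-code-execution pattern. The sketch below illustrates the general vulnerability class and a hardened alternative; it is not vLLM's or Mooncake's actual code, and the socket type, port, and function names are illustrative assumptions only.

```python
# Minimal sketch of the vulnerability class described above (pickle over an
# exposed ZeroMQ socket). Socket type, port, and message shape are assumptions,
# not vLLM/Mooncake internals.
import json
import pickle
import zmq


def insecure_receiver():
    """Receiver that trusts pickled bytes from any network peer."""
    ctx = zmq.Context()
    sock = ctx.socket(zmq.PULL)
    # Binding to all interfaces lets any host that can reach this port
    # deliver a payload -- the exposure called out in the description above.
    sock.bind("tcp://0.0.0.0:5555")
    data = sock.recv()
    # pickle.loads() will invoke attacker-controlled __reduce__ callables,
    # so a single crafted message yields remote code execution.
    return pickle.loads(data)


class MaliciousPayload:
    """What an attacker could send: __reduce__ tells pickle to call os.system."""

    def __reduce__(self):
        import os
        return (os.system, ("id",))  # arbitrary command runs on the receiver


def hardened_receiver():
    """Safer pattern: loopback-only bind and a data-only serialization format."""
    ctx = zmq.Context()
    sock = ctx.socket(zmq.PULL)
    sock.bind("tcp://127.0.0.1:5555")  # not reachable from other hosts
    return json.loads(sock.recv())     # no code execution on deserialization
```

Deserializing untrusted input with pickle is unsafe by design; the fix pattern is to restrict who can reach the socket and to use a serialization format that cannot encode executable callables.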
Available Exploits
Related News
CVE-2025-32444 (CVSS 10): Critical RCE Flaw in vLLM’s Mooncake Integration Exposes AI Infrastructure (Daily CyberSecurity). A critical security vulnerability has been disclosed in vLLM, a popular open-source library used for high-performance inference and serving of large language models.