CVE-2025-32444 – AI Deep Analysis Summary

CVSS 10.0 · Critical

Q1: What is this vulnerability? (Essence + Consequences)

🚨 **Essence**: vLLM (versions 0.6.5 - 0.8.5) contains a critical insecure-deserialization flaw in its distributed KV transfer code. 📉 **Consequences**: Attackers can achieve **Remote Code Execution (RCE)** by sending maliciously crafted pickle payloads that the engine deserializes.…

Q2: Root Cause? (CWE/Flaw)

🛡️ **Root Cause**: **CWE-502** (Deserialization of Untrusted Data). 🐛 **Flaw**: The distributed KV-transfer code deserializes data received from the network with `pickle`, so whoever can reach the socket controls what gets unpickled.…
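To make the CWE-502 point concrete, here is a minimal, self-contained sketch (not vLLM code and not the PoC from Q6) of why calling `pickle.loads()` on attacker-controlled bytes amounts to code execution: the payload's `__reduce__` method tells the unpickler which callable to invoke during deserialization.

```python
import pickle

# Illustration only: an object can instruct the unpickler to call an arbitrary
# function while it is being deserialized, via __reduce__. A real attacker
# would name something like os.system here instead of print.
class Payload:
    def __reduce__(self):
        return (print, ("this ran inside pickle.loads()",))

malicious_bytes = pickle.dumps(Payload())

# The "victim" side: deserializing the untrusted bytes invokes the callable.
pickle.loads(malicious_bytes)
```

Because the unpickler runs whatever callable the payload names, validating the resulting object afterwards does not help; the damage happens inside `loads()` itself.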

Q3: Who is affected? (Versions/Components)

🎯 **Affected**: **vllm-project/vllm**. 📅 **Versions**: All releases from **0.6.5 up to 0.8.5**. 📦 **Component**: Specifically impacts the distributed KV transfer mechanisms (e.g., `mooncake_pipe.py`).…
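If you manage the Python environment directly, a small version probe can flag installs in the quoted range. The bounds below are an assumption based on this summary's "0.6.5 up to 0.8.5" wording, treating 0.8.5 as the first patched release; confirm the exact inclusive/exclusive bounds against GHSA-hj4w-hm2g-p6w5 before relying on it.

```python
from importlib.metadata import PackageNotFoundError, version
from packaging.version import Version  # 'packaging' ships with most environments

# Assumed bounds: 0.6.5 <= installed < 0.8.5 is treated as affected.
LOWER = Version("0.6.5")
FIRST_FIXED = Version("0.8.5")

try:
    installed = Version(version("vllm"))
except PackageNotFoundError:
    print("vllm is not installed in this environment")
else:
    if LOWER <= installed < FIRST_FIXED:
        print(f"vllm {installed} is in the affected range - investigate")
    else:
        print(f"vllm {installed} is outside the range assumed here")
```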

Q4: What can hackers do? (Privileges/Data)

💻 **Privileges**: Arbitrary code execution as the vLLM service user, which in typical deployments amounts to full control of the host. 🌐 **Data**: Complete compromise of the host server. 🚀 **Action**: Hackers can execute **any command** on the server hosting the vLLM instance.…

Q5: Is the exploitation threshold high? (Auth/Config)

🔓 **Threshold**: **LOW**. 🌍 **Access**: Network Accessible (AV:N). 🔑 **Auth**: **None Required** (PR:N). 👁️ **UI**: **None Required** (UI:N). 🤝 **Complexity**: **Low** (AC:L).…
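The summary quotes only the exploitability metrics; for the base score to reach 10.0, the remaining metrics must be Scope: Changed with High impact on confidentiality, integrity, and availability (S:C/C:H/I:H/A:H), which is an assumption consistent with the advisory's rating. A short CVSS v3.1 arithmetic check under that assumption:

```python
import math

# Metric weights from the CVSS v3.1 specification.
AV, AC, PR, UI = 0.85, 0.77, 0.85, 0.85   # Network / Low / None / None
C = I = A = 0.56                          # High impact (assumed, see above)
scope_changed = True                      # Assumed, see above

iss = 1 - (1 - C) * (1 - I) * (1 - A)
impact = (7.52 * (iss - 0.029) - 3.25 * (iss - 0.02) ** 15
          if scope_changed else 6.42 * iss)
exploitability = 8.22 * AV * AC * PR * UI

def roundup(x: float) -> float:
    """Simplified CVSS round-up to one decimal place."""
    return math.ceil(x * 10) / 10

raw = 1.08 * (impact + exploitability) if scope_changed else impact + exploitability
base = 0.0 if impact <= 0 else roundup(min(raw, 10))
print(base)  # 10.0
```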

Q6: Is there a public exploit? (PoC/In-the-Wild Exploitation)

🔥 **Public Exploit**: **YES**. 📂 **PoC Available**: Proof-of-Concept code is live on GitHub (`stuxbench/vLLM-CVE-2025-32444`). 🛠️ **Status**: Easy to run with Docker/UV.…

Q7: How to self-check? (Indicators/Scanning)

🔍 **Self-Check**: Scan for vLLM versions **0.6.5 - 0.8.5**. 📂 **Code Audit**: Look for `pickle.loads()` or `pickle.dumps()` in `vllm/distributed/kv_transfer/`.…
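A lightweight sketch of the code audit suggested above: it locates the installed `vllm` package and greps the KV-transfer directory for pickle calls. The directory layout is taken from this summary, so adjust the path if your tree differs (for example, a source checkout instead of site-packages).

```python
import importlib.util
from pathlib import Path

spec = importlib.util.find_spec("vllm")
if spec is None or spec.origin is None:
    raise SystemExit("vllm is not importable in this environment")

# vllm/distributed/kv_transfer/ relative to the installed package.
kv_transfer_dir = Path(spec.origin).parent / "distributed" / "kv_transfer"

for py_file in sorted(kv_transfer_dir.rglob("*.py")):
    for lineno, line in enumerate(py_file.read_text(errors="replace").splitlines(), 1):
        if "pickle.loads" in line or "pickle.dumps" in line:
            print(f"{py_file}:{lineno}: {line.strip()}")
```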

Q8: Is it officially fixed? (Patch/Mitigation)

🛠️ **Fix**: Official patches are referenced in the GitHub Security Advisories (GHSA-hj4w-hm2g-p6w5, GHSA-x3m8-f7g5-qhm7). 📝 **Commit**: Fix commit `a5450f11c95847cf51a17207af9a3ca5ab569b2c` is available.…
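If you deploy vLLM from a source checkout rather than a wheel, one way to check that the referenced commit is present is a `git merge-base --is-ancestor` query; the repository path below is a placeholder for your own checkout.

```python
import subprocess

FIX_COMMIT = "a5450f11c95847cf51a17207af9a3ca5ab569b2c"  # commit referenced above
REPO_PATH = "/opt/vllm"  # placeholder: your vllm-project/vllm checkout

# Exit code 0 means the fix commit is an ancestor of the checked-out revision.
result = subprocess.run(
    ["git", "-C", REPO_PATH, "merge-base", "--is-ancestor", FIX_COMMIT, "HEAD"],
    capture_output=True,
    text=True,
)
print("fix commit is in the current history" if result.returncode == 0
      else "fix commit not found (or not fetched) - verify manually")
```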

Q9: What if patching isn't possible? (Workaround)

🚧 **Workaround**: If patching is impossible, **isolate** the vLLM service. 🚫 **Network**: Block all external access to the vLLM API port.…
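To verify the isolation actually holds, a reachability probe like the one below can be run from a network position that should *not* have access. Host and port are placeholders (vLLM's OpenAI-compatible server listens on port 8000 by default, but your deployment may differ).

```python
import socket

VLLM_HOST = "10.0.0.5"   # placeholder: address of the vLLM server
VLLM_PORT = 8000         # placeholder: the exposed API port

try:
    with socket.create_connection((VLLM_HOST, VLLM_PORT), timeout=3):
        print("WARNING: the vLLM port is reachable from this network position")
except OSError:
    print("OK: connection refused, timed out, or filtered from here")
```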

Q10: Is it urgent? (Priority Suggestion)

🚨 **Urgency**: **CRITICAL / IMMEDIATE**. ⚡ **Priority**: **P0**. 📉 **Risk**: Unauthenticated RCE on production LLM infrastructure. 🏃 **Action**: Patch **NOW**.…