PyPI: vllm

CVE-2024-8939

Safety vulnerability ID: 73285

This vulnerability was reviewed by experts

The information on this page was manually curated by our Cybersecurity Intelligence Team.

Created: Sep 17, 2024. Updated: Nov 15, 2024

Advisory

Improper handling of the 'best_of' parameter in the ilab model serve component's vllm JSON web API can lead to a Denial of Service (DoS) vulnerability. When attackers set this parameter to a large value during LLM-based sentence or chat completion requests, the API fails to manage timeouts or resource exhaustion effectively. This oversight allows malicious actors to consume excessive system resources, rendering the API unresponsive and blocking legitimate users from accessing the service. The vulnerability stems from the API's attempt to return the best completion from several options without proper safeguards against resource-intensive requests.

Affected package

vllm

Latest version: 0.6.4.post1

A high-throughput and memory-efficient inference and serving engine for LLMs

Affected versions

Fixed versions

Vulnerability changelog

This vulnerability has no description
