PyPI: llama-index

CVE-2024-12704

Safety vulnerability ID: 76267

This vulnerability was reviewed by experts

The information on this page was manually curated by our Cybersecurity Intelligence Team.

Created: Mar 20, 2025. Updated: Apr 02, 2025.

Advisory

A vulnerability in the LangChainLLM class of the run-llama/llama_index repository, version v0.12.5, allows a Denial of Service (DoS) attack. The stream_complete method executes the LLM on a separate thread and retrieves the result via the get_response_gen method of the StreamingGeneratorCallbackHandler class. If the thread terminates abnormally before llm.predict executes, no exception handling covers this case, and get_response_gen enters an infinite loop. The condition can be triggered by supplying an input of an incorrect type: the worker thread terminates, while the consuming process continues running indefinitely.
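The hazard described above can be sketched with a minimal producer/consumer pair. This is not the actual llama-index code; the worker function is a hypothetical stand-in for the threaded llm.predict call, and the fix shown (a sentinel enqueued in a finally block so the consumer always terminates) is one common mitigation for this pattern, not necessarily the patch the maintainers shipped.

```python
import queue
import threading

_DONE = object()  # sentinel marking that the worker thread has exited

def worker(llm_input, q):
    # Hypothetical stand-in for the threaded llm.predict call.
    # A wrong input type raises here, killing the thread -- the
    # trigger described in the advisory.
    try:
        if not isinstance(llm_input, str):
            raise TypeError("prompt must be a string")
        for token in llm_input.split():
            q.put(token)
    except Exception:
        pass  # sketch only: real code should surface the error to the caller
    finally:
        # Without this sentinel, a consumer doing `while True: q.get()`
        # would block forever once the thread dies abnormally.
        q.put(_DONE)

def get_response_gen(q):
    # Consumer that terminates cleanly because the worker always
    # enqueues the sentinel, even on abnormal exit.
    while True:
        item = q.get()
        if item is _DONE:
            return
        yield item

q = queue.Queue()
t = threading.Thread(target=worker, args=(123, q))  # wrong type on purpose
t.start()
tokens = list(get_response_gen(q))  # returns [] instead of hanging
t.join()
```

With the sentinel removed, the same call to get_response_gen would block on q.get() forever, which is the indefinite hang the advisory reports.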

Affected package

llama-index

Latest version: 0.12.28

Interface between LLMs and your data

Affected versions

Fixed versions

Vulnerability changelog

This vulnerability has no description

Resources
