Executive Summary



This vulnerability is currently undergoing analysis and not all information is available. Please check back soon to view the completed vulnerability summary.
Information

Name : CVE-2024-12704
Vendor : Cve
First vendor Publication : 2025-03-20
Last vendor Modification : 2025-03-20

Security-Database Scoring CVSS v3

Cvss vector : N/A
Overall CVSS Score : N/A
Base Score : N/A
Environmental Score : N/A
Impact SubScore : N/A
Temporal Score : N/A
Exploitability Sub Score : N/A

Security-Database Scoring CVSS v2

Cvss vector : N/A
Cvss Base Score : N/A
Cvss Impact Score : N/A
Cvss Exploit Score : N/A
Attack Range : N/A
Attack Complexity : N/A
Authentication : N/A

Detail

A vulnerability in the LangChainLLM class of the run-llama/llama_index repository, version v0.12.5, allows for a Denial of Service (DoS) attack. The stream_complete method runs the LLM call in a separate thread and retrieves the result via the get_response_gen method of the StreamingGeneratorCallbackHandler class. If the thread terminates abnormally before _llm.predict executes, no exception handling covers this case, and the get_response_gen function enters an infinite loop. An attacker can trigger this by supplying an input of an incorrect type, causing the thread to terminate while the consuming process continues running indefinitely.
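The failure mode described above can be illustrated with a minimal, self-contained sketch. This is not the llama_index source; the class and function names below (StreamingHandler, stream_complete, on_token, on_done) are hypothetical stand-ins for the producer/consumer pattern the advisory describes: a worker thread feeds tokens into a queue, a generator drains the queue until it sees a completion sentinel, and without a guard the sentinel is never queued when the worker dies early.

```python
import queue
import threading

class StreamingHandler:
    """Hypothetical stand-in for a streaming callback handler:
    a worker thread pushes tokens into a queue, and a consumer
    generator reads them until it sees a completion sentinel."""
    _DONE = object()  # sentinel signalling normal completion

    def __init__(self):
        self._queue = queue.Queue()

    def on_token(self, token):
        self._queue.put(token)

    def on_done(self):
        self._queue.put(self._DONE)

    def get_response_gen(self, timeout=None):
        # With timeout=None this loop blocks forever if the worker
        # thread dies before on_done() is called -- the DoS condition
        # described in the advisory. A timeout turns the hang into
        # an explicit error.
        while True:
            try:
                item = self._queue.get(timeout=timeout)
            except queue.Empty:
                raise RuntimeError("worker produced no output; thread may have died")
            if item is self._DONE:
                return
            yield item

def stream_complete(predict_fn, handler, timeout=1.0):
    # Run the model call in a thread. If predict_fn raises (e.g. on a
    # wrongly typed input), the try/finally still queues the sentinel,
    # so the consumer always terminates instead of looping forever.
    def target():
        try:
            predict_fn(handler)
        finally:
            handler.on_done()
    threading.Thread(target=target, daemon=True).start()
    return handler.get_response_gen(timeout=timeout)
```

The two mitigations sketched here (a queue timeout in the consumer and a try/finally around the producer) are generic defensive patterns for this bug class; the actual fix applied upstream is in the commit linked below.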

Original Source

Url : http://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2024-12704

CWE : Common Weakness Enumeration

100% - CWE-755 : Improper Handling of Exceptional Conditions

Sources (Detail)

https://github.com/run-llama/llama_index/commit/d1ecfb77578d089cbe66728f18f63...
https://huntr.com/bounties/a0b638fd-21c6-4ba7-b381-6ab98472a02a

Alert History

Date Informations
2025-03-20 13:20:36
  • First insertion