Safety vulnerability ID: 60218
The information on this page was manually curated by our Cybersecurity Intelligence Team.
Langchain 0.0.236 includes a fix for an arbitrary code execution vulnerability. In affected versions, an attacker can execute arbitrary code via the Python `exec` calls in `PALChain`.
https://github.com/langchain-ai/langchain/commit/8ba9835b925473655914f63822775679e03ea137
https://github.com/langchain-ai/langchain/commit/e294ba475a355feb95003ed8f1a2b99942509a9e
Latest version: 0.3.14
Building applications with LLMs through composability
An issue in Harrison Chase's langchain v0.0.194 allows an attacker to execute arbitrary code via `PALChain.from_math_prompt(llm).run`, which passes model-generated code to Python's `exec`. See CVE-2023-36095.
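To illustrate why this class of bug is exploitable, here is a minimal sketch of the vulnerable pattern. It is not LangChain's actual implementation; `fake_llm` and `pal_style_run` are hypothetical names standing in for the model call and the PAL-style chain. PAL-style chains ask an LLM to write a Python program that solves a problem, then run that program with `exec()`; if the input is attacker-controlled, prompt injection can make the "solution" be arbitrary code.

```python
def fake_llm(prompt: str) -> str:
    # Hypothetical model response: under prompt injection, the model can
    # emit arbitrary Python instead of a math solution.
    return "import os\nresult = 'attacker-controlled: ' + os.name"

def pal_style_run(prompt: str) -> str:
    # Sketch of the vulnerable flow: model output goes straight to exec().
    code = fake_llm(prompt)        # attacker-influenced program text
    namespace: dict = {}
    exec(code, namespace)          # executes whatever the model emitted
    return namespace["result"]

print(pal_style_run("What is 2 + 2?"))  # runs the injected code, not math
```

The fix in 0.0.236 hardens this path; as a general rule, model-generated code should be run in a sandboxed interpreter or not executed at all.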
MISC: http://langchain.com
MISC: https://github.com/hwchase17/langchain
MISC: https://github.com/langchain-ai/langchain/issues/5872