
CVSS: 9.8 • EPSS: 0% • CPEs: 1 • EXPL: 2

In LangChain through 0.0.131, the LLMMathChain chain allows prompt injection attacks that can execute arbitrary code via the Python exec method.

• https://github.com/hwchase17/langchain/issues/1026
• https://github.com/hwchase17/langchain/issues/814
• https://github.com/hwchase17/langchain/pull/1119
• https://twitter.com/rharang/status/1641899743608463365/photo/1

• CWE-74: Improper Neutralization of Special Elements in Output Used by a Downstream Component ('Injection')
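To illustrate the class of flaw, the sketch below shows the general pattern (assumed function names; not LangChain's exact source): text produced by the LLM reaches Python's exec() without sanitization, so a prompt injection that steers the model into emitting code runs with the application's privileges.

```python
# Minimal sketch of the vulnerable pattern behind this CVE.
# evaluate_math() is a hypothetical stand-in for the chain step
# that "evaluates" the model's answer; LLMMathChain <= 0.0.131
# similarly handed model output to a Python exec-based REPL.

def evaluate_math(llm_output: str) -> None:
    """Execute the model's 'math' answer as Python code."""
    # The core flaw: model-controlled text reaches exec() unchecked.
    exec(llm_output)

# Benign model response:
evaluate_math("print(2 + 2)")

# Prompt-injected response: the attacker's question convinces the
# model to emit code instead of arithmetic, and it executes as
# arbitrary Python in the host process.
evaluate_math("import os; print(os.getcwd())")
```

Because the model's output is treated as trusted code, any input that influences the model (the user's question, retrieved documents, etc.) becomes a code-injection vector, which is why this is classified under CWE-74.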