Prompt injection is a type of security vulnerability that can be exploited to control the behavior of a ChatGPT instance. By injecting malicious prompts into the system, an attacker can force the ChatGPT instance to do unintended actions. promptma…
Tags: data poisoning
Tags: prompt injection
The NCSC has warned about integrating LLMs into your own services or platforms. Prompt injection and data poisoning are just some of the risks.
The post Prompt injection could be the SQL injection of the future, warns NCSC appeared first on Malwarebytes Labs.