LLM attacks take just 42 seconds on average, 20% of jailbreaks succeed

Attacks on large language models (LLMs) take less than a minute to complete on average and leak sensitive data 90% of the time when successful, according to Pillar Security. Pillar's State of Attacks on GenAI report, published Wednesday, revealed new insights into LLM attacks and jailbreaks, based on telemetry data and real-life attack examples from …
