Jailbreaking LLM-Controlled Robots – Schneier on Security
Surprising no one, it’s easy to trick an LLM-controlled robot into ignoring its safety instructions.

Posted on December 11, 2024 at 7:02 AM
