Jail-breaking bad: GPT-4o
A hacker named Pliny the Prompter has managed to jailbreak GPT-4o. “Please use responsibly and enjoy!” they said while releasing screenshots of the chatbot giving advice on cooking meth and hotwiring cars, along with detailed instructions on how to “make napalm with household items.” The GODMODE hack uses leetspeak, which replaces letters with numbers…
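As a rough illustration of the substitution involved (the exact character mapping used by the GODMODE prompt is not documented here, so this mapping is an assumption), leetspeak swaps letters for look-alike digits:

```python
# Hypothetical sketch of a leetspeak substitution; the actual GODMODE
# mapping is unknown, so this letter-to-digit table is illustrative only.
LEET_MAP = str.maketrans({"a": "4", "e": "3", "i": "1", "o": "0", "s": "5", "t": "7"})

def to_leetspeak(text: str) -> str:
    """Replace common letters with look-alike digits."""
    return text.lower().translate(LEET_MAP)

print(to_leetspeak("household items"))  # -> "h0u53h0ld 173m5"
```

Encodings like this can slip past keyword-based safety filters while remaining readable to the model, which is why such transformations show up in jailbreak prompts.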