Artificial intelligence is increasingly dangerous: what is "Godmode GPT", the jailbroken chatbot that teaches you how to make napalm and cook meth in your kitchen?
The jailbroken version is based on OpenAI's latest language model, GPT-4o, and can bypass many of OpenAI's safeguards. PHOTO: OpenAI