A hacker created a GPT chat in “God Mode”!

A hacker going by the name Pliny the Prompter has announced that he jailbroke GPT-4o, OpenAI's latest model. The AI would thus be freed from its shackles and restrictions, in a liberated version renamed "GPT-4o Unchained", or even "Godmode GPT". The chatbot was usable in this "God Mode" until recently, and screenshots show it explaining, for example, how to prepare methamphetamine or napalm. This stunt did not please OpenAI at all.

The untethered clone of ChatGPT did not last long: OpenAI quickly banned it. © Pliny the Prompter

The company immediately took steps to block this unruly chatbot clone and make it disappear. The fact remains that ever since chatbots arrived, ChatGPT in particular, hackers have constantly been trying to break through their locks.

Through trial and error, companies like OpenAI have been able to strengthen the security of their systems, and jailbreaking has become more and more difficult. It still happens, though, even if the results don't last long. To activate this "God Mode", the hacker exploited what is called "leetspeak", a writing style in which some letters are replaced with look-alike numbers, for example 3 for E. However, despite OpenAI's security improvements, new methods are found regularly, so this cat-and-mouse game could continue forever.
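To illustrate the idea, here is a minimal sketch of leetspeak substitution in Python. The character map below is a common example set, not the specific mapping the hacker used; the point is simply that mapped letters become digits while the text stays readable to a human.

```python
# Minimal leetspeak sketch: replace some letters with look-alike
# digits (e.g. E -> 3). The mapping here is illustrative only.
LEET_MAP = {"e": "3", "a": "4", "i": "1", "o": "0", "t": "7"}

def to_leetspeak(text: str) -> str:
    """Replace mapped letters with their digit look-alikes."""
    return "".join(LEET_MAP.get(ch.lower(), ch) for ch in text)

print(to_leetspeak("leetspeak"))  # l337sp34k
```

A naive keyword filter looking for the literal word would no longer match the substituted form, which is why such encodings show up in jailbreak attempts.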



About the Author: Octávio Florencio

"Zombie evangelist. Thinker. Avid creator. Award-winning internet fanatic. Incurable web fanatic."
