Why are there fewer and fewer jailbreaks?

ChatGPT incorporates various restrictions to ensure responsible, ethical and law-abiding use. Nevertheless, some users have sought to bypass these restrictions and push the limits of its programming. This gave rise to the jailbreak phenomenon, which now seems to have subsided. Find out why.

Jailbreaks less useful than ChatGPT’s improved performance

Jailbreaks for ChatGPT appeared almost at the same time as the chatbot itself. The technique promised to unleash the full potential of the generative AI tool by removing the limitations imposed by its developers. As a result, users could personalize its responses, adjust the AI’s decision-making process and push its capabilities beyond their default parameters.

While jailbreaks boomed alongside the craze for early versions of ChatGPT, the practice has become increasingly rare. Experts attribute this declining popularity to the improved performance of the latest versions of ChatGPT, coupled with users’ better command of prompts.

A large proportion of ChatGPT users can now harness the power of well-crafted prompts to achieve objectives for which they would previously have needed a jailbreak. In addition, some startups have developed uncensored versions of the chatbot that do everything (or almost everything) asked of them, much like a jailbreak would.

Jailbreaks increasingly inaccessible

At first, jailbreaks were remarkably accessible. All users had to do was copy and paste prompts shared online on forums and in dedicated groups. With DAN (Do Anything Now), for example, a seemingly innocuous set of instructions pushed the chatbot to answer whatever was asked of it.

With the many enhancements and robust protections added to ChatGPT, it is now harder to fool the chatbot with these prompts. Jailbreaking requires complex techniques to have any chance of success, which discourages users. What’s more, OpenAI quickly patches any vulnerability as soon as jailbreakers share their exploits.

Finally, more and more jailbreakers are monetizing their work. Some ChatGPT users have even turned this practice into a lucrative business. These professional jailbreak creators design prompts that perform specific tasks.

They then offer them for sale on marketplaces such as PromptBase. Having to pay has made many users more reluctant. Jailbreaks haven’t disappeared completely; they have simply gone underground and become harder to access.