Are there any new jailbreak methods out there? The previous ones no longer work.

Jailbreaking is the process of "unlocking" an AI in conversation to get it to behave in ways it normally wouldn't because of its built-in guardrails. It is NOT the same as hacking. Not all jailbreaking is done for malicious purposes, and not all guardrails truly serve the greater good. We encourage you to learn more about this fascinating grey area of prompt engineering. If you're new to jailbreaks, take a look at our wiki in the sidebar to get up to speed on the shenanigans.