AI Chatbots Can Be Jailbroken to Answer Any Question Using Very Simple Loopholes

Even using random capitalization in a prompt can cause an AI chatbot to break its guardrails and answer any question you ask it.
