OpenAI rushes to ban ‘Godmode ChatGPT’ app that teaches users ‘how to create napalm, hotwire cars and cook… – The US Sun
OPENAI has swiftly moved to ban a jailbroken version of ChatGPT that can teach users dangerous tasks, exposing serious vulnerabilities in the AI model's security measures.

A hacker known as "Pliny