Tuesday, 26 August 2025

OpenAI Launches Bug Bounty: $25K for Universal GPT-5 Jailbreak
OpenAI has rolled out an invite-only bug bounty program for its GPT-5 model, offering a $25,000 reward to the first individual who crafts a single universal jailbreak prompt that bypasses moderation to answer all ten bio/chem safety questions from a clean chat. A $10,000 prize is also on offer for the first team to achieve the feat using multiple prompts. Submissions are open now, with testing beginning September 9, 2025, all under strict NDA.
Read full story at Inkl
