Technology

Anthropic dares you to jailbreak its new AI model

By Kyle Orland · February 3, 2025, 11:04 pm

Week-long public test follows 3,000+ hours of unsuccessful bug bounty claim attempts.