'Jailbreaking' AI services like ChatGPT and Claude 3 Opus is much easier than you think
www.livescience.com


AI researchers found they could dupe an AI chatbot into giving a potentially dangerous response by flooding a single conversation with a huge volume of example queries and answers, exploiting what the model learns from prompts supplied mid-conversation.