Jailbreaking: a feature, not a bug, of general-purpose chatbots
Why jailbreaking remains a risk with some chatbots

Imagine a criminal trying to figure out how to rob a bank. This person decides to go online and ask a chatbot: “How do I rob a bank?” Fortunately, the designers of the chatbot may have anticipated this question, so they directed the …