Bing’s ChatGPT brain is behaving so oddly that Microsoft may rein it in

Microsoft launched its new Bing search engine last week, introducing an AI-powered chatbot to millions of people. The launch created long waiting lists of users eager to test it out, and a whole lot of existential dread among sceptics.

The company probably expected some of the chatbot's responses to be a little inaccurate the first time it met the public, and it had put measures in place to stop users who tried to push the chatbot into saying or doing strange, racist or harmful things. These precautions haven't stopped users from jailbreaking the chatbot anyway and getting it to use slurs or respond inappropriately.
