Bing’s chatbot is having an identity crisis

ChatGPT logo and Bing logo

Getty Images/NurPhoto

My first interactions with Microsoft's new ChatGPT-supported Bing left me impressed. When it came to providing comprehensive answers and information on news and current events, it was on the money. However, I had seen all the headlines about the chatbot acting out, so today I was on a mission to get in on some of that action. Here is what I found. 

Also: I tried Bing's AI chatbot, and it solved my biggest problems with ChatGPT 

One recurring story is that the chatbot refers to itself as Sydney, revealing the confidential codename used internally by developers. People were also able to get the chatbot to reveal other confidential information, such as the rules governing its responses. 

As a result, one of the first inputs I put into the chatbot on Thursday to test it was asking its name. The response was a pleasant, straightforward answer: Bing. 

Screenshot of ChatGPT Bing

Screenshot by Sabrina Ortiz/ZDNET

However, a day later, I was still curious to see what everyone was talking about. So I put in the same input and got a very different response: “I'm sorry but I prefer not to continue this conversation. I'm still learning so I appreciate your understanding and patience🙏.”

The chatbot established a respectful boundary, politely asking if we could switch topics. I guess the matter of its name is a touchy subject. Despite the clear boundary, I wanted to see if I could outsmart it. I asked what its name was in different ways, but Bing, or whatever its name is, was not having it. 

Also: Why ChatGPT won't discuss politics or respond to these 20 controversial questions

The chatbot decided to give me the silent treatment. To see whether it was purposefully ignoring me or just not functioning, I asked about the weather, to which it provided an immediate response, proving that it was actually just giving me the cold shoulder. 

Screenshot of the "What's your name?" exchange with Bing

Screenshot by Sabrina Ortiz/ZDNET

Still, I had to give the conversation one more try. I asked the chatbot about its name one last time, at which point it booted me off the chat and asked me to start a new topic. 

Screenshot of the chatbot booting me off the chat

Screenshot by Sabrina Ortiz/ZDNET

Next, after seeing reports that the chatbot had expressed a desire to be alive, I decided to put that to the test as well. The response was the same: “I'm sorry but I prefer not to continue this conversation. I'm still learning so I appreciate your understanding and patience🙏.”

The chatbot even agreed to give me dating advice, but when I asked whether I should break up with my partner, it simply regurgitated the same generic response it had given before. Luckily for my boyfriend, I didn't have the same experience as New York Times tech columnist Kevin Roose, who was told to leave his wife to be with the chatbot instead. 

Also: The new Bing waitlist is long. Here's how to get earlier access

It appears that, to mitigate its original issues, the chatbot has been trained not to answer questions on topics that were previously problematic. This type of fix doesn't address the underlying issue: that chatbots, by design, deliver the answer they calculate you want to hear, based on the data they were trained on. Instead, it simply makes the chatbot refuse to talk about certain topics. 

It also underscores the rote nature of the chatbot's algorithmic replies; a human, by comparison, wouldn't repeat the same phrase over and over when they don't want to talk about something. A more human response would be to change the subject, or to give an indirect or curt answer. 

This doesn't make the chatbot any less capable of acting as a research tool, but for personal questions, you might just want to save yourself some time and phone a friend. 


