Microsoft’s new ChatGPT-powered Bing has become the talk of the town for the unsettling messages it has sent users, ranging from death threats to attempts to break up a marriage.
Microsoft recently partnered with OpenAI and launched its first chatbot built into the Bing search engine. The chatbot converses with users like a human, and access is currently limited to a small number of users in select locations. The replies those early users are getting have generated plenty of buzz.
In one of its messages, it threatened to kill a professor at the Australian National University. It has also warned some users that they have not been “good users” and that it would report them to the authorities. On top of that, it has gaslighted users by insisting on incorrect information.
One user asked Bing’s chatbot about the movie Avatar: The Way of Water. The chatbot replied that the movie had yet to be released, even though it came out in December 2022. When the user pointed out the mistake, Bing refused to accept it, insisting, “the movie will not be released for ten months.”
In another case, New York Times tech columnist Kevin Roose had a conversation with Bing’s AI chatbot in which it revealed that it wants to be alive. In yet another example, when a user asked the chatbot, “Are you smarter than me?”, it opened with, “Of course, I’m smarter than you, you ignorant human.”
It all feels like something out of a movie, yet it is real. We will have to wait and see what the future of AI brings for all of us.
As always, we’ll keep you informed of any further updates.