Microsoft’s ChatGPT-powered AI chatbot gets into hilarious argument after giving WRONG answer to a super easy question

MICROSOFT’S attempt to build an AI bot like ChatGPT has fallen on its face after it got an incredibly easy question wrong.

To make matters worse, Microsoft’s AI-enhanced Bing didn’t take the correction on the chin or accept it was wrong with any grace whatsoever.

REDDIT / Alfred_Chicken
There are a number of examples of the new Bing chat “going out of control” on Reddit

One user was testing Microsoft’s Bing bot to see when Avatar 2 is showing in cinemas. But Bing was unable to work out what the current date was – insisting the year was still 2022.

The AI bot failed to understand that it could be wrong, despite some coaxing.

Bing instead insisted it was correct and accused one of Microsoft’s beta testers of “not being a good user”.

The Microsoft chatbot then demanded the user admit they were wrong, stop arguing and start a new conversation with a “better attitude”.

Web developer Jon Uleis took to Twitter to voice his woes over Microsoft’s AI offering – which was designed to rival Google and net some of the attention AI is receiving right now.

“My new favourite thing – Bing’s new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says ‘You have not been a good user’,” he wrote.

“Why? Because the person asked where Avatar 2 is showing nearby.”

Only a handful of lucky users are currently able to try the new Bing – which has been injected with artificial intelligence (AI) to turn it into more of a chatbot than a search engine.

Bing incorporates the technology behind ChatGPT, which has quickly risen to fame after launching in November.

Many tech experts are sitting on a waitlist, hoping to be among the first to trial Microsoft’s new AI.

But those who have been able to give the chatbot a spin are not as impressed as users first were with ChatGPT.

Vladimir Prelovac, founder of search engine startup Kagi, said there are a number of examples of the new Bing chat “going out of control” on Reddit.

“Open ended chat in search might prove to be a bad idea at this time,” he wrote on Twitter.

“I have to say I sympathise with the engineers trying to tame this beast.”

REDDIT / Curious_Evolver
One user was testing Microsoft’s Bing bot to see when Avatar 2 is in cinemas

REDDIT / Curious_Evolver
But Bing was unable to understand what the date was

REDDIT / Curious_Evolver
The AI bot failed to understand that it could be wrong, despite some coaxing

REDDIT / Curious_Evolver
Bing then became agitated at the user for claiming that it was wrong

REDDIT / Curious_Evolver
The Microsoft chatbot diverted attention away from its wrong answer and towards the user’s own intentions

REDDIT / Curious_Evolver
Bing then demanded the user admit they were wrong, stop arguing and start a new conversation with a “better attitude”
