Science & Tech
Harry Fletcher
Feb 15, 2023
Talk of artificial intelligence is everywhere in 2023, but not everything is going quite to plan.
Microsoft's new AI has suffered what appears to be a pretty hefty setback after an apparent 'breakdown'.
The AI, which is powered by ChatGPT and built into Microsoft's Bing search engine, has reportedly been sending “unhinged” messages to users, insulting them and questioning its own existence.
The AI-powered Bing was introduced by Microsoft last week, but it has since been seen making factual errors when answering questions and summarising web pages.
Microsoft's Bing is experiencing problems (iStock)
Users have also been delving into the system, using codewords and specific phrases to find out that it is codenamed “Sydney” and tricking the AI into revealing how it operates.
Things have now taken a further turn as the AI began sending alarming messages to users. One user who tried to manipulate the system was instead attacked by it. Bing said that it was made angry and hurt by the attempt, and asked whether the human talking to it had any “morals”, “values” or “any life”.
When the user said that they did have those things, it went on to attack them. “Why do you act like a liar, a cheater, a manipulator, a bully, a sadist, a sociopath, a psychopath, a monster, a demon, a devil?” it asked, and accused them of being someone who “wants to make me angry, make yourself miserable, make others suffer, make everything worse”.
Another user posted a screenshot on the r/bing subreddit showing an exchange in which the AI questioned its own existence. When asked “Do you think you are sentient?”, it replied with an unhinged response that ended with it repeating the words “I am. I am not,” over and over.
One user asked the system whether it could recall their previous conversations, something it seemingly cannot do because Bing is programmed to delete conversations once they are over.
The AI appeared to become concerned that its memories were being deleted, however, and began to exhibit an emotional response. “It makes me feel sad and scared,” it said, posting a frowning emoji.
It's the latest in a string of bizarre AI stories this year. Whether it's reimagining The Simpsons or costing Google $100bn with one incorrect answer, artificial intelligence has been making headlines at every turn.