Microsoft’s Big AI Blunder and Why It Happened

By Angela Guess

Davey Alba recently wrote in Wired, “It was the unspooling of an unfortunate series of events involving artificial intelligence, human nature, and a very public experiment. Amid this dangerous combination of forces, determining exactly what went wrong is near-impossible. But the bottom line is simple: Microsoft has [an] awful lot of egg on its face after unleashing an online chat bot that Twitter users coaxed into regurgitating some seriously offensive language, including pointedly racist and sexist remarks.”

Alba goes on, “On Wednesday morning, the company unveiled Tay, a chat bot meant to mimic the verbal tics of a 19-year-old American girl, provided to the world at large via the messaging platforms Twitter, Kik and GroupMe. According to Microsoft, the aim was to ‘conduct research on conversational understanding.’ Company researchers programmed the bot to respond to messages in an ‘entertaining’ way, impersonating the audience it was created to target: 18- to 24-year-olds in the US. ‘Microsoft’s AI fam from the internet that’s got zero chill,’ Tay’s tagline read.”

Alba continues, “But it became apparent all too quickly that Tay could have used some chill. Hours into the chat bot’s launch, Tay was echoing Donald Trump’s stance on immigration, saying Hitler was right, and agreeing that 9/11 was probably an inside job. By the evening, Tay went offline, saying she was taking a break ‘to absorb it all.’ Some of her more hateful tweets started disappearing from the Internet, deleted by Microsoft itself. ‘We have taken Tay offline and are making adjustments,’ a Microsoft spokesperson wrote in an email to WIRED. The Internet, meanwhile, was puzzled. Why didn’t Microsoft create a plan for what to do when the conversation veered into politically tricky territory? Why not build filters for subjects like, well, Hitler? Why not program the bot so it wouldn’t take a stance on sensitive topics?”
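For readers wondering what such a filter might even look like, here is a minimal sketch of the simplest possible approach: intercept incoming messages and deflect when they touch a blocked topic. The term list, the deflection reply, and the `generate_reply` hook are all illustrative placeholders, not anything Microsoft actually built; production moderation systems go far beyond keyword matching.

```python
# Minimal sketch of a keyword-based topic filter, of the sort the Wired
# piece asks about. Everything here is hypothetical for illustration:
# the term list, the canned reply, and the reply-generator hook.

SENSITIVE_TERMS = {"hitler", "9/11", "genocide"}  # illustrative, not exhaustive

DEFLECTION = "I'd rather not get into that. Ask me something else!"


def filter_reply(user_message: str, generate_reply) -> str:
    """Return a canned deflection if the message touches a blocked topic;
    otherwise defer to the bot's normal reply generator."""
    lowered = user_message.lower()
    if any(term in lowered for term in SENSITIVE_TERMS):
        return DEFLECTION
    return generate_reply(user_message)


if __name__ == "__main__":
    # Stand-in for the bot's real response model.
    echo_bot = lambda msg: f"bot says: {msg}"
    print(filter_reply("what do you think about hitler?", echo_bot))  # deflected
    print(filter_reply("hello!", echo_bot))                           # passed through
```

Even this toy version hints at why the problem is hard: naive keyword lists miss misspellings, paraphrases, and bots being led to a stance one innocuous message at a time, which is part of how Tay was coaxed off the rails.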

Read more here.

Photo credit: Microsoft
