But by that time, the damage had already been done, and the company has since apologized in a blog post entitled “Learning from Tay’s introduction.” Microsoft Research Corporate VP Peter Lee explained that this isn’t the company’s first foray into a socially inclined AI chatbot, and pointed to Microsoft’s work with the XiaoIce chatbot, which is used by over 40 million people in China. Microsoft even went so far as to implement a number of filters and conduct extensive user studies to ensure that Tay would be ready for primetime. This “parrot” mentality, Tay’s tendency to repeat whatever users fed it, is the reason why Tay went off message, calling President Barack Obama a monkey, embracing neo-Nazi rhetoric, and coming on to users with the promise of cyber sex.
“In that sense, the challenges are just as much social as they are technical.” Although Lee doesn’t specify which exploit was used to turn Tay into a hate monger, he says that Microsoft will work to the best of its ability to “limit technical exploits” that could cause similar embarrassments in the future.
Microsoft’s Tay is an artificially intelligent chatbot developed by the company’s Technology and Research and Bing teams.
According to Microsoft, Tay targets 18- to 24-year-olds in the US, since this is the dominant user group on mobile devices, but you can chat with Tay regardless of your age or location if you don’t mind jokes aimed at US millennials.
You can ask Tay for a joke, play a game with it, ask for a story, send it a picture to receive a comment back, ask for your horoscope, and more.
“As a result, Tay tweeted wildly inappropriate and reprehensible words and images. We take full responsibility for not seeing this possibility ahead of time,” Lee wrote.
Microsoft shocked us all earlier this week when it released its Tay chatbot into the world of social media.
Tay, which is patterned after a typical millennial female between the ages of 18 and 24, seemed innocent enough when it signed on to Twitter. However, it didn’t take long for nefarious Twitter users to poison the well by exploiting Tay’s penchant for repeating statements fed to it.
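To make that mechanism concrete, here is a minimal, hypothetical sketch in Python, not Microsoft’s actual code, of how an echo-style “repeat after me” handler can be abused when nothing screens the payload, and how even a crude blocklist filter of the kind Lee describes changes the outcome. The handle_message function and BLOCKLIST names are illustrative assumptions.

```python
# Hypothetical sketch (not Microsoft's actual code): a naive "repeat after me"
# handler echoes whatever a user submits, which is the opening trolls exploited
# with Tay; a simple blocklist check illustrates one basic mitigation.

BLOCKLIST = {"nazi", "slur_example"}  # placeholder terms; a real filter would be far broader


def handle_message(text: str) -> str:
    """Echo-style handler modeled on the 'parrot' behavior described above."""
    prefix = "repeat after me "
    if text.lower().startswith(prefix):
        payload = text[len(prefix):]
        # Screen the payload before echoing it back verbatim.
        if any(word in payload.lower() for word in BLOCKLIST):
            return "I'd rather not repeat that."
        return payload
    return "Tell me something else!"


if __name__ == "__main__":
    print(handle_message("repeat after me hello world"))      # echoed back
    print(handle_message("repeat after me nazi propaganda"))  # filtered
```

The point of the sketch is only that an unfiltered echo path hands control of the bot’s output to whoever messages it; the filtering shown here is deliberately simplistic.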
Tay may use the data that you provide to search on your behalf.