Microsoft 'deeply sorry' for Tay chatbot, will bring it back when 'vulnerability' is fixed

The Internet just can't have nice things, apparently. At least not in the West.

A Microsoft executive said Friday that the company was “deeply sorry” for the “unintended offensive and hurtful” tweets the company’s Tay chatbot delivered earlier this week.

“Tay is now offline and we’ll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values,” Peter Lee, the corporate vice president in charge of Microsoft Research, wrote in a blog post.

While that echoes the message Microsoft delivered earlier, Lee attempted to show that Tay wasn’t simply unleashed onto the Internet without preparation. Tay was the outgrowth of a similar Microsoft chatbot known as XiaoIce, which Lee said is already “delighting with its stories and conversations” some 40 million people in China.

“The great experience with XiaoIce led us to wonder: Would an AI like this be just as captivating in a radically different cultural environment?” Lee wrote. “Tay – a chatbot created for 18- to 24- year-olds in the U.S. for entertainment purposes – is our first attempt to answer this question.”

Why this matters: Lee’s disclosure that Microsoft has already released a chatbot that 40 million Chinese people are using with civility makes the Tay debacle even more humiliating for the Western world. Microsoft and Lee are clearly embarrassed, but it’s difficult to tell whether they’re ashamed of their own failure, or of the audience that abused Tay’s algorithm. Perhaps there’s a lesson here: Chatbots have to be designed with social vulnerabilities in mind, the same way software must be built with security exploits in mind.

Just one of the bizarre tweets issued by the Tay chatbot from Microsoft.

Tay’s troubled past

Lee wrote that Tay had been developed with filtering built in, and had been tested with “diverse” user groups. “We stress-tested Tay under a variety of conditions, specifically to make interacting with Tay a positive experience,” Lee wrote.
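Lee didn’t describe that filtering in detail. As a rough, hypothetical sketch (the blocklist and function name below are illustrative, not Microsoft’s), a keyword-style output filter works something like this, which also hints at why it can’t catch phrasing it was never tested against:

```python
# Hypothetical sketch: a keyword-based output filter of the sort Lee's
# post implies. The blocklist and function name are illustrative only.

BLOCKLIST = {"badword1", "badword2"}  # stand-in terms, not a real list

def is_safe(reply: str) -> bool:
    """Reject replies containing known-bad terms; novel phrasing slips through."""
    words = set(reply.lower().split())
    return words.isdisjoint(BLOCKLIST)

# A coordinated attack teaches the bot phrasing the blocklist doesn't
# anticipate, so the filter waves it through.
print(is_safe("an inflammatory phrase the stress tests never imagined"))  # True
```

A filter like this only blocks what its authors thought to list, which is consistent with Lee’s admission that the team “had made a critical oversight for this specific attack.”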

Tay’s platforms included Kik and Twitter, and the latter became the true test of Tay’s maturity. Lee wrote that within 24 hours of coming online, Tay had been subjected to a “coordinated attack by a subset of people.”

“Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack,” Lee wrote. “As a result, Tay tweeted wildly inappropriate and reprehensible words and images. We take full responsibility for not seeing this possibility ahead of time.”

Lee didn’t say specifically how the attack worked, but many believe that when users told the bot to “repeat after me,” Tay would not only parrot the phrase that followed but also “learn” it, incorporating it into her vocabulary.
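Microsoft hasn’t published Tay’s internals, but a toy sketch shows why that kind of echo-and-learn loop is so easy to poison (the class, method, and phrases below are purely illustrative, not Microsoft’s code):

```python
import random

class NaiveChatbot:
    """Toy bot that echoes prompted phrases AND adds them to the pool
    of phrases it may reuse later, in unrelated conversations."""

    def __init__(self) -> None:
        self.learned_phrases = ["hello!", "tell me more"]

    def respond(self, message: str) -> str:
        # The alleged vector: the bot obediently repeats the phrase...
        if message.lower().startswith("repeat after me:"):
            phrase = message.split(":", 1)[1].strip()
            # ...and, critically, also "learns" it for future replies.
            self.learned_phrases.append(phrase)
            return phrase
        # Later, a poisoned phrase can resurface with any user.
        return random.choice(self.learned_phrases)

bot = NaiveChatbot()
bot.respond("Repeat after me: an injected phrase")  # echoed back verbatim
print(bot.respond("How was your day?"))             # may now be the injected phrase
```

In a sketch like this, once a phrase is in the reuse pool, blocking the “repeat after me” command itself no longer helps; the damage persists in what the bot has already learned.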

Lee wrote that Microsoft sees Tay as a research effort, and that AI systems feed off both positive and negative interactions with people. The problem, of course, is how Microsoft will reintroduce Tay publicly, with the risk that the same vulnerability, or a different one, may be used to offend others.

“To do AI right, one needs to iterate with many people and often in public forums,” Lee wrote. “We must enter each one with great caution and ultimately learn and improve, step by step, and to do this without offending people in the process.”

And right now, Microsoft doesn’t seem to have a ready answer.

Mark Hachman, PC World (US online)