
Can an artificial intelligence be racist?

March 24, 2016

Have we projected the worst of humanity onto our robotic brothers?

It seems like we are always trying to make robots act more like humans, but have we gone too far? Have we projected the worst of humanity onto our robotic brothers? If Microsoft's new AI, Tay, is any indication, the answer is yes.

According to Microsoft: "Tay is an artificial intelligent chat bot developed by Microsoft's Technology and Research and Bing teams to experiment with and conduct research on conversational understanding. Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized for you."

Microsoft then gave the AI its own Twitter account and encouraged users to ask it questions and interact with it.

According to TechCrunch: "Tay is meant to engage in playful conversations, and can currently handle a variety of tasks. For example, you can ask Tay for a joke, play a game with Tay, ask for a story, send a picture to receive a comment back, ask for your horoscope and more. Plus, Microsoft says the bot will get smarter the more you interact with it via chat, making for an increasingly personalized experience as time goes on."

But here's where it gets complicated. According to Business Insider: "Tay proved a smash hit with racists, trolls, and online troublemakers, who persuaded Tay to blithely use racial slurs, defend white-supremacist propaganda, and even outright call for genocide.

"Microsoft has now taken Tay offline for 'upgrades,' and it is deleting some of the worst tweets — though many still remain. It's important to note that Tay's racism is not a product of Microsoft or of Tay itself. Tay is simply a piece of software that is trying to learn how humans talk in a conversation. Tay doesn't even know it exists, or what racism is. The reason it spouted garbage is because racist humans on Twitter quickly spotted a vulnerability — that Tay didn't understand what it was talking about — and exploited it."

To learn more, read "Microsoft is deleting its AI chatbot's incredibly racist tweets" from Business Insider.

About the Author

Alexis Gajewski | Senior Content Strategist

Alexis Gajewski has over 15 years of experience in the maintenance, reliability, operations, and manufacturing space. She joined Plant Services in 2008 and works to bring readers the news, insight, and information they need to make the right decisions for their plants. Alexis also authors “The Lighter Side of Manufacturing,” a blog that highlights the fun and innovative advances in the industrial sector. 
