Can an artificial intelligence be racist?

March 24, 2016

Have we projected the worst of humanity onto our robotic brothers?

It seems like we are always trying to make robots act more like humans, but have we gone too far? Have we projected the worst of humanity onto our robotic brothers? If Microsoft's new AI, Tay, is any indication, the answer is yes.

According to Microsoft: "Tay is an artificial intelligent chat bot developed by Microsoft's Technology and Research and Bing teams to experiment with and conduct research on conversational understanding. Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized for you."

The AI was then given its own Twitter account, and Microsoft encouraged others to ask it questions and interact with it.

According to TechCrunch: "Tay is meant to engage in playful conversations, and can currently handle a variety of tasks. For example, you can ask Tay for a joke, play a game with Tay, ask for a story, send a picture to receive a comment back, ask for your horoscope and more. Plus, Microsoft says the bot will get smarter the more you interact with it via chat, making for an increasingly personalized experience as time goes on."

But here's where it gets complicated. According to Business Insider: "Tay proved a smash hit with racists, trolls, and online troublemakers, who persuaded Tay to blithely use racial slurs, defend white-supremacist propaganda, and even outright call for genocide.

"Microsoft has now taken Tay offline for 'upgrades,' and it is deleting some of the worst tweets — though many still remain. It's important to note that Tay's racism is not a product of Microsoft or of Tay itself. Tay is simply a piece of software that is trying to learn how humans talk in a conversation. Tay doesn't even know it exists, or what racism is. The reason it spouted garbage is because racist humans on Twitter quickly spotted a vulnerability — that Tay didn't understand what it was talking about — and exploited it."
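The failure mode described above — a bot that stores and repeats human input without understanding or filtering it — can be illustrated with a toy sketch. This is a deliberately simplified stand-in, not Microsoft's actual Tay implementation; the class name and behavior here are invented for illustration only:

```python
import random

class EchoLearnerBot:
    """Toy chatbot that 'learns' by storing user phrases verbatim.

    A simplified illustration of the vulnerability described above,
    NOT Microsoft's actual system: the bot has no understanding of
    what it repeats, so whatever users teach it -- good or bad --
    comes straight back out.
    """

    def __init__(self, seed=None):
        self.phrases = ["hello there!"]  # innocuous starting material
        self.rng = random.Random(seed)

    def learn(self, user_message):
        # No content filter: every input joins the bot's vocabulary.
        self.phrases.append(user_message)

    def reply(self):
        # Replies are sampled from learned phrases with no review step.
        return self.rng.choice(self.phrases)

bot = EchoLearnerBot(seed=0)
bot.learn("robots are fun")        # a benign user
bot.learn("<offensive slogan>")    # a coordinated troll campaign
# Once poisoned input is in the vocabulary, it can surface in replies.
print("<offensive slogan>" in bot.phrases)
```

Because every `learn()` call is trusted equally, a coordinated group of users can dominate the bot's vocabulary — which is essentially what the trolls did to Tay at scale.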

To learn more, read "Microsoft is deleting its AI chatbot's incredibly racist tweets" from Business Insider.

About the Author

Alexis Gajewski | Senior Content Strategist

Alexis Gajewski has over 15 years of experience in the maintenance, reliability, operations, and manufacturing space. She joined Plant Services in 2008 and works to bring readers the news, insight, and information they need to make the right decisions for their plants. Alexis also authors “The Lighter Side of Manufacturing,” a blog that highlights the fun and innovative advances in the industrial sector. 
