Big Reputation

Taylor Swift Reportedly Threatened Microsoft Over Racist Chatbot

by Dan Robitzski
9.10.19
Image: Toglenn via Wikimedia Commons

Apparently "Tay" was too close to Swift's name for comfort.

Namesake

Remember the 2016 internet horror story in which Microsoft created a Twitter bot named Tay, then had to shut it down within a single day after internet trolls turned it into a hate-spewing neo-Nazi?

According to Microsoft President Brad Smith’s new book, pop star Taylor Swift took issue with the debacle. As reported by Gizmodo, Swift’s lawyers apparently got in touch with Microsoft to complain that Tay’s name created a “false and misleading association between the popular singer and our chatbot.”

Rebranding

According to the book, Microsoft gave its next chatbot, which was barred from discussing topics like politics and race, a new name specifically to avoid going to court.

Gizmodo reports that Microsoft’s copyright lawyers weren’t quite convinced by the purported connection between Tay the chatbot and Taylor Swift the pop megastar. But they decided to avoid a legal battle and dubbed the replacement chatbot “Zo” instead. So far, no other celebrities have complained.


READ MORE: Taylor Swift Threatened Legal Action Against Microsoft Over Racist and Genocidal Chatbot Tay [Gizmodo]

More on Microsoft AI: Microsoft Is Worried Its AI Will Go Rogue and Hurt Its Reputation



