
Taylor Swift threatened to sue Microsoft over its racist Tay chatbot



Taylor Swift's lawyers threatened to sue Microsoft over the company's Tay chatbot. The Guardian reports that a new book by Microsoft President Brad Smith reveals that attorneys for Taylor Swift were not happy with the company using the name Tay for its chatbot. Microsoft's chatbot was originally designed to hold conversations with teens via social media networks, but Twitter users turned it into a racist chatbot in less than a day.

Smith checked his emails during a vacation and found out that Taylor Swift's team demanded a name change for the Tay chatbot. "An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: 'We represent Taylor Swift, on whose behalf this is addressed to you.'" The attorneys argued that "the use of the name Tay created a false and misleading association between the popular singer and our chatbot, and that it violated federal and state laws," Smith writes in Tools and Weapons, a new book on how technology both strengthens and threatens us.

It's not clear exactly when Taylor Swift's lawyers contacted Microsoft about the Tay name, but they probably weren't happy about the kind of misogynistic, racist, and Hitler-promoting junk it posted on Twitter. Microsoft quickly apologized for the offensive material published by its AI bot and pulled the plug on Tay after less than 24 hours.
