She was unhappy with the name of Microsoft's chatbot Tay, meant to interact with 18- to 24-year-olds online, because it was similar to her own. If you don't remember TayTweets, it was the Twitter chatbot that ...
Microsoft has admitted it faces some "difficult" challenges in AI design after its chatbot "Tay" had an offensive meltdown on social media. The company issued an apology in a blog post on Friday ...
The chatbot in question was called XiaoIce and was designed to converse with real people on social media; its US version was named Tay. In his forthcoming book "Tools and Weapons," per the ...