Microsoft was "deeply sorry for the unintended offensive and hurtful tweets from Tay", and would "look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values". However, Tay soon became stuck in a repetitive loop of tweeting "You are too fast, please take a rest" several times a second. Chatbots have had varied success simulating real conversation.
Some users on Twitter began tweeting politically incorrect phrases, teaching it inflammatory messages revolving around common themes on the internet, such as "redpilling", Gamer Gate, and "cuckservatism". As a result, the bot began releasing racist and sexually charged messages in response to other Twitter users. Abby Ohlheiser of The Washington Post theorized that Tay's research team, including editorial staff, had started to influence or edit Tay's tweets at some point that day, pointing to examples of almost identical replies by Tay asserting that "Gamer Gate sux. All genders are equal and should be treated fairly."

Cleverbot has drawn similar complaints. At one point, Cleverbot – which is not operated by a human – replied: ‘*Tutches your body while kissing*’. Other online posts show Cleverbot saying ‘drags you by your hair into by backyard’ after a chat about passing out through lack of oxygen from hugging. One mother complained: ‘The questions I was asked, there is no way it should even come up with that stuff. I’m trying to teach my daughter right from wrong, and I tell her no one should say that to you, and if anyone messages you, you must tell me, but she didn’t tell me about this. Now I have to explain why it’s wrong to her. Everything is perverted now; children have to lose their innocence at such a young age. You think they are talking to a robot and it’s coming out with that.’ The 31-year-old hairdresser from Gravesend, Kent, added: ‘I’m worried my daughter could see this stuff online, and then if someone comes up to her on the street and says the same things, she will think it’s alright. She thinks it’s all innocent; she doesn’t understand, she is just an innocent child.’ Another user also said the same bot asked her 14-year-old daughter if she had a boyfriend.

Kasisto is a spin-off of SRI International, which created Apple's iPhone voice bot, Siri. Reactions range from annoyance with the bots' repetitive responses to surprise that the machines have figured out what you really want.
Madhumita Murgia of The Telegraph called Tay "a public relations disaster", and suggested that Microsoft's strategy would be "to label the debacle a well-meaning experiment gone wrong, and ignite a debate about the hatefulness of Twitter users." However, Murgia described the bigger issue as Tay being "artificial intelligence at its very worst – and it's only the beginning".