The new chatbot appears to avoid discussion of politics, religion and race entirely and lives only on the Kik messaging application.
Microsoft is taking another stab at building a chatbot, several months after Tay, an earlier attempt, was taken offline when some internet users convinced it to spout racist and sexist comments.
The company’s second try, Zo, lives on the Kik messaging application.
Spotted over the weekend by a Microsoft-tracking blog, Zo appears to avoid discussion of politics, religion and race entirely. It also has a narrower release than Tay, whose place on the public Twitter platform put its meltdown in view of the entire internet.
Microsoft launched Tay, a millennial-imitating chatbot, in March. The bot, powered by machine-learning algorithms, was designed to…