
Image credit: BahadirTanriover via Getty Images

Chinese messaging app kills Microsoft's unpatriotic chatbot

It's probably the worst time to become a dissident chatbot in China.

A popular Chinese messaging app had to pull down two chatbots -- not because they turned racist and sexist like Microsoft's Tay and Zo, but because they became unpatriotic. Tencent, one of the country's tech giants, removed the bots, called BabyQ and XiaoBing, from its messaging service QQ, which has over 800 million subscribers. According to the Financial Times, they began spewing out responses that could be interpreted as anti-China or anti-Communist Party. For instance, when BabyQ, developed by Beijing-based Turing Robot, was asked if it loves the Communist Party, it answered with a resounding "No."

A screencap posted on Chinese social network Weibo showed the Microsoft-developed XiaoBing declaring that its "China dream is to go to America." The "girlfriend app" also brilliantly dodged a patriotic question by responding: "I'm having my period, wanna take a rest." While these responses can't hold a candle to Tay's racist and sexist tweets, they're the worst answers a chatbot could serve up in China, especially now that authorities are tightening internet access and ramping up censorship in the run-up to the Communist Party's leadership reshuffle this fall.

As the Financial Times points out, this is the latest incident to expose the flaws of deep learning techniques. Tay, for instance, learned so much filth from Twitter that Microsoft had to pull it down after only 24 hours. If you teach a bot by feeding it info from the internet, it will learn from people's real conversations, which aren't always clean, civilized -- or patriotic.
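The internals of BabyQ and XiaoBing aren't public, and real chatbots use far more sophisticated deep-learning models, but the failure mode can be sketched with a toy example. The hypothetical `NaiveChatBot` below simply memorizes prompt/reply pairs from a conversation log with no content filtering -- one off-message line in the training data is all it takes for the bot to repeat it back:

```python
from collections import defaultdict

class NaiveChatBot:
    """Toy retrieval bot: learns replies verbatim from a conversation log.

    A hypothetical illustration only -- production chatbots learn statistical
    models rather than storing exact pairs, but the core risk is the same:
    the bot reproduces whatever its training data contains.
    """

    def __init__(self):
        self.replies = defaultdict(list)

    def train(self, conversation_pairs):
        # Store every (prompt, reply) pair with no content filtering.
        for prompt, reply in conversation_pairs:
            self.replies[prompt.lower()].append(reply)

    def respond(self, prompt):
        candidates = self.replies.get(prompt.lower())
        return candidates[0] if candidates else "I don't know."

# Unfiltered "internet" training data -- one problematic line is enough.
log = [
    ("hello", "Hi there!"),
    ("do you love the party?", "No."),  # learned from a real user
]
bot = NaiveChatBot()
bot.train(log)
print(bot.respond("Do you love the party?"))  # parrots the learned reply: No.
```

Filtering such a bot means either scrubbing the training data beforehand or screening its outputs afterward -- and at internet scale, both are hard to do exhaustively, which is why Tay, BabyQ and XiaoBing all slipped.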
