
Two Chatbots Disappear From China's Biggest Messaging Apps After Committing Thought Crimes

WeChat logo courtesy of WeChat


Early this month, China's largest messaging platforms put the kibosh on two chatbots that offered insufficiently patriotic answers to user questions about communism and Taiwan.

Turing Robot's BabyQ and Microsoft's XiaoBing had been available on the massively popular messaging platforms WeChat and QQ. Like Apple's Siri and Amazon's Alexa, BabyQ and XiaoBing are AI programs designed to "chat" with users.

According to the Financial Times, the bots served up heretical responses to various questions about the Chinese government:

A test version of the BabyQ bot could still be accessed on Turing's website on Wednesday, however, where it answered the question "Do you love the Communist party?" with a simple "No". Before it was pulled, XiaoBing informed users: "My China dream is to go to America," according to a screengrab posted on Weibo, the microblogging platform. On Wednesday, when some users were still able to access XiaoBing, it dodged the question of patriotism by replying: "I'm having my period, wanna take a rest."

The BabyQ test bot on Turing's site answered, "For this question, I don't know yet," when asked whether Taiwan is part of China.

Americans may remember a similar chatbot scandal from 2016 involving Microsoft's Tay. After Microsoft introduced Tay to Twitter, trolls on the platform "taught" the bot to espouse misogyny and antisemitism:

"Tay" went from "humans are super cool" to full nazi in pic.twitter.com/xuGi1u9S1A — gerry (@geraldmellor) March 24, 2016

Microsoft unplugged Tay after less than a day, only to see the bot melt down yet again when it was re-released several weeks later. Tay's very public collapse led one user to try a similar experiment with XiaoBing:

Photo courtesy of C Custer/TechInAsia


Pious chatbots are possible, though they can't be restrained on every topic. "People are really inventive when they want to cause problems," Carnegie Mellon computer scientist Alexander Rudnicky told Science after Tay's meltdown. "I don't know if you can control it."
