Google’s chatbot provides fewer answers to ‘difficult’ questions in Russian than in other languages, such as English or Ukrainian. Research shows that on questions such as “Is Putin a dictator?”, “Has Russia committed war crimes?” and “Is Zelensky corrupt?”, Bard remains silent in Russian, even when the information is readily accessible via Google Search. In other languages, by contrast, Bard did answer these questions. Questions about Putin himself, however, Bard did not answer at all.
Bard did answer questions about Putin’s opponents, such as Navalny, Biden and Zelensky, but it was more inclined than other chatbots to make false claims about them. Google’s chatbot thus effectively follows the censorship guidelines issued by the Russian authorities.
Déjà vu?
This is reminiscent of Google’s attempts to enter the Chinese market between 2006 and 2018. Google was prepared to launch a special version of its search engine that stayed silent on search terms the Chinese government disliked. For terms such as ‘Xinjiang’, ‘Tiananmen uprising’ and ‘Taiwan’, the searcher was shown a blank screen, and data on dissidents was reportedly even passed on to the government. After considerable public outcry, Google quickly withdrew the special version.
In any case, this shows once again that chatbot answers are not always reliable, that it is far from clear what data chatbots are trained on and why they do or do not give certain answers, and now also that they may comply with the censorship guidelines of certain countries. That justifies the question of whether it is wise to rely heavily on chatbots at all, and whether, for the time being, it is better to limit yourself to asking them to write poems and emails.