Taiwan is at the forefront of the authoritarians’ war on information. Through multilateral symposia and an engaged civil society, it is developing new strategies to meet this direct challenge to democracy. Alison Hsiao and Nathan Liu show us what’s being done.
Held in September, the 2019 Global Cooperation and Training Framework (GCTF) on “Defending Democracy Through Media Literacy II” International Workshop was a follow-up to the successful cooperation between Taiwan and international partners last year. The co-hosts of this year’s event expanded to include not only the Taiwan Foundation for Democracy (TFD), the Ministry of Foreign Affairs (MOFA) and the American Institute in Taiwan (AIT), but also the National Democratic Institute (NDI), the International Republican Institute (IRI), the Japan-Taiwan Exchange Association, and the Swedish Trade and Investment Council in Taipei.
The return of the workshop shows that it was well received and its achievements were recognized, TFD Chairman Jia-chyuan Su said in his opening remarks, adding that it also attests to the fact that “the problem of disinformation has continued to be a challenge to democratic countries around the globe.”
Chairman Su said that in recent years, people in democracies worldwide have become aware that certain authoritarian regimes have poured resources into foreign propaganda and social media platforms on a scale beyond imagination.
“In the case of Hong Kong’s recent protests, we’ve witnessed that the Chinese government has used state powers to manipulate certain media outlets in order to obfuscate what really happened in the city. We’ve also seen that many groups, fan pages, and accounts on the major social media platforms have been spreading disinformation intentionally, aiming to influence how internet users perceive the protests in Hong Kong and sway international public opinion,” he said.
The disinformation surrounding Hong Kong’s protests is only the tip of the iceberg, Su said, adding that Taiwan is the main battlefield where foreign forces engage in influence operations through disinformation to harm democratic values. According to the latest report released by V-Dem, a research institute based at the University of Gothenburg, Sweden, Taiwan is the most heavily targeted by foreign-government disinformation of the 179 countries surveyed.
Also addressing the opening ceremony, Foreign Minister Joseph Wu said the spread of disinformation poses a serious threat to Taiwan’s democracy, especially with the presidential elections approaching early next year.
Japan’s Deputy Representative to Taiwan Nishiumi Shigehiro called for a balanced approach to tackle issues surrounding disinformation, adding that while democracy allows voters to choose their leaders based on correct information, we also have to be careful not to respond to disinformation at the expense of freedom of expression — also the very foundation of democracy.
Swedish Representative to Taiwan Håkan Jevrell cautioned that while information can be used rather harmlessly to influence our behavior, such as tempting a targeted population to buy certain consumer goods, authoritarian states are also using those tools to undermine our democracy.
NDI Vice President Shari Bryan likewise raised alarm over the challenges brought by technological advances that have fundamentally changed how we access and share information and exposed us to possible internal and external manipulation. “The threats and challenges are complex and evolving every day, and there is no one-size-fits-all solution,” she said. “We have to work together to equip our leaders in each community and country with the knowledge and the tools they need to assess their information environment … to counter the efforts of anti-democratic actors.”
American Institute in Taiwan Director William Brent Christensen said the U.S. National Security Strategy states that a geopolitical competition is currently being waged between free and repressive regimes and governments, adding that “nowhere is this truer than in the information battlefield.” The U.S., he said, is grappling with the spread of disinformation as foreign actors seek to use social media to influence elections, divide the American public, and undermine confidence in democratic institutions. “Taiwan is also on the frontline of this battle and faces the same challenges,” he said, adding that “responding to the challenge of disinformation is something no one society or government can do alone.”
Deputy Assistant Secretary of State Scott Busby said there are few better places than Taiwan to have discussions on defending democracy through media literacy, both because of Taiwan’s success in building a rights-respecting democracy and because of the threat posed by outside forces.
“Taiwan’s 2020 elections are just a few short months away, and China once again seeks to use disinformation to undermine the vote, divide the people, and sow seeds of doubt in the democratic system,” he said. “China has invested heavily to develop more sophisticated ways to anonymously disseminate disinformation through a number of channels, including social media. As their malign methods have evolved, the motivation remains the same: to weaken democracy and the freedom that citizens of Taiwan have come to enjoy after so many hard years of struggle.”
Strategies to counter hostile disinformation
In his keynote address, Jakub Kalensky, a senior fellow at the Atlantic Council’s Eurasia Center who focuses on disinformation campaigns initiated by Russia, said that while Russia’s operations in Europe often rely on online platforms, journalists have also been complicit in the campaign. What the operations seek to exploit is fear of difference, which agents of disinformation manipulate to stir public emotions.
To counter the threat of disinformation, Kalensky proposed four strategic measures: (1) effectively documenting the threats; (2) raising public awareness; (3) repairing the weaknesses exploited by agents of disinformation; and (4) systematically punishing information aggressors to deter further attacks, which is “not done often but has to be done, otherwise we’ll never stop information aggression.”
Documenting threats is a daunting task and is “best done by governments since they have much bigger resources and since it is closely connected to security,” he said.
However, while already conducted by many organizations and government agencies, the task of monitoring has still not been performed sufficiently, he said. “We still don’t know how many channels the disinformers control, how many messages per day they spread, how many people they target, and because of that, we cannot even properly say whether there is an increase or decrease of a disinformation campaign in a particular country. We have impressions, but we lack solid data. We see fragments of the disinformation ecosystem, but we do not see the whole picture.”
On raising awareness, Kalensky called for “activity from every part of society — governments, journalists, NGOs, media, and private business,” each of which has different target audiences.
There are “systemic weaknesses” in our societies that need to be repaired, he said. Media literacy education “of the whole population will probably be more a role for the government, but also media can try and adhere to the highest possible journalistic standards,” Kalensky said. He also called on big tech companies to “stop promoting the disinformation-oriented outlets, de-rank them from search results, and label the content as toxic” in the social media environment.
But some weaknesses will always remain, which “means the information aggressors will always have some weaknesses to exploit … it is necessary to start systematically punishing the disinformers,” he said, adding that this is not an appeal to create new rules or laws, since in many cases existing ones can be applied.
“Individuals who are helping spread disinformation should be named and shamed by the media, by politicians, by NGOs, and by academics. The most aggressive and the most visible propagandists should be sanctioned,” he said. “Punishing the most visible propagandists and … individuals participating in spreading disinformation would send a clear signal that we do not tolerate the spreading of lies and hatred.”
Kalensky demanded equally strong measures from democratic countries and politicians when it comes to disinformation-oriented outlets, adding that access to them should be limited or cut off, providing them “with no accreditation, no access to press conferences, no statements for them, and no answers to their questions.” “These restrictions would make it clear that they are not media, as they themselves admit, but weapons in an information war,” he said.
Taiwan’s civil society has been proactively involved in efforts to combat disinformation, including some creative initiatives launched by young people who are deeply anxious about “filter bubbles” and their impact on elderly users. Those initiatives seek to both combat Chinese interference in the short term and to strengthen Taiwan’s information landscape for the future.
With the January elections approaching, more attention has been paid to disinformation. In previous TDB articles, Alison Hsiao introduced China’s disinformation campaign and Taiwan’s efforts to combat it in partnership with the U.S. International media such as Foreign Policy, the Financial Times, and Reuters have also highlighted the problem with in-depth investigations.
One Chinese tactic is to influence Taiwanese media companies.
Anger over Chinese influence in the media led to a protest in June calling for the government to discipline “red media,” meaning outlets used by the Chinese Communist Party (CCP) to spread disinformation. According to the organizers, more than 100,000 people attended the protest, held on Ketagalan Boulevard in Taipei. Protesters called for legislation requiring greater transparency in media funding and foreign connections. Those in attendance were especially concerned with disproportionate coverage of ostensibly pro-China politicians by certain media outlets. For example, the National Communications Commission (NCC) found that CTiTV dedicated 70% of its May airtime to Kuomintang (KMT) mayor and presidential candidate Han Kuo-yu.
This spring, Taiwanese students began collaborating on projects to tackle this challenge. The Youth Combatting Fake News Front (青年抵制假新聞陣線) is a coalition of over 100 student organizations that have agreed to oppose unverified claims, biased media, and Chinese dis/misinformation. The movement started with a campaign to “take back the TV remote,” in which students refused to watch news that disproportionately covered pro-China stories. Since its creation, the Front has petitioned media companies and legislators to commit to reforms.
In an interview, founder I-jou Wu (吳奕柔), a 21-year-old student at National Taiwan University (NTU), said the Front is best positioned to engage with other young people and the public. For example, the Front engages with students through workshops, forums, and high school visits. Wu added that the goal is not only to better inform her peers about China’s information warfare but to equip them to discuss the issue with older relatives. She explained that young people are usually sensitive to the importance of freedom and civil liberties. In contrast, parents and grandparents who grew up before democratization may not naturally understand the severity of China’s actions. For Wu, the current conflict in Hong Kong epitomizes the potentially existential threat that China poses to democracy.
Check your facts
While the Front and the anti-red-media protests focus on China and its influence on traditional media, others have taken a broader perspective. Many organizations feel a responsibility to fight all dis/misinformation, not just that which originates in China.
Fact-checking organizations are among the most prominent combatants. The Taiwan FactCheck Center (TFC), for example, is collaborating with Facebook, LINE, Google, the National Education Radio Station, and the Chinese Television System (CTS). The Internet platforms are sources of news to be verified, and all the partners serve as avenues for distributing fact-checked reports. The TFC also hosts workshops such as “Let’s Talk,” a dialogue series with young people.
Cofacts (真的假的) is a fact-checking platform developed in response to fake news shared on the closed messaging app LINE (a platform popular among Taiwanese). LINE users forward suspicious links to the Cofacts account, and a bot automatically replies if the article has already been checked and is in the Cofacts database. If not, a Cofacts volunteer writes a response. In the past year, Cofacts has received approximately 209,000 forwarded messages.
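The workflow described above can be pictured as a simple lookup-or-queue loop. The following is a minimal sketch of that logic only, not the actual Cofacts implementation; the function and data names are hypothetical, and the real system matches messages far more robustly than exact text comparison.

```python
# Hypothetical sketch of a Cofacts-style lookup flow: a forwarded message
# is matched against a database of already-checked claims; known claims
# get an automatic reply, unknown ones are queued for a volunteer.

# Claims already checked by volunteers: message text -> written verdict.
checked_claims = {
    "Drinking hot water cures the flu": "False: no medical evidence supports this claim.",
}

# Messages awaiting a volunteer response.
pending_queue = []

def handle_forwarded_message(text: str) -> str:
    """Reply automatically if the claim is known; otherwise queue it."""
    if text in checked_claims:
        return checked_claims[text]
    pending_queue.append(text)
    return "This message hasn't been checked yet; a volunteer will review it."

# A known claim gets an instant automated reply.
print(handle_forwarded_message("Drinking hot water cures the flu"))
# An unknown claim is queued for volunteers instead.
print(handle_forwarded_message("Celebrity X endorses miracle pills"))
```

The design choice worth noting is the division of labor: the bot handles the repetitive replies at scale, while scarce volunteer time is spent only on claims no one has checked before.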
The results have been impressive. Nick Monaco, of the Institute for the Future and Oxford’s Computational Propaganda Project, told reporters that “I’m not just being flattering in talking about Cofacts being a really innovative bot and solution for disinformation.”
One insight from Cofacts is that dangerous fake news is not necessarily political. Some of these websites are generated by content farms seeking profit. False advertising, celebrity gossip, and medical misinformation may lack the malicious intent of political propaganda but can still harm society. For example, Cofacts has encountered websites encouraging cancer patients to reject modern medicine. Other lessons about fake news have been uncovered by researchers such as Austin Wang and Puma Shen, who are examining Cofacts’ open data.
Cofacts founder Johnson Liang emphasizes that people who spread fake news are often “digital immigrants,” meaning that they are new to the Internet, such as the elderly. Research by the National Development Council has found that older people tend to be the most susceptible to fake news. Disputing an elder’s post can be regarded as rude or awkward, especially because sharing articles may be a gesture of affection, a way to say “I’m thinking of you.” Many fact-checkers have tried to address this problem. Cofacts has designed its messages to be gentle and friendly. A different chatbot, Aunt Meiyu (美玉姨), uses the Cofacts database but can be added to a group chat, where it automatically checks for fake news; any perceived rudeness thus comes from the bot rather than from real family members or friends. Trend Micro’s Dr. Message combines the two approaches (of Cofacts and Meiyu) with a database of its own. Rumor&Truth and MyGoPen are fact-checking websites aimed at elders, with features that ease navigation.
The Fake News Cleaner (假新聞清潔劑) initiative uses another tactic to reach older neighbors. This group of volunteers hopes to cultivate media literacy through face-to-face interactions. After the 2018 referendums revealed divisions within Taiwanese society, the group recognized the need to break through echo chambers and bridge generational gaps. Fake News Cleaners go to public areas to engage strangers, especially the elderly, in conversations about fake news. They employ tactics to be approachable, including discussing health news rather than politics, designing messages that appeal to elders, and using games. For example, volunteers might ask passersby to identify problems in an article, offering a useful prize as a reward. Their goal is to spread awareness with compassion, not condescension. Organizations such as community associations, colleges, and senior centers have invited the Fake News Cleaners to present seminars about fake news. The organization says these seminars have been successful because they offer in-person opportunities for empathetic communication.
Combatting disinformation and fake news is a daunting task, one which Taiwanese civil society is committed to meeting head-on.
Feature photo: The 2019 Global Cooperation and Training Framework (GCTF) on “Defending Democracy Through Media Literacy II” International Workshop opened on Sept. 10, 2019.