China's disinformation campaign is real. We need better defences against state-based cyberattacks

  • Written by James Martin, Associate Professor in Criminology, Swinburne University of Technology

The Australian government recently announced plans to establish the country’s first taskforce devoted to fighting disinformation campaigns, under the Department of Foreign Affairs and Trade (DFAT).

Last week, Foreign Minister Marise Payne accused China and Russia of “using the pandemic to undermine liberal democracy” by spreading disinformation to manipulate social media debate.

“Where we see disinformation, whether it’s here, whether it’s in the Pacific, whether it’s in Southeast Asia, where it affects our region’s interests and our values, then we will be shining a light on it,” Payne said.

In her speech to Canberra’s National Security College, she claimed Australia is going through an “infodemic”. But is it really? And if so, what can be done about it?

170,000 accounts removed, but how many missed?

Disinformation campaigns are coordinated attempts to spread false narratives, fake news and conspiracy theories. They’re characterised by repetitive narratives seemingly emanating from a variety of sources. These narratives are made even more believable when republished by trusted friends, family, community figures or political leaders.

Disinformation campaigns exist along a continuum of cyber warfare techniques, which also includes the massive state-sponsored cyberattacks targeting Australian government institutions and businesses. These sustained attacks, reported on Friday, also purportedly emanated from China.

Social media networks such as Twitter and Facebook provide a perfect forum for disinformation campaigns. They’re easily accessible to foreign actors, who can create fake accounts to spread false but seemingly credible stories.

Read more: Meet ‘Sara’, ‘Sharon’ and ‘Mel’: why people spreading coronavirus anxiety on Twitter might actually be bots

Earlier this month, Twitter removed more than 170,000 accounts connected to state-run propaganda operations based in China, Russia and Turkey. Of these, about 150,000 were reportedly “amplifier” accounts boosting content.

According to a report published this month by the Australian Strategic Policy Institute (ASPI), a “persistent, large-scale influence campaign linked to Chinese state actors” has been targeting Chinese-speaking people outside China.

The campaign is allegedly aimed at swaying online debate surrounding the COVID-19 pandemic and the Hong Kong protests, among other key issues.

Twitter is banned in China, so there would be minimal opportunity for the Chinese government to develop and embed troll accounts into local Twitter networks. Instead, China has likely hacked, stolen or purchased legitimate accounts.

Twitter hasn’t revealed exactly how it detected the state-sponsored accounts, presumably because this would give other states a “how-to” guide on circumventing the platform’s security barriers.

But according to a New York Times report, one giveaway is when a user logs into many different accounts from the same web address. Twitter has also suggested that accounts posting from within China, where the platform is blocked, may be acting maliciously with government approval.
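The exact signals Twitter uses aren't public, but the basic heuristic reported above (many accounts operated from a single web address) can be illustrated with a short, purely hypothetical Python sketch. The login events, field names and threshold below are invented for illustration and do not reflect any real platform's detection pipeline.

```python
from collections import defaultdict

# Hypothetical login events as (account_id, ip_address) pairs.
# These values are invented purely to illustrate the heuristic.
login_events = [
    ("acct_001", "203.0.113.7"),
    ("acct_002", "203.0.113.7"),
    ("acct_003", "203.0.113.7"),
    ("acct_004", "198.51.100.23"),
]

# Arbitrary illustrative cut-off, not a real platform setting.
SUSPICIOUS_ACCOUNT_THRESHOLD = 3

def flag_shared_addresses(events, threshold):
    """Return web addresses from which an unusually large number
    of distinct accounts have logged in."""
    accounts_per_ip = defaultdict(set)
    for account_id, ip_address in events:
        accounts_per_ip[ip_address].add(account_id)
    return {
        ip: sorted(accounts)
        for ip, accounts in accounts_per_ip.items()
        if len(accounts) >= threshold
    }

if __name__ == "__main__":
    flagged = flag_shared_addresses(login_events, SUSPICIOUS_ACCOUNT_THRESHOLD)
    for ip, accounts in flagged.items():
        print(f"{ip}: {len(accounts)} accounts -> {accounts}")
```

In practice, a platform would weigh many such signals together, such as posting times, content similarity and network structure, rather than rely on a single address-based rule.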

Earlier in June, Foreign Minister Marise Payne accused China of spreading disinformation during the coronavirus pandemic. She said Australia would push for the World Health Organisation to better protect the country’s national interests. JOEL CARRETT/AAP

Information warfare is a growing threat

Australia’s Department of Home Affairs has warned there’s a “realistic prospect” foreign actors could meddle in Australian politics, including in the next federal election – unless steps are taken to prevent this.

The government has warned of this as a future threat. But based on the available evidence, we contend disinformation is already being used to manipulate public debate in Australia.

A University of Oxford report published last year suggested organised social media manipulation campaigns have occurred in 70 countries, including Australia.

Earlier this week, analysts at ASPI reiterated how Islamophobic and nationalist content was intentionally spread online during last year’s election campaign.

Perhaps the most infamous example of a large-scale disinformation campaign came from Russia in 2016, when a coordinated campaign was deployed to meddle in the US presidential election. Like Russia, China now appears to be investing substantial resources into disinformation campaigns.

Australia should expect to see further complex attacks conducted by both foreign and internal agents. These may be foreign state-sponsored campaigns, or dirty tactics used on the electoral campaign trail.

During last summer’s horrific bushfires, a large number of Twitter bot accounts were found posting the hashtag #ArsonAttack to perpetuate the idea that the fires were largely attributable to arson rather than climate change. These false claims were taken up by News Corp publications, which then influenced debate surrounding the crisis.

Read more: Bushfires, bots and arson claims: Australia flung in the global disinformation spotlight

Such claims sow confusion among the public. They increase political polarisation, and erode trust in media and political institutions.

The best defence is a collective one

While we can hope Twitter builds on its efforts to detect malicious accounts that spread lies, we can’t assume state-sponsored actors will sit back and do nothing in response. Governments have invested too much in such attacks, and the campaigns have proven too successful, to simply abandon them.

The most readily available means of defence, as with most contemporary cybercrime, is user education. Social media users of all political persuasions should be aware that what they see online may not be accurate, and should view it with a critical eye.

Some of us are better than others at differentiating between what is real and fake online, and can help filter out content that’s untrustworthy, unverified or plain wrong. Simple ways to do this include stating the facts (without specifically repeating the myths), and offering explanations that align with the listener’s preexisting beliefs.

It’s also important to remember how small actions such as “liking” and “retweeting” content can further spread disinformation, regardless of intent.

Also, while the above steps help, they’re unlikely to completely insulate Australia from the potentially disastrous effects of future disinformation campaigns. We’ll need new solutions from both government and private industry.

Ideally, we’d like to see government regulation around disinformation. And although this hasn’t happened yet, the announcement of a government-run disinformation taskforce is at least one step in the right direction.

Read more: Coronavirus anti-vaxxers aren’t a huge threat yet. How do we keep it that way?

Read more https://theconversation.com/chinas-disinformation-campaign-is-real-we-need-better-defences-against-state-based-cyberattacks-141044
