A controversial law to combat misinformation and disinformation was introduced into the Australian parliament in September, and last week saw another round of Senate hearings into the proposal.
Although there is broad consensus that combatting mis- and disinformation is worthwhile, and that online platforms curate content and culture in opaque ways, this iteration of the bill has few supporters. Concerns about the scope of the proposed law, and about the definitions of what constitutes serious harm, have been raised by an unlikely grouping of opponents, including digital and consumer rights groups, the Human Rights Commissioner, the Opposition and media outlets including News Corp.
As reform looms, there are four key measures that could improve our information environment now to better protect Australian democracy.
First, technology needs to be viewed as an ecosystem.
Second, regulate privacy.
Third, strengthen democracy.
Fourth, invest now in understanding the intersection of biology and technology.
A focus on the ecosystem offers the potential to deliver real policy change, instead of trying to solve each of the national security threats and social harms of digital technologies individually. Understanding the design of technology shows us where the power lies and enables us to consider the impact on political and security decision-making. The technology ecosystem puts pressure on three things essential to Australian democracy: trusted institutions, credible information and social inclusion.
Much of the conversation around mis- and disinformation, as well as deepfakes and AI, revolves around content moderation, and that is important. But while creating synthesised material has become much easier with the accessibility and multitude of AI apps available, the impact of disinformation and deepfakes is limited until they are distributed – so dissemination is an essential piece of the puzzle. The role of media and political carveouts needs to be considered further.
Increasing transparency around existing industry efforts and their effectiveness – from content moderation processes and policy changes to watermarking – would be a welcome addition. Requiring tech companies to reverse the recent trend of reducing researchers’ access to platform data would support the development of evidence-based solutions.
In protecting privacy, meaningful legislation matters.
As a national security analyst, I didn’t expect to become a staunch privacy advocate. However, implementing meaningful privacy protections that disrupt the extractive, exploitative and pervasive data economy would have profound implications, and not just for privacy. That data economy is the basis for algorithmic influence and curation, which shapes what people see, think and feel.
In addition to preserving privacy, such protections would reduce serious security vulnerabilities as well as the algorithmic influence and foreign interference capacity inherent in the current technology ecosystem. They would limit microtargeting, or hyper-personalisation, for exploitative advertising. Effective privacy protections are also essential to cyber security, as Australia’s Cyber Security Coordinator recently told me.
We also need to regulate privacy effectively before consumer neurotechnology arrives and before we all become the subject of “universal technological surveillance”, exploitable by the highest bidder.
Democracy is a national asset worth protecting. Australian democracy ranks well by global indices, but is vulnerable.
Of the six key principles of Australian democracy, three are highly relevant in this context: pluralism, representative democracy, and respect for and tolerance of opposing ideas (in parliament and in society).
So consider gender as an element of this. Australian women are close to equal representation in the federal parliament at 45 per cent, slightly lower at the state level. Yet women in politics and public life are 27 times more likely than their male counterparts to encounter sustained harassment and online abuse. Globally, 85 per cent of women have experienced online abuse.
Sustained gendered trolling is sexualised and violent, targeting appearance, age, traditional roles, supposed virtue and even fertility. Death and rape threats are common, as are threats to kill a woman’s children. This kind of online psychological violence is designed to silence women’s voices and drive women out of public life, and it significantly degrades representative democracy.
Locally, in our own communities, committing to the key principles of democracy – especially respect for opposing ideas – is important. And understanding more about how human biology and cognition interact with technology is critical. Our information environment needs to work with human cognition and physiological responses, instead of “hacking” them with psychological tactics to increase user engagement. That knowledge should guide regulation – for example, by mandating that platforms measure and improve user wellbeing, respect for alternative views and nervous system responses.
Rather than devolving into polarising social bubbles and asymmetrical power relationships between tech companies, governments and users, we need to create an information environment – and cyber-physical interactions – that support humanity to thrive.
All democracies are different. National security, at its core, is about the values of a society that you choose to protect and invest in. This includes physical safety and geopolitical security, but also culture, diversity, acceptance and equality. We are now laying out the landscape in which new technologies, like neurotech, will land. It is important, then, to get the digital backbone of our society right.