
Zoe Daniel: Just switch it off? The paradox at the heart of online safety

There is a paradox at the heart of the way Australia treats online safety. 

It has been gravely illustrated by the spread on social media of disinformation about the Bondi Junction attack, and by the argument over the airing of video of the subsequent stabbing in a Sydney church.

As the regulator plays an ongoing game of whack-a-mole with harmful content propagated by the platforms for profit, the government places the onus on social media consumers, urging everyone to “think before you press send” and “switch off social media if you can”.

Meanwhile, Elon Musk flexes and the lawyers get paid while disinformation and violent content spread like wildfire, and politically motivated culture wars over the definition of freedom of speech reignite.

Bishop Mar Mari Emmanuel, violently stabbed in an alleged act of terrorism, cited his Christian values when reflecting on the regulator’s move to censor video depicting the attempt on his life: “It would be of great concern [if the attack is used to serve] political interests or control free speech”.

This moral question – the only one we ever seem to ask when it comes to online safety – is the wrong one. The problem with social media runs deeper; we need to move beyond our fixation on regulating content and instead focus on the systems that drive it.

Takedown orders are no longer a practical answer in the world we live in today. In a future shaped by AI, they will be even less so.

In the context of the national emergency of violence against women, there is less than meets the eye in the Prime Minister’s announcement of a so-called age assurance trial designed to protect children from online pornography.

It is highly doubtful that such measures can defeat the power of the algorithms.

And while the eSafety Commissioner and our national conversation are preoccupied with chasing content around the internet, digital platforms continue operating profitable systems misaligned with human wellbeing and the public good.

Taking control

It’s worth crystallising the point: any argument that content management is a method of social control by government is moot.

We are already under social control – and the people with the levers are Elon Musk (X), Mark Zuckerberg (Meta, owner of Facebook and Instagram) and ByteDance (the Chinese owner of TikTok).

It’s worth considering whether we would prefer to have some level of community control via government or leave our collective psyche to these guys.


As a three-time ABC foreign correspondent, I began my journalistic career in the analogue days of reel-to-reel tape in a radio studio. By the time I left the media three decades later I was filing around the clock for radio, TV and online in a fully integrated digital environment with much of the content distributed via social media platforms.

But despite the evolution of the environment, traditional media organisations (mostly, hopefully) continue to apply the norms and values that underpin our society to what they publish.

During my career as a reporter overseas, I bore witness to the aftermath of many crises. The images remain embedded in my soul, but they never made it onto your screens, because editorial teams make judgments about what is appropriate to show.

There’s an old newspaper adage: if it bleeds, it leads.

Mistakes occur and egregious behaviour exists, and there’s a strong argument for more accountability. But in general, responsible media organisations impose limits in line with societal expectations and the boundaries set by advertisers, weighing the risk of offence, mental health impacts, copycat activity, social cohesion, and so on.

Without being too graphic: after a typhoon I covered, bodies hanging from powerlines and others being run over by cars in the streets were examples of content simply too confronting to show.

Social media does not impose this judgment, and it won’t, because its entire business model is based on viral content, and outrage and pain equal clicks.

This applies to everything from eating disorder content that triggers body image issues, to disinformation that stokes political division, to violent or graphic content that attracts both the curious and the dangerous.

And in today’s information ecosystems, such content is user-generated and decentralised. Its dissemination is not static; it moves at a scale and pace that traditional takedown orders can’t effectively manage.

The eSafety Commissioner continues to have a role as a watchdog, but a paradigm shift is now needed in the way Australia considers online safety.

This starts with adopting a systems-first, rather than content-first, mindset – one that encompasses all the systems and elements a digital platform uses to manage its service, including content.

What is colloquially known as ‘the algorithm’ is actually an array of autonomous systems that assist human operators in managing the complexity of a digital platform at scale.

Among the most intrusive of these systems are those designed to profile users based on personal information and usage history. Content is then assigned to each profile based on what would maximise time spent on the platform. Users – us – unknowingly browse within a broad and sophisticated architecture of engagement designed to keep us scrolling. 

Sound familiar?

Embedded in much of what these systems transmit is the potential for considerable societal harm. Examples include: addictive design features that appear to erode human attention spans; a rise in mental health issues such as depression and eating disorders; the amplification of divisive and outrageous content over consensus-building material; and a reported increase in social isolation and a decline in real-world relationship formation across all age groups.

As systems pursue their design objectives, they behave as artificial stewards of subtle but profound change in human society. 

Keeping young people off social media is an exercise in futility. Photo: Getty

A systems approach

A systems approach to online safety would reform Australia’s current Online Safety Act (OSA) to align it with modern regulatory practice in the European Union and the United Kingdom, such as the EU’s Digital Services Act and the UK’s Online Safety Act 2023.

The key is applying a mandatory and enforceable duty of care, as is commonplace in other areas of Australian law.

Unlike the current OSA, which focuses narrowly on removing extremely graphic or illegal content, a duty of care is sufficiently broad to cover responsibility for risks to emotional wellbeing and to political and social cohesion, and for harms from addictive design features, for example. It could also include misinformation and disinformation.

Once an overarching duty of care is applied it must be given teeth. This means ending the government’s current lackadaisical approach of an industry-led voluntary regulatory framework. 

Spare me. That will never work.

Platforms would instead be mandated by law to publicly report on the steps they take to mitigate the most harmful risks their systems could foreseeably cause to Australian users and society. These may include risks relating to eating disorders and mental illness; electoral processes and civic discourse; attention span erosion; social cohesion; and clearly misogynistic or domestic violence-related content.

Funding and access should be granted to researchers to independently verify that digital platforms adhere to their risk mitigation commitments.  

Government would not be involved in any software design process. And guess what? No more whack-a-mole on content, and no more culture wars over free speech, because this approach would not be underpinned by subjective decision making by a regulator.

Simply “switching off” our devices, or any misguided attempt to ban kids from the platforms (I have teenagers; it won’t work), is neither a systemic nor a practical public policy solution to the use of systems that are currently deliberately misaligned with human wellbeing, for profit.

Online safety regulation cannot truly be safe unless it is systemic; it is the systems that must be made to change, not people. 

Zoe Daniel is the independent federal Member for Goldstein
