Do big tech companies have a ‘duty of care’ for users? A new report says they do – but leaves out key details

February 4, 2025
Image: PV Productions/Shutterstock

Large social media companies should have to proactively remove harmful content from their platforms, undergo regular “risk assessments” and face hefty fines if they don’t comply, according to an independent review of online safety laws in Australia.

The federal government will today release the final report of the review conducted by experienced public servant Delia Rickard, more than three months after receiving it.

The review comes a few months after Meta announced it will stop using independent fact checkers to moderate content on Facebook, Instagram and Threads.

Rickard’s review contains 67 recommendations in total. If implemented, they would go a long way to making Australians safer from abusive content, cyberbullying and other potential harms encountered online. They would also align Australia with international jurisdictions and address many of the same problems targeted by the social media ban for young people.

However, the recommendations contain serious omissions. And with a federal election looming, the review is not likely to be acted upon until the next term of government.

Addressing online harms at the source

The review recommends imposing a “digital duty of care” on large social media companies.

The federal government has already committed to doing this. However, legislation to implement a digital duty of care has been on hold since November, with discussions overshadowed by the government’s social media ban for under 16s.

The digital duty of care would put the onus on tech companies to proactively address a range of specific harms on their platforms, such as child sexual exploitation and attacks based on gender, race or religion.

It would also provide several protections for Australians, including “easily accessible, simple and user-friendly” pathways to complain about harmful content. And it would position Australia alongside the United Kingdom and the European Union, which already have similar laws in place.

Online service providers would face civil penalties of 5% of global annual turnover or A$50 million (whichever is greater) for non-compliance with the duty of care.

Two new classes of harm – and expanded powers for the regulator

The recommendations also call for a decoupling of the Online Safety Act from the National Classification Scheme. The latter scheme legislates the classification of publications, films and computer games, providing ratings that guide consumers in making informed choices about age-appropriate content.

This shift would create two new classes of harm: content that is “illegal and seriously harmful” and content that is “legal but may be harmful”. The latter class includes material dealing with “harmful practices” such as eating disorders and self-harm.

The review’s recommendations also include provisions for technology companies to undergo annual “risk assessments” and publish an annual “transparency report”.

The review also recommends that adults experiencing cyber abuse, and children who are cyberbullied online, should wait no more than 24 hours after making a complaint before the eSafety Commission orders a social media platform to remove the content in question. This is down from the current 48 hours.

It also recommends lowering the threshold for identifying “menacing, harassing, or seriously offensive” material to that which “an ordinary reasonable person” would conclude is likely to have that effect.

The review also calls for a new governance model for the eSafety Commission. This new model would empower the eSafety Commissioner to create and enforce “mandatory rules” (or codes) for duty of care compliance, including addressing online harms.

The need to tackle misinformation and disinformation

The recommendations are a step towards making the online world safer for everybody. Importantly, they would achieve this without the problems associated with the government’s social media ban for young people – including that it could violate children’s human rights.

Missing from the recommendations, however, is any mention of potential harms from online misinformation and disinformation.

Given the speed of online information sharing, and the potential for artificial intelligence (AI) tools to enable online harms, such as deepfake pornography, this is a crucial omission.

From vaccine safety to election campaigns, experts have raised ongoing concerns about the need to combat misinformation.

A 2024 report by the International Panel on the Information Environment found experts, globally, are most worried about “threats to the information environment posed by the owners of social media platforms”.

In January 2025, the Canadian Medical Association released a report showing people are increasingly seeking advice from “problematic sources”. At the same time, technology companies are “blocking trusted news” and “profiting” from “pushing misinformation” on their platforms.

In Australia, the government’s proposed misinformation bill was scrapped in November last year due to concerns over potential censorship. But this has left people vulnerable to false information shared online in the lead-up to the federal election this year. As the Australian Institute of International Affairs said last month:

misinformation has increasingly permeated the public discourse and digital media in Australia.

An ongoing need for education and support

The recommendations also fail to provide guidance on further educational supports for navigating online spaces safely.

The eSafety Commission currently provides many tools and resources to support online safety for young people, parents, educators and other Australians. But it’s unclear whether the proposed change to the commission’s governance model, to enact duty of care provisions, would affect this educational and support role.

The recommendations do highlight the need for “simple messaging” for people experiencing harm online to make complaints. But there is an ongoing need for educational strategies for people of all ages to prevent harm from occurring.

The Albanese government says it will respond to the review in due course. With a federal election only months away, it seems unlikely the recommendations will be acted on this term.

Whichever government is elected, it should prioritise guidance on educational supports and misinformation, along with adopting the review’s recommendations. Together, this would go a long way to keeping everyone safe online.

The Conversation

Lisa M. Given receives funding from the Australian Research Council. She is a Fellow of the Academy of the Social Sciences in Australia and the Association for Information Science and Technology, and an Affiliate of the International Panel on the Information Environment.
