Twenty-one years after Facebook was launched, Australia's 25 top news outlets now have a combined 27.6 million Facebook followers. Facebook remains their main social media platform for distributing news, and they post more articles there now than ever before.
Our big-data study analysed more than three million posts by 25 Australian news publishers, using Meta's Content Library. We wanted to understand how news content is distributed and how audiences engage with different topics.
Using de-identified Facebook comments, we were also able to examine how misinformation spreads, including misinformation about floods, election integrity and the environment.
The data reveal the real-world impact of misinformation: it is not just a digital problem, but one linked to poor health outcomes, declining public trust and significant harm to society.
Misinformation about hydroxychloroquine and floods
Take, for example, the false claim that the antimalarial drug hydroxychloroquine is an effective COVID-19 treatment.
As in the United States, media and political figures played a major role in spreading this idea. Clive Palmer, the mining billionaire and then leader of the United Australia Party, actively promoted hydroxychloroquine, announcing in March 2020 that he would fund trials and manufacture the drug.
He also took out a two-page advertisement in The Australian. Federal Coalition MPs Craig Kelly and George Christensen lent their support by co-authoring an open letter advocating its use.
We analysed 7,000 public comments on 100 posts about hydroxychloroquine from selected media outlets during the pandemic. Contrary to the belief that echo chambers leave little room for online debate, we found a robust exchange of views about the drug's effectiveness as a COVID-19 treatment.
We also found that facts alone were not enough to stop misinformation and conspiracy theories about hydroxychloroquine, despite fact-checking efforts. The misinformation was directed not just at the drug, but also at the government, the media and "big pharma".
Public health studies have estimated hydroxychloroquine was responsible for at least 17,000 deaths globally, though the true toll is probably higher.
Topic modelling highlighted the harm misinformation caused to individuals. Secondary harms included distress, frustration and worsening symptoms among patients who could not access the drug because of stockpiling.
We also saw how misinformation can damage public trust in non-government organisations and institutions. Following the 2022 floods in Queensland and New South Wales, misinformation about Red Cross charities flourished online and was amplified by political commentary, despite fact-checking.
The misinformation, which we will not repeat here, led some people to change their donation behaviour: for example, buying gift cards for flood victims rather than trusting the Red Cross to distribute much-needed funds. This shows the damage misinformation can do to public trust and disaster response.
Misinformation is 'sticky'
The data also show the cyclical nature of misinformation. It is "sticky" because it recurs at regular intervals, such as during elections. In one case, false claims circulated that electoral officials had rigged election results by rubbing out pencil marks on ballots.
The data show this misinformation persists online, resurfacing across state and federal elections and the 2023 Voice referendum.
Multiple debunking attempts by electoral commissioners, fact-checkers, mainstream media and social media platforms appear to have drawn limited public engagement compared with a loud minority. We found only 418 of the 60,000 sentences we examined on election topics over the past decade came from informed or official sources.
High-profile figures, such as Palmer, have again played a key role in spreading this misinformation, and the data show just how sticky it is.
Misinformation must be tackled
Our study holds important lessons for institutions and public figures. Politicians, in particular, should take the lead in curbing misinformation, as their false statements are quickly amplified by the public.
Both mainstream and social media have a role to play in reducing the spread of misinformation. Mainstream media, which Australians increasingly rely on for news online, can provide accurate information and counter misinformation through its reporting. Digital platforms can curb algorithmic amplification and remove content that causes real-world harm.
Our study also provides evidence that audiences' news consumption patterns have changed over time. It is unclear whether this is due to news avoidance or algorithmic changes, but online audiences are clearly more interested in celebrity, arts and lifestyle news than in politics. This led media outlets to post more stories that entertain rather than inform between 2016 and 2024, a shift that could make it harder to combat misinformation with hard news facts.
We conclude that while fact-checking is valuable, it is not a cure-all. Fighting misinformation requires an integrated approach: counter-messaging from trusted civic leaders, campaigns to improve media and digital literacy, and public restraint in sharing unverified content.
Andrea Carson received funding from Meta for research on misinformation, and from the Australian Research Council to examine political trust.
Justin Phillips received funding from Meta to study misinformation.