Dr Belinda Barnet is a Senior Lecturer in Media at Swinburne University of Technology
"What is most interesting to me is that this case didn’t focus on the content. It focused on the harm caused to the person by the platforms.
In America, the platforms have mostly been shielded by Section 230, which means they don’t have to take responsibility for the content users post.
This case focused on harms instead. Maybe that’s the way forward here, too."
Belinda has not declared any conflicts of interest.
She is contactable on +61 403 833 455 or bbarnet@swin.edu.au. Also available on LinkedIn and X
--------
"The strongest long-term risk we see from children’s digital media use isn’t poor grades or more bullying from peers—it’s future addictive use, and that link is most pronounced for social media.
In our recent meta-analysis of longitudinal studies, social media stood out as the platform type most consistently associated with later problematic or addictive patterns of use. This suggests these platforms may uniquely reinforce behaviours that become harder for children to regulate over time.
Children are still developing the skills needed to manage attention, impulses, and rewards, yet many social media environments are designed to maximise engagement. That mismatch is likely part of the issue.
This verdict signals a broader shift toward recognising that platform design is not neutral. The priority now is for regulators and technology companies to take the lead in creating safer digital environments, with responsibility placed at the level where these systems are designed and controlled."
Sam's conflict of interest statement: "I receive funding from the Australian Government's National Health and Medical Research Council"
Sam is available during standard business hours and contactable on sam.teague@jcu.edu.au and +61 475 867 810
--------
"This verdict crystallises what has been building for several years: a fundamental shift in how we assign responsibility for social media harm.
For too long, the burden has fallen on individuals, parents, and families to resist platforms that, as the evidence in this trial showed, were consciously designed to be addictive. Features like infinite scroll, autoplay, and constant notifications were not incidental design choices; they were the architecture of addiction.
What this case establishes is a new accountability lens directed squarely at the platforms themselves. Lawyers argued that these companies engineered addiction, framing the platforms as a kind of “digital casino” too irresistible for young users to put down.
The question is no longer whether young people are harmed, or whether their parents should have done more to protect them — it is who built the machine that harmed them, and who will be made to pay for it."
Stan has not declared any conflicts of interest.
He is contactable on s.karanasios@uq.edu.au or +61 420 340 804
--------
Dr Christina Anthony is a Lecturer, Consumer Psychologist and Behavioural Scientist specialising in consumer behaviour in the Business School at the University of Sydney
“The verdict highlights a broader shift in how society understands the relationship between users and digital platforms.
The jury’s decision reflects an evolving awareness that digital environments can significantly shape behaviour, often in ways users may not fully recognise.
Many platform features discussed in the case — such as infinite scrolling and personalised content feeds — are grounded in well-established behavioural science principles related to attention, habit formation and reward.
From a consumer psychology perspective, these design elements are intended to enhance user experience and engagement, but they can also make it easier for patterns of frequent or prolonged use to develop, particularly among younger users.
The outcome of the case may encourage more balanced conversations about shared responsibility between individuals, families, policymakers and technology companies.
What’s emerging is a more nuanced understanding of digital consumption — one that recognises both the benefits of social connection and the importance of thoughtful, responsible design.
The decision could prompt continued dialogue around transparency, user wellbeing and age-appropriate design, as the role of social media continues to evolve.”
Christina has not declared any conflicts of interest.
She is contactable on christina.anthony@sydney.edu.au
--------
Susan Sawyer is Professor of Adolescent Health at The University of Melbourne, Director of the Centre for Adolescent Health at Royal Children's Hospital and Group Leader of Adolescent Health at Murdoch Children's Research Institute
"This landmark case suggests the tide has finally turned against the ‘tech bros’ who, until now, have been protected by notions of ‘tech exceptionalism’ which have prevented the implementation of regulations that promote the online safety of young people.
With the companies no longer able to hide behind the notion that social media apps are simply neutral platforms, this court case, centring on the use of technology features that drive online engagement, will have far-reaching consequences. Might it obviate the need for national laws like Australia’s Social Media Minimum Age Act? No, but it could be expected to make the acceptance of such laws in other countries far more likely."
Susan has not declared any conflicts of interest.
She is contactable via Tom Keeble on +61 3 9936 6460, +61 400 764 084 or tom.keeble@mcri.edu.au
--------
Dr Rob Nicholls is a Senior Research Associate at the University of Sydney
“This landmark verdict, along with many other similar lawsuits against social media companies, signals a shift in how courts view platform design as a set of choices that can carry real legal and social consequences. It opens the door to wider challenges against social media and other technology systems engineered to maximise engagement at the expense of user wellbeing.”
Rob has not declared any conflicts of interest.
He is contactable on +61 412 646 477 or rob.nicholls@sydney.edu.au
--------
Dr Joanne Gray is Chair of Discipline (Media and Communications) in the Faculty of Arts and Social Sciences at the University of Sydney
“A group of ordinary American citizens did what US regulators have so far failed to do. They looked at the evidence and found that Meta puts profits over user safety. This jury decision sends a clear message to all the Big Tech platforms: they need to do better, especially when it comes to keeping kids safe.”
Joanne has not declared any conflicts of interest.
Joanne is contactable on +61 410 562 236 or j.gray@sydney.edu.au
--------
Distinguished Professor Lisa Given is a Professor of Information Sciences, Director of the Centre for Human-AI Information Environments, and Director of the Social Change Enabling Impact Platform at RMIT University
"This is a landmark (bellwether, or test case) decision that will have far-reaching impact on social media platforms and other technology companies. What makes this lawsuit unique is that it’s the first successful case to argue that technology companies should be held accountable for platform design, including the features that deliver content to users. Other arguments have focused on the potential harm of social media content, particularly where young people access inappropriate content, so the focus on platform design is critical in this decision.
This lawsuit found Google and Meta negligent for social media platform design, which is a ground-breaking development in the global push to keep children - and other users - safe in online environments. Australia implemented legislation in December 2025 requiring social media companies to ensure account holders are over the age of 16. And many other countries across Europe and Asia, in particular, are considering similar laws. However, Australian media have reported cases of children under 16 retaining access to their accounts and accessing social media content through online platforms.
In this case, the plaintiff testified that she created Instagram and YouTube accounts when she was under 10 years of age. She used the platforms daily and argued the apps led her to suffer from anxiety, depression, and body dysmorphia, through the use of such features as beauty filters. Google and Meta argued that mental health concerns were complex and that it was not possible for people to be addicted to social media. The jury found for the plaintiff, noting that product design features harmed the plaintiff’s mental health. TikTok and Snap were also named as defendants in the suit, but settled before the trial.
While Google and Meta have said they will appeal the decision, one key question - if the finding is upheld - is whether the penalty of the judgment will be significant enough to force technology companies to change their design practices. While a $6 million verdict may sound significant, it is difficult to know whether it will be a sufficient deterrent for two of the world’s largest technology companies, whose capital spending exceeds $100 billion annually. However, this outcome will likely influence the thousands of other lawsuits pending in the United States alone, setting an important precedent for future cases."
Lisa has not declared any conflicts of interest.
She is contactable on lisa.given2@rmit.edu.au or +61 458 340 908
--------
Dr Danielle Einstein is a Clinical Psychologist and Adjunct Fellow in the School of Psychological Sciences at Macquarie University and a researcher on social media use and anxiety
"This landmark victory signals a new era in technology use. It is an era in which we can no longer deny the harms created by platforms that are designed to create dependence and critically undermine mental health.
We can no longer mix education with addictive technology. It is time to ensure that technology use is intentional, and to recognise the ways any service engages and exploits anxiety.
This legal decision, and the information revealed throughout the court proceedings, should help Australia's eSafety Commissioner, our schools and our doctors come to terms with the mental health crisis and the blowout in health costs we have seen in recent years.
We can no longer look away. Our work at Macquarie University formed a critical basis on which advocates around the world have moved forward."
Danielle has not declared any conflicts of interest.
She is contactable on +61 401 936 310 or danielle.einstein@mq.edu.au
Additional Note: Please feel free to pull any additional quotes from Danielle’s recent commentary on the trial, published on March 3 in The Lighthouse:
https://lighthouse.mq.edu.au/article/2026/march-2026/zuckerberg-borrows-big-tobacco-playbook-in-landmark-social-media-addiction-trial.-psychologists-arent-buying-it
--------
Associate Professor Susannah Tye is a Principal Research Fellow at the Queensland Brain Institute within the Faculty of Health, Medicine and Behavioural Sciences at The University of Queensland
"This is an important turning point in how responsibility for youth mental health is understood and managed in the digital era. The legal focus on platform features such as infinite scroll, algorithmic amplification, and reward-driven engagement recognises the intent of Google and Meta to manipulate attention, without concern for the impact on individual users.
From a neuroscience perspective, this is consistent with evidence that these design features interact with developing brain circuits involved in reward processing and addiction, impulse control, motivation, and social evaluation. Adolescence is a period of heightened sensitivity to these signals, which can increase vulnerability to compulsive use and poor mental health outcomes.
This case reflects a clear shift in societal expectations, demonstrating that technology companies have a duty of care when designing products that can adversely affect young people. Safety-by-design, transparency, and regulatory oversight are now recognised as essential core requirements of digital innovation. This rightly shifts the burden of this responsibility back to developers, rather than leaving parents, educators, and young people to contend with systems designed to exploit their own neurobiology."
Susannah has not declared any conflicts of interest.
She is contactable on s.tye@uq.edu.au and available from Friday 27/3/26.
--------
Dr Rahat Masood is a Senior Lecturer in Cyber Security at UNSW Sydney
"This verdict is an important moment because it recognises that platform design, not just content, can play a role in shaping user behaviour. Features like infinite scroll can encourage people to stay longer, but the science linking these features directly to harm is still not clear. Most studies show correlation, not a clear cause and effect.
Human behaviour is complex, especially for young people, and is influenced by many factors beyond social media, such as family, environment, and mental health. So, we should be careful not to oversimplify the issue. That said, the case highlights a real gap: we need stronger, long-term research to understand how specific design choices affect users.
What this ruling should ideally trigger is not just legal accountability, but stronger collaboration between platforms, researchers, and policymakers to build a more robust evidence base. Only then can we develop interventions that are both scientifically grounded and effective in reducing harm while preserving the benefits of digital platforms."
Rahat has not declared any conflicts of interest.
She is available 9:30 am - 4:00 pm most days and is contactable on rahat.masood@unsw.edu.au
--------
Dr Lisa Archbold is a Lecturer in the School of Law at Queensland University of Technology
"The case against Meta and Google is a positive step forward in holding platforms accountable for their design choices. But it also demonstrates that we have failed to proactively regulate platforms, allowing an environment where profits have been prioritised over preventing harm to children and young people. Appropriate regulation of social media platforms in Australia is urgently needed to support children and young people’s participation and empowerment in digital spaces.
While Australia has been positioned at the forefront of regulating social media, the recent social media ban does not address fundamental concerns with the design features of social media, including infinite scrolling, targeted engagement and social rewards. Proposals such as a digital duty of care and fit-for-purpose privacy and data protection laws are far better placed to move towards a safer digital environment for children and young people.
Having a clear duty of care, with corresponding systemic risk identification and mitigation obligations, could help shift the responsibilities to platforms to improve their design features and allow a clearer path for individual redress in Australia."
Lisa has not declared any conflicts of interest.
She is contactable on l.archbold@qut.edu.au or +61 438 137 001
--------
Tama Leaver is a Professor of Internet Studies at Curtin University and a Chief Investigator in the ARC Centre of Excellence for the Digital Child
"The Los Angeles jury verdict tells Meta, Google, and even those big tech companies that weren’t the subject of this particular trial, that they need to do much better in making their platforms safer for children and younger users.
By focusing on the design of these platforms, not the content users share on them, this lawsuit has effectively demanded that platforms design their offerings in a way that is more appropriate for younger people.
This isn’t about justifying bans or asking parents to police children’s use via parental surveillance tools, but rather demanding that all young users can have a differentiated and safer experience online, built into the very default settings of big tech platforms.
As a precedent, this outcome should force big tech companies to incorporate age-appropriate safety by design for younger users or risk facing endless lawsuits. It seems likely that the era of the infinite scroll is over."
Tama has not declared any conflicts of interest.
He is available 9:30am-11am, and after 2pm (WA time) today, and for most of tomorrow. Tama is contactable on t.leaver@curtin.edu.au, via Curtin’s Media team, or directly on +61 411 044 062.
--------
Dr Catherine Archer is a Senior Lecturer and researcher in social media at Edith Cowan University
"This ruling may be a game-changer for the social media giants. Their social licence to operate is now, more than ever, being severely tested.
Australia’s world-first experiment in a social media ban for under-16s, while it certainly has many shortfalls, is also a clear signal that the companies are on notice.
However, whether this translates into real change on the platforms for a better and safer user experience remains to be seen.
The enshittification of the internet has developed over time. Citizens, businesses and governments worldwide now rely on these services every day, and with large profits at play, the platforms will likely work hard to contest this ruling and preserve the status quo."
Catherine has not declared any conflicts of interest.
She is contactable on +61 438 872 728 or Catherine.archer@ecu.edu.au
--------
Dr Nipa Saha is a Lecturer in Media and Communications at Macquarie University
“This is an issue media and regulators have needed to confront for years, so in many ways this decision feels overdue. What stands out is that if a 20-year-old woman, with relatively advanced cognitive and self-regulation skills, can struggle with delayed gratification and develop addictive patterns of use, it raises serious concerns for younger users. Children and adolescents are far more vulnerable to the attention-grabbing, reward-driven design of social media platforms.
Research suggests that between 5 and 20 per cent of teenagers experience problematic or addictive social media use, while global estimates indicate nearly 14 per cent of children meet criteria for internet addiction. This goes beyond ‘screen time’ – it includes compulsive checking, anxiety when access is restricted, and impacts on mood, relationships and daily functioning. Some parents report extreme behavioural reactions when devices are taken away.
It is also important to note that while Australia has introduced a world-first ban on social media for children under 16, some highly engaging platforms, such as Roblox, sit outside this legislation. These platforms are designed to maximise time spent, with few limits on use. This exposes a broader regulatory gap, leaving many parents without the support needed to manage withdrawal-like behaviours and emotional impacts.”
Nipa has not declared any conflicts of interest.
Nipa is contactable on +61 469 299 414 or nipa.saha@mq.edu.au, and requests a message before calling. Nipa is available from 2pm today and from 3pm tomorrow (Friday).
--------
Dr Alexia Maddox is a Senior Lecturer in Pedagogy and Digital Futures and a sociologist of technology at La Trobe University
"This week has seen two landmark jury verdicts, delivered within days of each other in different US jurisdictions, that together send an unambiguous signal to the global technology industry. In Los Angeles, Meta and Google were found liable for designing platforms that addicted a young user through features like infinite scroll and algorithmic amplification — tools deliberately built to maximise engagement regardless of the cost to vulnerable users. In New Mexico, in a separate trial, Meta was found to have misled the public about safety while its platforms actively facilitated child sexual exploitation. The near-simultaneous timing is striking: these are not isolated findings but converging judgements across different courts, different harms, and different legal frameworks.
These verdicts should be read alongside eSafety's own findings, released just days ago, showing serious safeguarding gaps in AI companion apps widely used by Australian children. The problem is not confined to Instagram or YouTube — it extends across the digital environments children inhabit, and the harms follow the same pattern: design choices that prioritise engagement over child safety.
Australia's social media ban for under-16s feels decisive, but age restriction alone is not a safety solution. Across platforms, the architecture has been designed to maximise engagement at the expense of user wellbeing. That is a choice, not an inevitability. Legal liability is a blunt instrument. What we need are evidence-based regulatory and policy responses: robust safety-by-design standards, meaningful accountability, effective content moderation, and digital literacy embedded in communities and schools. Social harms require social and structural solutions — not just bans, and not just courtrooms."
Alexia has not declared any conflicts of interest.
She is unavailable today, but will be available tomorrow (Friday) and on the weekend. Contactable on a.maddox@latrobe.edu.au or +61 417 058 571
--------
Professor Daswin De Silva is a Professor of AI and Analytics and Director of AI Strategy at La Trobe University
"The two landmark lawsuits in Los Angeles and New Mexico formalise the urgent, global dialogue that social media platforms are engineered to be addictive, particularly for children, teenagers and young adults. Although the US Communications Decency Act protects these platforms from liability for user-generated content, these two cases argue that the companies are responsible for the defective design features that make this user-generated content addictive.
These features include infinite scroll, autoplay, personalised content, automated digital beauty filters, rewards for addictive behaviours, and the emotionally manipulative power imbalance of influencers over followers. Case evidence further indicates that social media platforms have regressed to ‘marketplaces for predators’, with children exposed to rapid targeting, direct solicitation and sexually explicit content.
These two rulings provide an early signal to thousands of pending cases that are likely to be settled out of court, but more importantly, they force the companies to drive major redesign work that dismantles addictive features and the power imbalance between influencers and followers.
Most countries are shifting towards laws enforcing ‘digital duty of care’ and online safety by design, such as the world-first Australian social media restriction for children under 16, the Online Safety Amendment (Social Media Minimum Age) Bill 2024.
Daswin has not declared any conflicts of interest.
Daswin is contactable on +61 432 575 949 or d.desilva@latrobe.edu.au
--------
Associate Professor Tauel Harper is from the School of Media and Communications at Murdoch University and a member of the Information Ecologies Research Group
"The recent legal finding against Meta and Google is a long overdue intervention in public communication. Despite their lofty mission statements, these companies have only designed their platforms in the interest of profits and shareholders. If designed for the public good, social media can be an important part of democratic engagement; but when designed for profit, it becomes a cesspit of misinformation, extremist views and addictive design. Cases such as these are one way that these social media giants can be incentivised to create a better product, not just for young users but for adults as well."
Tauel has not declared any conflicts of interest.
Tauel is contactable on tauel.harper@murdoch.edu.au or +61 432 979 073
--------
Associate Professor Jo Orlando is from the School of Education at Western Sydney University
“The current anti-social media political climate globally means that the outcome and timing of this case are important! That climate likely also influenced the outcome. We could start to see much-needed change in platform design."
Jo has declared she has no conflicts of interest.
Jo is contactable on +61 414 723 384
--------
Dr Dana McKay is Associate Dean of Interaction, Technology and Information in the School of Computing Technology at RMIT University
"Many industries have been required to make changes to their products to reduce negative social impacts: lower emission cars, warnings on alcohol bottles, and changes to cigarette formulation to make them less addictive.
Overnight, this change has come to social media platforms: Meta and Google have been found liable for the addictiveness of the algorithms their platforms use. This is not about the content these algorithms serve up; it is about the ways they serve it up, which are designed to keep us engaged, and have now been found to be harmfully addictive. This means that algorithms will need to change: we are likely to see an end to infinite scroll (already mooted in the EU) to nudge us to disengage.
While Meta and Google both plan to appeal, this lawsuit should ultimately result in changes to the algorithm. We might also see changes in the ways engagement works (less focus on likes and shares), and less promotion of highly emotive or inflammatory content. In any case, we might all hope that this lawsuit is the end of a long history of technodeterminism, and the beginning of a time where another large industry has to take responsibility for the harms it causes."
Dana has not declared any conflicts of interest.
Dana is contactable on +61 420 422 215, dana.mckay@rmit.edu.au
--------
Dr Timothy Koskie is from the Mediated Trust Research Office at the University of Sydney
"Coupled with recent initiatives like Australia's social media age restrictions and the National Media Bargaining Incentive, this finding continues a shift in public opinion: away from the highly problematic view that platform problems are a matter of individual responsibility and individual concern, and towards a recognition that platforms themselves present systems and structures capable of creating broad challenges that cannot be managed at the individual level.
The focus on platforms' cultivation of addictive behaviours can, however, obscure other larger-scale problems with how the platforms have upended our public structures. By elevating the role of analytics and attention, and by algorithmically emphasising consistently negative content with tenuous connections to truthful representations of reality, they have undermined public trust in expertise and institutions. This has left citizens facing fatigue and news avoidance, and elevated a cast of problematic content producers.
Nevertheless, this moment provides scope to pivot away from the metrics-driven pushes that simultaneously feed these addictive qualities and potentially undermine cohesion, and towards a more robust media environment that fulfils the public's information needs and provides opportunities for cohesion.
This will prove challenging precisely because of the platform-mediated information environment and its emphasis on simplistic and negative framing; it will require either buy-in from the platforms themselves or more direct attempts to govern how they operate."
Tim has not declared any conflicts of interest.
He is contactable on t.koskie@sydney.edu.au
--------
Dr Amelie Burgess is a Senior Lecturer in Marketing at Adelaide University
"This case is particularly instructive in signalling a broader shift in how harm in digital environments is understood. While notable, it does not exist in isolation, but rather reflects mounting concern around the role of social media platforms in shaping user behaviour through specific design features and affordances.
Increasingly, attention is being directed toward how these environments structure engagement, attention, and exposure in ways that can intensify susceptibility to influence. In this sense, harm is not simply a function of individual user choices or the content encountered, but emerges within digitally mediated systems that are designed to capture and sustain attention.
Cases like this one reflect a gradual movement away from the responsibilisation of consumers, where individuals are expected to self-regulate and critically evaluate persuasive content, toward recognising the structural conditions that underpin vulnerability.
At the same time, however, platforms and other stakeholders often continue to resist this shift by framing engagement as a matter of consumer choice and individual agency. In doing so, responsibility is subtly redirected back onto users, even as platforms operate through design features that are intended to maximise attention, prolong engagement, and ultimately drive monetisation.
This tension raises important ethical questions, particularly for consumers who may already be in vulnerable positions, where the notion of 'choice' is constrained by the very architectures that shape behaviour. In such contexts, what is presented as agency may in practice reflect a more limited and structured form of decision-making, calling into question how responsibility for harm should be distributed across digital environments."
Amelie has declared she has no conflicts of interest.
Amelie is contactable on Amelie.burgess@adelaide.edu.au