A legal reckoning for social media
A jury in Los Angeles has delivered what may prove to be a landmark moment in the global debate over children and social media.
Meta and YouTube have been found liable for the harm caused to a young woman who developed an addiction to their platforms as a child. Jurors concluded not only that harm occurred, but that these platforms were intentionally designed in ways that contributed to that harm. The companies were found to have acted with ‘malice, oppression, or fraud.’
This is not a marginal finding. It is a legal recognition of something that has long been argued but rarely established in such stark terms: that the harms associated with social media are not accidental by-products, but foreseeable consequences of design.
For years, technology companies have defended themselves on the basis that their platforms are neutral tools, and that responsibility for harm lies with users, parents, or wider society. This ruling cuts directly against that claim. It affirms that design choices – from infinite scroll to algorithmic content delivery – play a causal role in shaping behaviour and, in this case, addiction.
It is difficult to overstate the significance of that shift.
If upheld on appeal and if echoed in the many similar cases currently progressing through US courts, this could mark the beginning of a new phase of accountability for technology companies. One in which the question is no longer whether harm exists, but who is responsible for preventing it. For policymakers, the implications are immediate.
In the UK, the Government is currently consulting on measures including a potential ban on social media use for under-16s. While this reflects growing concern, the LA ruling highlights both the importance – and the limits – of such approaches. Restricting access may reduce exposure. But it does not address the underlying design of the systems themselves. Nor does it resolve the consequences for those already affected.
This is a critical point, and one that is often overlooked in public debate.
Addiction is not simply a matter of access. It is a condition shaped by repeated exposure to systems engineered to capture attention and reinforce behaviour. Once established, removing the source of that addiction does not automatically reverse its effects. In many cases, individuals remain vulnerable to substitution – shifting towards other platforms or behaviours that replicate similar reward mechanisms. In other words, prevention and remediation are distinct challenges. Both must be addressed.
This is one of the central arguments made in our recent report, Saving Scotland’s Childhood: Children’s Online Rights and Ethics. The report sets out evidence that the current digital environment is contributing to rising levels of anxiety, depression, and developmental disruption among children and young people. It argues that this should be treated as a public health issue requiring systemic intervention.
The LA ruling reinforces that case. It demonstrates that the harms identified in the research are not hypothetical. They are being recognised in courtrooms. They are being linked to specific design practices. And they are now carrying financial and reputational consequences for the companies involved. But legal action, by its nature, is reactive. It occurs after harm has taken place. Public policy must act before that point.
Scotland has already incorporated the UN Convention on the Rights of the Child into domestic law. This creates not only a moral obligation but a legal duty to ensure that children’s environments are safe by design. The question now is whether that duty will be acted upon with the urgency the evidence demands.
The timing matters. A new school year begins in August. Without meaningful intervention, another cohort of children will enter an environment in which exposure to addictive digital systems is normalised and, in many cases, unavoidable. This should not be accepted as inevitable.
The policy tools exist. They include the creation of a dedicated regulator capable of setting and enforcing child-safe technology standards; the introduction of clear national guidance for parents and schools; restrictions on device use within educational settings; and a broader rebalancing of childhood towards real-world socialisation and emotional development.
Crucially, they must include support for those already affected – embedding mental health services, training educators to recognise signs of digital harm, and ensuring that children who are struggling are not left to navigate those challenges alone.
The LA verdict has been described as a ‘reckoning’ for social media companies. That may be true. But it should also be understood as a warning. The evidence base is now substantial. The harms are increasingly visible. And the argument that nothing can be done is becoming less credible by the day.
The question for Scotland is not whether to act, but how quickly. Because while courts can assign responsibility after harm occurs, it is governments that determine whether that harm is allowed to continue.

