The recent Online Safety Act passed by the UK Parliament is a decent opening move, but it’s like putting up a fence after the horse has bolted. If we are serious about protecting and empowering the next generation, we need to focus not just on rules, but on education.

In recent years, we’ve seen a wave of prohibitionist measures in schools, from outright bans on mobile phones in classrooms in England and parts of Scotland, to tighter controls on device use and a significant rollback of digital policies in countries such as Sweden, France, Italy, and the Netherlands.

While often framed as safeguarding moves, these restrictions can unintentionally limit opportunities for pupils to develop responsible, informed, and creative digital practices. Rather than equipping young people with the tools to navigate technology, such policies risk reinforcing a “keep it away” mindset, which links directly to the wider debate on outright bans, like the one now emerging in Australia, which is about to put prohibition to the test.

From 10 December 2025, Australia will introduce “world-first” rules banning under-16s from creating or holding social media accounts. While the details are still being negotiated, many Australian media and internet researchers have expressed disappointment and concern about the lack of forward vision. A major policy shift offers an opportunity, but instead of reimagining a safer online future for children, policymakers are putting social media on the “high shelf” (out of reach, but still in plain sight) and offering no meaningful alternatives.

We have seen time and time again that prohibitionist measures don’t work: preaching celibacy does not stop teenage pregnancy; banning alcohol does not end drinking. Prohibition often drives activity into darker, unregulated spaces with fewer safeguards and less oversight. Much of what happens in young people’s online lives is already invisible to the adults who care for them; regulation without education risks widening that gap.

By no means are we against this legislation; many would agree that shielding young people from harmful content matters, and as a society we share a duty of care to safeguard them. Professor Sonia Livingstone of the London School of Economics warns that the online world is as full of hazards as it is of opportunities. Even more worrying is the fact that the risks are growing. In January 2025, Meta CEO Mark Zuckerberg announced the removal of third-party fact-checkers, increasing the risk of misinformation. Ofcom’s Children and Parents: Media Use and Attitudes Report 2025 found a drop in 16–17-year-olds’ confidence in spotting false information online. It also shows children are logging on younger, staying online longer, and stumbling across content no parent would choose for them. In a world of deepfakes and disinformation, critical digital literacy is no longer optional; it is essential.

We welcome the legislation’s subtle shift of responsibility from parents and teachers to the big tech companies themselves, and believe that these platforms should be held accountable. Nevertheless, restriction alone is a timid tool. It may deliver short-term relief, but what happens when those children reach adulthood still lacking the skills, resilience, and judgement they need to function adequately as citizens in a digital society? If our response stops at prohibition, we are simply postponing the inevitable. Sooner or later, those young people will enter the digital world without the skills or judgement to navigate it safely.

At the University of Glasgow, working with Glasgow City Council, we’re exploring how teachers and student-teachers are using technology, the benefits and challenges they are encountering with it in schools, and how confident they feel using it and supporting learners to do the same. The recently published report reaches a similar conclusion: access alone isn’t enough. While confidence in using tools is growing, there is a long road ahead to engaging learners in effective, responsible and critical use of technology. The report stresses the importance of embedding digital literacy across the curriculum, not just to explore new teaching methods, but to develop pupils’ ability to question the information they encounter, ask who controls platforms, consider how algorithms shape what they see, and notice whose voices are amplified or silenced.

We also found stark digital divides, where pupils’ and teachers’ skills vary dramatically, and confidence in promoting online safety remains low. Parents, teachers and students all need more support.

In Scotland, there are encouraging developments in this regard. For instance, the Digital Schools Award Scotland promotes whole-school approaches to safe, responsible and creative technology use. Further national guidance urges schools to prepare pupils to recognise risks, protect their data, and respond to online threats. However, support on how this is enacted in practice, and on how to prepare and support teachers to do it, has yet to emerge.

The 5Rights Foundation, an international NGO for children’s digital safety, calls the Online Safety Act a starting point, not an end. Protecting young people online demands more than legislative checkboxes. It’s not solely the job of computing teachers and tech classes; every subject should help pupils question, create, and participate responsibly in the digital world.

Digital literacy must be treated as essential infrastructure and a responsibility for all. This doesn’t stop at the school gates. Parents, and adults in general, must also reflect on their own online behaviours, modelling the critical, respectful, and mindful practices we want to see in the next generation.

Laws can set the guardrails, but only education, and example, can ensure we all know how and where to steer.

This blog was cross-posted from The Scotsman website.


First published: 22 August 2025