Social media executives from Meta, Snap, YouTube, TikTok and X are being summoned to Downing Street on Thursday for a crucial meeting with Prime Minister Sir Keir Starmer and Technology Secretary Liz Kendall over online safety for children. The tech bosses will face questioning about what measures they are taking to protect young users and address parental concerns, as the government continues its review on whether to implement a complete prohibition on social media for under-16s, in line with Australia’s approach. Sir Keir has stressed that the meeting will centre on ensuring “social media companies step up and take responsibility”, warning that “the consequences of not taking action are stark” and that the government owes it to parents and the next generation to prioritise children’s safety.
The Number 10 Confrontation
Thursday’s gathering marks a critical moment in the government’s push to hold tech giants accountable for protecting vulnerable young users. It comes just hours after Parliament dismissed calls for an outright ban on social media for under-16s, despite backing from the House of Lords. Instead of a broad prohibition, MPs chose to give ministers powers to establish their own restrictions, signalling the government’s preference for a tailored regulatory approach over a sweeping legislative ban.
The timing of the Downing Street summit underscores the government’s determination to appear firm on online safety whilst managing complex political and commercial pressures. Professor Gina Neff of the University of Cambridge’s Minderoo Centre for Technology and Democracy noted that the summit allows the government to demonstrate it is acting proactively on digital harms. Downing Street has already acknowledged that some services have made progress, taking steps such as disabling autoplay for children by default and offering parents greater control over device usage, though critics contend considerably more must be done.
- Tech bosses to be questioned on their safeguarding measures and how they are addressing parents’ concerns
- The government is considering a ban on social media for under-16s, following the Australian model
- MPs voted against a complete prohibition but gave ministers the power to introduce restrictions
- Some platforms have already introduced protections, such as disabling autoplay for children
Parliamentary Rejection and the Broader Debate
Wednesday evening’s Commons vote was a blow to supporters of a complete ban on social media for under-16s, marking the second occasion MPs have rejected such proposals despite considerable backing from the House of Lords. The government’s decision to prioritise ministerial flexibility over primary legislation reflects a more cautious strategy, with officials contending that an outright ban would be premature while policy discussions continue. The approach leaves ministers free to design tailored controls rather than impose a sweeping ban that some fear would prove difficult to enforce and oversee effectively across multiple platforms.
The rejection has intensified debate over whether the UK is doing enough to shield young people from digital harms. Whilst ministers contend that giving themselves the authority to set bespoke rules is the more pragmatic route, critics argue this approach lacks the decisive action the situation demands. Recent research from Australia, where a social media restriction for under-16s came into force in December 2025, indicates that over 60 per cent of young users continue to access platforms regardless, raising serious questions about the effectiveness of legislative bans and suggesting the challenge extends well beyond prohibition alone.
Criticism Across Parties
The parliamentary vote has drawn sharp criticism from opposition benches. Conservative shadow education secretary Laura Trott accused Labour MPs of failing parents and children by rejecting the ban, arguing that other nations are acknowledging social media’s harms whilst the UK falls behind under the current government. Liberal Democrat education spokeswoman Munira Wilson shared these reservations, stating that “the time for partial solutions is over” and calling for immediate intervention to restrict the most damaging platforms for young users rather than piecemeal regulatory changes.
Australia’s Cautionary Example
Australia’s experience with social media restrictions offers a sobering case study for policymakers considering a comparable approach in the UK. When the country introduced its ban on social media for under-16s in December 2025, it was hailed as a significant milestone in protecting young people from digital risks. However, new research from the Molly Rose Foundation has uncovered a troubling picture: more than 60 per cent of young Australians continue to use social media platforms despite the legislative prohibition. This level of non-compliance suggests that legal prohibitions alone may be insufficient to stop determined young users from accessing the services they want.
The Australian findings carry significant implications for the UK’s continuing policy deliberations. If a comparable ban were introduced in Britain, the evidence suggests implementation would pose substantial challenges, with young people likely to circumvent age-verification systems and restrictions through a variety of technical means. The data undermines arguments that a simple legislative prohibition offers a quick fix for digital safety, pointing instead towards a broader approach that combines regulatory frameworks, platform accountability, parental oversight tools and digital literacy education to meaningfully address the risks young people encounter online.
| Key Finding | Implication |
|---|---|
| Over 60% of underage Australians still access social media despite ban | Legislative prohibitions alone cannot effectively prevent determined young users from accessing platforms |
| Ban introduced in December 2025 has failed to achieve widespread compliance | Enforcement mechanisms remain weak and young people find workarounds to restrictions |
| Blanket bans do not address underlying appeal of social media to young people | Multi-faceted approach combining regulation, platform accountability, and education is necessary |
Leading Specialists Urge Substantive Measures
Child safety advocates and digital rights experts have stepped up demands for tech companies to implement meaningful action beyond voluntary measures. The Molly Rose Foundation, created to honour 14-year-old Molly Russell who took her own life after accessing dangerous material on the internet, has been especially outspoken in demanding systemic change. Rather than implementing sweeping prohibitions that prove difficult to enforce, campaigners argue the focus must shift towards making companies responsible for the algorithms that promote dangerous material to vulnerable users.
Andy Burrows, head of the Molly Rose Foundation, has stressed that Thursday’s meeting at Downing Street is a critical moment for government intervention. The charity has repeatedly maintained that platforms have the technological means to introduce strong protections, yet frequently prioritise engagement metrics over user wellbeing. Experts stress that genuine safeguarding requires platforms to redesign their algorithmic recommendations, improve moderation practices and provide parents with meaningful tools to monitor their children’s internet use effectively.
The Algorithm Issue
At the heart of these concerns are the algorithmic systems that determine what content young users see. These algorithms are designed to maximise engagement, and in doing so can push sensational, harmful or addictive content towards vulnerable groups. Overhauling these mechanisms is one of the most pressing issues in online safety, requiring platforms to be transparent about how their recommendation engines operate and what protective measures are in place.
- Algorithms prioritise engagement over user wellbeing and safety
- Platforms need to improve transparency about how content is recommended
- Independent audits of harm caused by algorithms are crucial for ensuring accountability
What Happens Next
Thursday’s summit at Downing Street will set the tone for the government’s approach to online child safety in the coming months. Following the meeting, Sir Keir Starmer and Liz Kendall are expected to set out their conclusions and decide whether the voluntary measures tech companies have taken so far are sufficient, or whether stronger statutory intervention is needed. The government is still midway through its public consultation on whether to introduce an Australia-style ban on social media for under-16s, and the outcome of this week’s discussions is likely to shape the final policy direction.
Ministers have signalled a preference for taking powers to impose restrictions rather than legislating for an outright ban, citing concerns over enforceability and impact. However, mounting pressure from opposition MPs, child safety groups and parents suggests the government will face continued demands for stronger action. The weeks ahead will be pivotal in establishing whether digital platforms can demonstrate a genuine commitment to safeguarding young people, or whether Parliament will legislate to enforce compliance with stricter safety standards.