
    Everything the right – and the left – are getting wrong about the Online Safety Act | George Billinge

    By Liam Porter | August 2, 2025

    Last week, the UK’s Online Safety Act came into force. It’s fair to say it hasn’t been smooth sailing. Donald Trump’s allies have dubbed it the “UK’s online censorship law”, and the technology secretary, Peter Kyle, added fuel to the fire by claiming that Nigel Farage’s opposition to the act put him “on the side” of Jimmy Savile.

    Disdain from the right isn’t surprising. After all, tech companies will now have to assess the risk their platforms pose of disseminating the kind of racist misinformation that fuelled last year’s summer riots. What has particularly struck me, though, is the backlash from progressive quarters. Online outlet Novara Media published an interview claiming the Online Safety Act compromises children’s safety. Politics Joe joked that the act involves “banning Pornhub”. New YouGov polling shows that Labour voters are even less likely to support age verification on porn websites than Conservative or Liberal Democrat voters.

    I helped draft Ofcom’s regulatory guidance setting out how platforms should comply with the act’s requirements on age verification. Because of the scope of the act, and to avoid forcing tech platforms to adopt specific technologies, this guidance was broad and principles-based – if the regulator prescribed specific measures, it would be accused of authoritarianism. A principles-based approach is more sensible and future-proof, but it does allow tech companies to interpret the regulation poorly.

    Despite these challenges, I am supportive of the principles of the act. As someone with progressive politics, I have always been deeply concerned about the impact of an unregulated online world. Bad news abounds: X allowing racist misinformation to spread in the name of “free speech”, and children being radicalised or targeted by online sexual extortion. It was clear to me that these regulations would start to move us away from a world in which tech billionaires could dress up self-serving libertarianism as lofty ideals.

    Instead, a culture war has erupted that is laden with misunderstanding, with every poor decision made by tech platforms being blamed on regulation. This strikes me as incredibly convenient for tech companies seeking to avoid accountability.

    So what does the act actually do? In short, it requires online services to assess the risk of harm – whether illegal content such as child sexual abuse material, or, in the case of services accessed by children, content such as porn or suicide promotion – and implement proportionate systems to reduce those risks.

    It’s also worth being clear about what isn’t new. Tech companies have been moderating speech and taking down content they don’t want on their platforms for years. However, they have done so based on opaque internal business priorities, rather than in response to proactive risk assessments.

    Let’s look at some examples. After the Christchurch terror attack in New Zealand, which was broadcast in a 17-minute Facebook Live post and shared widely by white supremacists, Facebook trained its AI to block violent live streams. More recently, after Trump’s election, Meta overhauled its approach to content moderation and removed factchecking in the US, a move which its own oversight board has criticised as being too hasty.

    Rather than making decisions to remove content reactively, or in order to appease politicians, tech companies will now need to demonstrate they have taken reasonable steps to prevent this content from appearing in the first place. The act isn’t about “catching baddies”, or taking down specific pieces of content. Where censorship has happened, such as the suppression of pro-Palestine speech, it was taking place long before the implementation of the Online Safety Act. Where public interest content is being blocked as a result of the act, we should be interrogating platforms’ risk assessments and decision-making processes, rather than repealing the legislation. Ofcom’s new transparency powers make this achievable in a way that wasn’t possible before.

    Yes, there are some flaws with the act, and teething issues will persist. As someone who worked on Ofcom’s guidance on age verification, even I am slightly confused by the way Spotify is checking users’ ages. The widespread adoption of VPNs to circumvent age checks on porn sites is clearly something to think about carefully. Where should age assurance be implemented in a user journey? And who should be responsible for informing the public that many age assurance technologies delete all of their personal data after their age is confirmed, while some VPN providers sell their information to data brokers? But the response to these issues shouldn’t be to repeal the Online Safety Act: it should be for platforms to hone their approach.

    There is an argument that the problem ultimately lies with the business models of the tech industry, and that this kind of legislation will never be able to truly tackle that. The academic Shoshana Zuboff calls this “surveillance capitalism”: tech companies get us hooked through addictive design and extract huge amounts of our personal data in order to sell us hyper-targeted ads. The result is a society characterised by atomisation, alienation and the erosion of our attention spans. Because the easiest way to get us hooked is to show us extreme content, children are directed from fitness influencers to content promoting disordered eating. Add to this the fact that platforms are designed to make people expand their networks and spend as much time on them as possible, and you have a recipe for disaster.

    Again, it’s a worthy critique. But we live in a world where American tech companies hold more power than many nation states – and they have a president in the White House willing to start trade wars to defend their interests.

    So yes, let’s look at drafting regulation that addresses addictive algorithms and support alternative business models for tech platforms, such as data cooperatives. Let’s continue to explore how best to provide children with age-appropriate experiences online, and think about how to get age verification right.

    But while we’re working on that, really serious harms are taking place online. We now have a sophisticated regulatory framework in the UK that forces tech platforms to assess risk and allows the public to have far greater transparency over their decision-making processes. We need critical engagement with the regulation, not cynicism. Let’s not throw out the best tools we have.

    • George Billinge is a former Ofcom policy manager and the CEO of tech consultancy Illuminate Tech.

