Reclaim The Net Feed

@reclaimthenetfeed

A UK Labour Minister Just Resigned Over a Secret Plot to Silence Journalists Using a Spy Agency
reclaimthenet.org


If you're tired of censorship and dystopian threats against civil liberties, subscribe to Reclaim The Net.

In England, Member of Parliament Josh Simons resigned from the Cabinet Office on Saturday after the Guardian revealed he had personally emailed GCHQ’s National Cyber Security Centre to link journalists investigating his thinktank to a Russian disinformation campaign. These were the same journalists he later claimed to know nothing about.

Simons had commissioned and reviewed a report by lobbying agency APCO on reporters looking into Labour Together’s undisclosed political donations. When the Sunday Times revealed APCO’s report included baseless allegations about journalist Gabriel Pogrund’s faith, upbringing, and personal relationships, Simons said he was “surprised and shocked to read the report extended beyond the contract by including unnecessary information” on Pogrund. That statement was issued publicly. The emails to NCSC had already been sent.

Those emails named Pogrund and his Times colleague Harry Yorke, and suggested their reporting could be connected to a Russian disinformation campaign. The mechanism Labour Together and APCO built was a silencing operation with an intelligence agency as the delivery vehicle. By routing a privately commissioned dossier to the National Cyber Security Centre, part of GCHQ, and briefing friendly reporters that named journalists were under investigation for receiving hacked materials or having links to Russian intelligence, the operation created a threat designed to make reporters think twice before publishing.
Simons and his chief of staff also sent a truncated version of the APCO report to NCSC officials and claimed that freelance journalist Paul Holden, who had separately investigated Labour Together, could be linked to “people known to be operating in a pro-Kremlin propaganda network with links to Russian intelligence.”

Holden described exactly how it played out in his case: “So [Labour Together] apparently took this dodgy dossier that APCO had written on me, and gave it to the National Cybersecurity Reporting Centre, and basically said, ‘We think that Paul Holden has received illegal hacks from Russia or China of the Electoral Commission, and we demand that it be investigated.’ And then they briefed that to the Guardian, and the Guardian tried to run a story on me. I said, ‘What on earth are you talking about? This is absolutely insane…’ They eventually backed off.”

Kit Klarenberg of The Grayzone, separately targeted by the same tactic, identified it plainly: “It’s a complete fraud used to silence critical reporting.”

A journalist who knows their name has been fed to an intelligence body alongside allegations of Russian links, however fabricated, faces a calculation that has nothing to do with whether the allegations are true. No conviction required. No formal investigation needed. The referral itself does the work. Reporters fear official trouble and drop their stories before anyone in authority has to ask them to.

There is no credible evidence that any of the three journalists had any involvement in a pro-Russian campaign. What they had in common was that they were all investigating Labour Together’s funding. Reporting on a thinktank’s undisclosed donations is journalism. The response from the people being investigated was to commission a dossier on the reporters and then route it, with allegations of Kremlin links, to a government intelligence body. The chilling effect of that move is the point.
Journalists who know their work might trigger a security referral start thinking carefully about what they publish and who they call. That’s the most effective form of speech suppression, because it doesn’t require anyone to be formally silenced.

The story behind the Labour Together/APCO scandal stretches back to 2023, when a series of “UK Files” exposés published on Racket News began documenting the operations of the Center for Countering Digital Hate, a group with deep roots in Labour Party politics. CCDH grew out of Stop Funding Fake News, launched in 2019 by Imran Ahmed, who had worked in various Labour-adjacent roles. Paul Holden, a veteran investigator at Shadow World Investigations, had been researching CCDH for a book that would eventually be published as The Fraud: Keir Starmer, Morgan McSweeney, and the Crisis of British Democracy. The “UK Files” documented a consistent pattern of Ahmed, SFFN, and eventually CCDH manufacturing public controversies to attack the Corbynite wing of Labour and its allies.

The documents that triggered the APCO investigation had originally been obtained by Al-Jazeera. One showed CCDH had provided incorrect information to the IRS “in its application to receive tax-exempt 501(c)(3) status.” A November 2023 Holden piece on that IRS issue is believed by sources close to the probe to have been at least as consequential as the Racket News piece in prompting Labour Together to commission APCO.

What APCO was actually contracted to do goes well beyond the “security probe” framing Labour figures have since applied to it.
The leaked contract shows APCO was tasked with identifying the sources behind reporting on Labour Together, and also with an offensive mission: to “provide a body of evidence that could be packaged up for use in the media to create narratives that would proactively undermine any future attacks on Labour Together.” As Holden put it: “It’s basically, we’re planning to go and give you stuff that you can use to go after the journalists.”

The thinktank at the center of this, Labour Together, had been run by Josh Simons before he became a Labour MP and then a Cabinet Office minister. The same organization that positioned itself as a defender against misinformation and fake news was, according to Holden, now generating disinformation of its own. “One of the things that still blows my mind is that you and I were looking into an organization that claimed to be policing fake news and misinformation. And now they’re creating disinformation,” he said.

iPhone Software Beta’s UK Age Verification Screen Raises Questions


A screen in the latest iOS 26.4 beta suggests Apple is starting to demand age verification from UK iPhone users, and it makes clear what’s at stake for anyone who declines. The move, which Apple is now calling an “error,” is a direct consequence of the UK’s Online Safety Act, a censorship law that also forces platforms to verify the age or identity of every adult user or face fines reaching 10% of global revenue. The law is controversial, but British Prime Minister Keir Starmer says it doesn’t go far enough.

A prompt appears after installation asking users to confirm they’re over 18. Refuse, and Apple says users “will not be able to download and purchase apps or make in-app purchases.”

The verification process gives Apple several ways to build a profile of your age. It can pull from the payment method already linked to your account, use account age as a proxy, or ask you to scan a credit card. Some users may eventually be asked to scan a photo ID. Apple frames this as seamless.

The logic Apple uses to automatically confirm your age reveals how much it already knows about you. “A valid credit card can help confirm you’re at least 18 because you must be an adult to open a credit card account,” the company states. Your financial history is now your age certificate.

What’s interesting is that Apple says this screen was an “error.” In a statement, Apple said: “Some users on the beta software in the UK temporarily saw a message suggesting age verification is required to download apps. That message was displayed in error, and has been fixed.
Developers may continue to use the Declared Age Range API to provide age appropriate experiences for users.”

That would be a big “error” for Apple to make, and it raises questions about what Apple is planning, given that it went so far as to build this screen.

The UK’s Online Safety Act is the engine behind this. The law came into force in 2025 and is one of the most consequential pieces of internet legislation in recent UK history.

The question of who actually handles verification data is its own problem. Social media companies and dating apps often outsource the age-gating process to third-party providers that collect biometric data, passport and personal identification documents, and banking and credit card information. Those providers have uneven track records. In 2024, a major data breach left administrative credentials exposed online for over a year. In late 2025, a significant breach involving a third-party support vendor exposed approximately 70,000 government ID photos used for age verification. The practical effect of all this is predictable.

Apple’s UK rollout is part of a broader expansion announced this week. Users in Australia, Brazil, and Singapore will soon be blocked from downloading apps rated 18+ without verification, and Apple is sharing age category data with developers in Utah and Louisiana to satisfy local compliance requirements. Each new jurisdiction adds another layer of identity infrastructure Apple is building into its platform.

Rumble Shorts Is Everywhere Now, and It’s Coming for TikTok


Rumble has completed the rollout of Rumble Shorts across every major platform. The feature went live on the web at rumble.com/shorts earlier this month. Google Play approved it for Android on February 13, and Apple signed off on iOS on February 25, making the feature available across platforms. The full stack is live: vertical videos of 90 seconds or less, swipeable, personalized, and tippable through Rumble Wallet.

Short-form video is the dominant format on the internet right now, and the dominant platform for it has spent the past year demonstrating exactly what centralized content control looks like at scale. TikTok’s censorship controversies, its terms of service rewrites, and the political pressure applied to its existence have pushed creators and viewers toward alternatives. Rumble is the most prominent video platform built around a different premise.

“Rumble Shorts provides a digestible viewing option, and of course, there’s free speech with every swipe,” said Rumble CEO Chris Pavlovski at launch. “With Rumble Shorts, we stay true to our mission of defending free expression, while we also boost creator discovery and offer even more opportunities for creators and channels to grow and get paid.”

The contrast with TikTok’s model is built into the product. On TikTok, the algorithm decides what you see, and the platform decides what creators are allowed to say. Rumble’s Shorts feed is personalized, too, but the platform’s pitch is that what fills it won’t be quietly throttled for political content, and that a creator who posts views inconvenient to governments or advertisers won’t find their videos mysteriously failing to reach anyone.

“Rumble Shorts delivers quick, easy-to-consume videos, with free speech built into every swipe,” Pavlovski said following the iOS approval.
“At Rumble, we are reinforcing our commitment to protecting open expression at the same time we’re helping creators get discovered, expand their audiences, and increase their earnings. Once again, Rumble is showing why it’s the top destination for content creators.”

The monetization angle is important here, too. Rumble Shorts integrates tipping through Rumble Wallet, a crypto-based payment system the company launched earlier this year. Creators on platforms that censor them don’t only lose reach; they lose income. A platform where speech and payment infrastructure are both outside the reach of a single app store or payment processor is a meaningfully different thing from YouTube or TikTok, where demonetization is a routine enforcement tool.

US Republican Lawmakers Demand Answers on UK’s iCloud Encryption Backdoor Order


Two senior Republican lawmakers are demanding answers from the British government about its secret order forcing Apple to break its own encryption. The UK has until March 11 to respond.

House Judiciary Committee Chairman Jim Jordan and Foreign Affairs Committee Chairman Brian Mast sent a joint letter on Wednesday to Home Secretary Shabana Mahmood, pressing for a formal briefing on the Technical Capability Notice (TCN) served on Apple under the UK’s Investigatory Powers Act. We obtained a copy of the letter. It’s the latest move in a surveillance fight that began over a year ago and has rattled the US-UK relationship at the highest levels.

In January 2025, UK security officials secretly ordered Apple to build a backdoor into iCloud that would allow them to decrypt any user’s data, anywhere in the world. Not just suspected criminals, not just UK citizens. Everyone. The order targeted Apple’s Advanced Data Protection (ADP) feature, the optional end-to-end encryption that ensures even Apple can’t read iCloud backups.

Apple’s response was to pull ADP from the UK market entirely in February 2025, stripping strong encryption options from roughly 35 million iPhone users rather than comply with a demand it couldn’t legally discuss. UK law makes it a criminal offense for companies to confirm or deny the existence of such orders, even to their own government. Apple couldn’t tell the US Department of Justice that the order existed. The DOJ couldn’t verify whether it complied with the CLOUD Act, the bilateral agreement governing how the two countries share access to digital evidence. That agreement explicitly states it “shall not create any obligation that providers be capable of decrypting data.” The UK’s order appears to do exactly that.

The reaction in Washington was bipartisan.
Senator Ron Wyden and Congressman Andy Biggs slammed the order as “effectively a foreign cyber attack waged through political means.” President Trump compared the UK’s conduct directly to China’s. Speaking to the Spectator after meeting Prime Minister Keir Starmer, Trump said: “We actually told [Starmer] . . . that’s incredible. That’s something, you know, that you hear about with China.” Director of National Intelligence Tulsi Gabbard called any attempt to compel Apple to create security weaknesses an “egregious violation” of privacy and confirmed legal and intelligence teams were assessing the implications.

The pressure worked, partially. Gabbard announced in August 2025 that Britain had backed down from its demand for access to data belonging to Americans. But British citizens are still at risk. A UK government spokesperson reaffirmed its commitment to cooperating with the US on security while also protecting civil liberties, stating: “We will always take all actions necessary at the domestic level to keep UK citizens safe.” The core demand reportedly remained on the table: after the original order triggered backlash in Washington, the British government reissued it with a domestic focus, seemingly to avoid international fallout. The legal infrastructure that produced the original order hasn’t gone anywhere.

In a telling sign of internal contradictions, the UK’s National Cyber Security Centre quietly removed its own guidance advising users to enable Advanced Data Protection on iOS devices, without any public explanation. At the same time, the Home Office was pressuring Apple to undermine that very same encryption.

Jordan and Mast first raised these concerns in May 2025 with then-Home Secretary Yvette Cooper. They received a response from Security Minister Dan Jarvis in June. The relevant details remained undisclosed.
Their letter to Mahmood states plainly that they have “serious concerns that the actions of the UK are weakening the security, privacy, and constitutional rights of American citizens.”

The UK’s Investigatory Powers Tribunal ruled in April 2025 that “bare details” of Apple’s legal challenge could be made public. Justices Rabinder Singh and Jeremy Charles Johnson stated “[w]e do not accept that the revelation of the bare details of the case would be damaging to the public interest or prejudicial to national security.” What that ruling confirmed: a legal demand for backdoor access existed. Everything beyond that remains classified.

The Committees are also citing the UK’s own Investigatory Powers Commissioner, who stated in his December 2025 annual report that “[w]e welcome the decision of the Tribunal to order that the bare facts of the case to be disclosed to the public as we consider it is vitally important for there to be a mature and informed public debate about lawful access capabilities.” Jordan and Mast quote it back directly at Mahmood: “For there to be a ‘mature and informed public debate,’ it is imperative that the Committees fully understand the actions taken by the UK government with respect to the TCN issued to Apple.”

Their request is straightforward: a briefing before March 11, and permission for Apple to disclose TCN details “to the U.S. Department of Justice to ensure compliance with our bilateral agreement with the UK under the CLOUD Act.” A year of asking has produced a letter from a junior minister and classified silence.

The Apple case is not an isolated incident. Under Starmer, arrests over social media posts have become common, police forces have expanded facial recognition with minimal oversight, and the government is advancing plans for a centralized digital identity system. The encryption fight is part of a pattern.
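The design at the center of the dispute can be made concrete. Below is a minimal sketch, using a deliberately toy stream cipher (not real cryptography, and not Apple’s actual ADP implementation), of why a provider that never holds the key cannot decrypt what it stores, and why a backdoor of the kind a TCN could demand amounts to key escrow:

```python
# Toy illustration (NOT real cryptography) of end-to-end encryption
# versus key escrow. In the E2E model, the key never leaves the
# client, so the provider only ever sees ciphertext.
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data with a SHA-256-derived keystream."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# End-to-end model: the client generates and keeps the key.
client_key = secrets.token_bytes(32)
backup = b"private iCloud backup contents"
ciphertext = keystream_xor(client_key, backup)  # this is all the server stores

# Decryption works only where the key lives: on the client.
assert keystream_xor(client_key, ciphertext) == backup

# A backdoor amounts to key escrow: the provider retains a copy of
# the key, so it, and anyone who compels or compromises it, can
# decrypt every user's data.
escrowed_key = client_key  # the design change such an order implies
assert keystream_xor(escrowed_key, ciphertext) == backup
```

The point is structural: once an escrowed key exists on the provider’s side, every user’s data is readable by whoever can compel or compromise the provider, which is why Apple withdrew ADP from the UK rather than redesign it.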

FTC Says Companies Can Collect Kids’ Personal Data, As Long As It’s Called “Age Verification”


The FTC just told companies they can collect children’s personal data without parental consent, as long as it’s for “age verification.” That’s the practical effect of a policy statement the agency issued this week.

Under COPPA, websites collecting data on kids under 13 generally need verifiable parental consent first. The FTC’s new statement carves out an exception: gather whatever personal information you need to verify someone’s age, and the Commission won’t come after you for it. The agency calls this child protection. The infrastructure it’s enabling looks different.

Christopher Mufarrige, director of the FTC’s Bureau of Consumer Protection, said “Age verification technologies are some of the most child-protective technologies to emerge in decades,” and framed the announcement as a tool for parents. What the statement actually does is green-light personal data collection from minors, on the theory that knowing someone’s age requires knowing who they are first.

The exemption is conditional. To avoid enforcement, sites must delete age verification data “promptly” after use, restrict third-party sharing to vendors with adequate security assurances, post clear notices about what they’re collecting, and use methods likely to produce “reasonably accurate” results. These requirements are unverifiable by the people whose data gets collected, and enforced by an agency that has just announced it won’t enforce them.

COPPA supposedly exists precisely because children’s personal data is sensitive and companies can’t be trusted to protect it without legal pressure. The FTC’s new exemption uses that same sensitive data as the price of admission for age verification, then steps back from enforcement. The agency is weakening the law’s protections in order to expand the infrastructure that the law was supposedly designed to regulate.
The breach record makes that contradiction concrete. Discord’s disclosure that roughly 70,000 users may have had government IDs exposed through a third-party vendor handling age appeals was a preview of what’s to come. Identity documents collected for age verification are high-value targets stored and processed in systems that, as Tea demonstrated months later, can be left completely unsecured. Collecting more of that data, from more platforms, under softer enforcement, doesn’t protect children. It just creates more of the same risk at a greater scale.

The Discord breach is instructive. The IDs weren’t collected as part of general surveillance. They were collected specifically for verification. The FTC’s new framework would permit exactly that arrangement while calling it a safeguard.

Policy statements describe how an agency plans to use its enforcement discretion; they are not law. The FTC acknowledged this implicitly by announcing it also intends to review the COPPA Rule itself to formally address age verification. The current statement stays in effect until that review concludes in revised regulations, or until the agency pulls it back.

What gets built in the meantime is the issue. Every site that deploys age verification creates a new pipeline: identity documents, biometric scans, or behavioral profiles flowing from platforms to third-party vendors, all of it operating under standards the FTC won’t actively police.