DeepLinks from the EFF

@deeplinks

EFF Awards Spotlight ✨ Software Freedom Law Center, India

In 1992 EFF presented our very first awards recognizing key leaders and organizations advancing innovation and championing civil liberties and human rights online. Now in 2025 we're continuing to celebrate the accomplishments of people working toward a better future for everyone with the EFF Awards!

All are invited to attend the EFF Awards on Wednesday, September 10 at the San Francisco Design Center. Whether you're an activist, an EFF supporter, a student interested in cyberlaw, or someone who wants to munch on a strolling dinner with other likeminded individuals, anyone can enjoy the ceremony!

REGISTER TODAY!

GENERAL ADMISSION: $55 | CURRENT EFF MEMBERS: $45 | STUDENTS: $35

If you're not able to make it, we'll also be hosting a livestream of the event on Friday, September 12 at 12:00 PM PT. The event will also be recorded, and posted to YouTube and the Internet Archive after the livestream.

We are honored to present the three winners of this year's EFF Awards: Just Futures Law, Erie Meyer, and Software Freedom Law Center, India. But, before we kick off the ceremony next week, let's take a closer look at each of the honorees. And last, but certainly not least—Software Freedom Law Center, India, winner of the EFF Award for Defending Digital Freedoms:

Software Freedom Law Center, India is a donor-supported legal services organization based in India that brings together lawyers, policy analysts, students, and technologists to protect freedom in the digital world. It promotes innovation and open access to knowledge by helping developers make great free and open-source software, protects privacy and civil liberties for Indians by educating and providing free legal advice, and helps policymakers make informed and just decisions about use of technology. SFLC.IN tracks and participates in litigation, AI regulations, and free speech issues that are defining Indian technology. It also tracks internet shutdowns and censorship incidents across India, provides digital security training, and has launched the Digital Defenders Network, a pan-Indian network of lawyers committed to protecting digital rights. It has conducted landmark litigation cases, petitioned the government of India on freedom of expression and internet issues, and campaigned for WhatsApp and Facebook to fix a feature of their platform that has been used to harass women in India.

We're excited to celebrate SFLC.IN and the other EFF Award winners in person in San Francisco on September 10! We hope that you'll join us there.

Thank you to Fastly, DuckDuckGo, Corellium, and No Starch Press for their year-round support of EFF's mission.

Want to show your team’s support for EFF? Sponsorships ensure we can continue hosting events like this to build community among digital rights supporters. Please visit eff.org/thanks or contact tierney@eff.org for more information on corporate giving and sponsorships.

EFF is dedicated to a harassment-free experience for everyone, and all participants are encouraged to view our full Event Expectations.

Questions? Email us at events@eff.org.

Age Verification Is A Windfall for Big Tech—And A Death Sentence For Smaller Platforms

If you live in Mississippi, you may have noticed that you are no longer able to log into your Bluesky or Dreamwidth accounts from within the state. That’s because, in a chilling early warning sign for the U.S., both social platforms decided to block all users in Mississippi from their services rather than risk hefty fines under the state’s oppressive age verification mandate.

If this sounds like censorship to you, you’re right—it is. But it’s not these small platforms’ fault. This is the unfortunate result of Mississippi’s wide-sweeping age verification law, H.B. 1126. Though the law had previously been blocked by a federal district court, the Supreme Court lifted that injunction last month, even as one justice (Kavanaugh) concluded that the law is “likely unconstitutional.” This allows H.B. 1126 to go into effect while the broader constitutional challenge works its way through the courts. EFF has opposed H.B. 1126 from the start, arguing consistently and constantly that it violates all internet users’ First Amendment rights, seriously risks our privacy, and forces platforms to implement invasive surveillance systems that ruin our anonymity.

Lawmakers often sell age-verification mandates as a silver bullet for Big Tech’s harms, but in practice, these laws do nothing to rein in the tech giants. Instead, they end up crushing smaller platforms that can’t absorb the exorbitant costs. Now that Mississippi’s mandate has gone into effect, the reality is clear: age verification laws entrench Big Tech’s dominance, while pushing smaller communities like Bluesky and Dreamwidth offline altogether.

Sorry Mississippians, We Can’t Afford You

Bluesky was the first platform to make the announcement. In a public blogpost, Bluesky condemned H.B. 1126’s broad scope, barriers to innovation, and privacy implications, explaining that the law forces platforms to “make every Mississippi Bluesky user hand over sensitive personal information and undergo age checks to access the site—or risk massive fines.” As Bluesky noted, “This dynamic entrenches existing big tech platforms while stifling the innovation and competition that benefits users.” Instead, Bluesky made the decision to cut off Mississippians entirely until the courts consider whether to overturn the law.

About a week later, we saw a similar announcement from Dreamwidth, an open-source online community similar to LiveJournal where users share creative writing, fanfiction, journals, and other works. In its post, Dreamwidth shared that it too would have to resort to blocking the IP addresses of all users in Mississippi because it could not afford the hefty fines.

Dreamwidth wrote: “Even a single $10,000 fine would be rough for us, but the per-user, per-incident nature of the actual fine structure is an existential threat.” The service also expressed fear that being involved in the lawsuit against Mississippi left it particularly vulnerable to retaliation—a clear illustration of the chilling effect of these laws. For Dreamwidth, blocking Mississippi users entirely was the only way to survive.
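Neither Bluesky nor Dreamwidth has published the code behind these blocks, but the mechanism Dreamwidth describes (refusing requests whose IP addresses geolocate to Mississippi) can be sketched in a few lines. The sketch below is a hypothetical illustration only, assuming a Flask application and MaxMind's geoip2 library with a local GeoLite2-City database; the database path, proxy-header handling, and fallback policy are all assumptions, not either platform's actual implementation.

# Hypothetical sketch of region-level IP blocking; not Bluesky's or Dreamwidth's real code.
# Assumes Flask plus the MaxMind geoip2 library and a local GeoLite2-City database file.
import geoip2.database
import geoip2.errors
from flask import Flask, abort, request

app = Flask(__name__)
reader = geoip2.database.Reader("GeoLite2-City.mmdb")  # assumed database path

BLOCKED_REGIONS = {("US", "MS")}  # (country code, subdivision code) for Mississippi

@app.before_request
def refuse_blocked_regions():
    # Trusting X-Forwarded-For is an assumption; it only makes sense behind a proxy you control.
    ip = request.headers.get("X-Forwarded-For", request.remote_addr or "").split(",")[0].strip()
    try:
        geo = reader.city(ip)
    except (geoip2.errors.AddressNotFoundError, ValueError):
        return  # unknown location: allow the request (a policy choice, not a requirement)
    region = (geo.country.iso_code, geo.subdivisions.most_specific.iso_code)
    if region in BLOCKED_REGIONS:
        abort(451)  # HTTP 451: Unavailable For Legal Reasons

@app.route("/")
def index():
    return "Service available outside blocked regions."

Even this toy version hints at why the burden falls hardest on small teams: a real block needs an up-to-date geolocation database, proxy-aware IP handling, and a policy for users whose location cannot be determined, all just to avoid ruinous fines.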
Age Verification Mandates Don’t Rein In Big Tech—They Entrench It

Proponents of age verification claim that these mandates will hold Big Tech companies accountable for their outsized influence, but really the opposite is true. As we can see from Mississippi, age verification mandates concentrate and consolidate power in the hands of the largest companies—the only entities with the resources to build costly compliance systems and absorb potentially massive fines.

While megacorporations like Google (with YouTube) and Meta (with Instagram) are already experimenting with creepy new age-estimation tech on their social platforms, smaller sites like Bluesky and Dreamwidth simply cannot afford the risks.

We’ve already seen how this plays out in the UK. When the Online Safety Act came into force recently, platforms like Reddit, YouTube, and Spotify implemented broad (and extremely clunky) age verification measures while smaller sites, including forums on parenting, green living, and gaming on Linux, were forced to shutter. Take, for example, the Hamster Forum, “home of all things hamstery,” which announced in March 2025 that the OSA would force it to shut down its community message boards. Instead, users were directed to migrate over to Instagram with this wistful disclaimer: “It will not be the same by any means, but . . . We can follow each other and message on there and see each others [sic] individual posts and share our hammy photos and updates still.”

This perfectly illustrates the market impact of online age verification laws. When smaller platforms inevitably cave under the financial pressure of these mandates, users will be pushed back to the social media giants. These huge companies—those that can afford expensive age verification systems and aren’t afraid of a few $10,000 fines while they figure out compliance—will end up getting more business, more traffic, and more power to censor users and violate their privacy.

This consolidation of power is a dream come true for the Big Tech platforms, but it’s a nightmare for users. While the megacorporations get more traffic and a whole lot more user data (read: profit), users are left with far fewer community options and a bland, corporate surveillance machine instead of a vibrant public sphere. The internet we all fell in love with is a diverse and colorful place, full of innovation, connection, and unique opportunities for self-expression. That internet—our internet—is worth defending.

TAKE ACTION: Don’t let Congress censor the internet

EFF Joins 55 Civil Society Organizations Urging the End of Sanctions on UN Special Rapporteur Francesca Albanese

Following the U.S. government's overreaching decision to impose sanctions against Francesca Albanese, the United Nations Special Rapporteur on the situation of human rights in the Palestinian territories occupied since 1967, EFF joined more than 50 civil society organizations in calling for the U.S. to lift the sanctions.

The U.S.’s sanctions on Francesca Albanese were formally issued in July 2025, pursuant to Section 1(a)(ii)(A) of President Trump’s Executive Order 14203, which the U.S. imposed on the International Criminal Court (ICC) in February for having “engaged in illegitimate and baseless actions targeting America and our close ally Israel.” Under this Executive Order, the State Department is instructed to name specific people who have worked with or for the ICC. Rapporteur Albanese joins several ICC judges and the lead prosecutor in having their U.S. property and interests in property blocked, as well as facing restrictions on entering the country, banking, and more.

One of the reasons cited in the far-reaching U.S. sanction is Albanese’s engagement with the ICC to investigate or prosecute nationals of the U.S. and Israel. The sanction came just days after the publication of the Special Rapporteur’s recent report to the UN Human Rights Council, “From economy of occupation to economy of genocide.” In her report, the Special Rapporteur “urges the International Criminal Court and national judiciaries to investigate and prosecute corporate executives and/or corporate entities for their part in the commission of international crimes and laundering of the proceeds from those crimes.”

As a UN Special Rapporteur, Albanese’s role is to conduct independent research, gather information, and prepare reports on human rights situations, including documenting violations and providing recommendations to the Human Rights Council and other human rights bodies. Special Rapporteurs are independent experts chosen by the UN Human Rights Council in Geneva. They do not represent the UN or hold any formal authority, but their reports and findings are essential for advocacy in transnational situations, informing prosecutors at the International Criminal Court, or pressuring countries over human rights abuses.

The unilateral sanctions imposed on the UN Special Rapporteur not only target her as an individual but also threaten the broader international human rights framework, undermining crucial work in monitoring and reporting on human rights issues. Such measures risk politicizing Special Rapporteurs’ mandates, discouraging frank reporting, and creating a chilling effect on human rights defenders more broadly. With the 80th session of the UN General Assembly opening in New York this September, these sanctions and travel restrictions further impinge on the Special Rapporteur’s capacity to fulfill her mandate and report on human rights abuses in Palestine.

The Special Rapporteur’s report identifies how AI, cloud services, biometric surveillance, and predictive policing technologies have reinforced military operations, population control, and the unlawful targeting of civilians in the ongoing genocide in Gaza. More specifically, it illuminates the role of U.S. tech giants like Microsoft, Alphabet (Google’s parent company), Amazon, and IBM in providing dual-use infrastructure to “integrate mass data collection and surveillance, while profiting from the unique testing ground for military technology offered by the occupied Palestinian territory.”

This report is well within her legal mandate to investigate and report on human rights issues in Palestine and to provide critical oversight and accountability for human rights abuses. This work is particularly essential at a time when the very survival of Palestinians in the occupied Gaza Strip is at stake—journalists are being killed with deplorable frequency; internet shutdowns and biased censorship by social media platforms are preventing vital information from circulating within and leaving Gaza; and U.S.-based tech companies continue to be opaque about their role in providing technologies to the Israeli authorities for use in the ongoing genocide against Palestinians, despite the mounting evidence.

EFF has repeatedly called for greater transparency relating to the role of Big Tech companies like Google, Amazon, and Microsoft in human rights abuses across Gaza and the West Bank, with these U.S.-based companies coming under pressure to reveal more about the services they provide and the nature of their relationships with the Israeli forces engaging in the military response. Without greater transparency, the public cannot tell whether these companies are complying with human rights standards—both those set by the United Nations and those they have publicly set for themselves. We know that this conflict has resulted in alleged war crimes and has involved massive, ongoing surveillance of civilians and refugees living under what international law recognizes as an illegal occupation. That kind of surveillance requires significant technical support, and it seems unlikely that it could occur without any ongoing involvement by the companies providing the platforms.

Top UN human rights officials have called for the reversal of the sanctions against the Special Rapporteur, voicing serious concerns about the dangerous precedent this sets in undermining human rights. The UN High Commissioner for Human Rights, Volker Türk, called for a prompt reversal of the sanctions and noted that, “even in the face of fierce disagreement, UN member states should engage substantively and constructively, rather than resort to punitive measures.” Similarly, UN Spokesperson Stéphane Dujarric noted that whilst Member States “are perfectly entitled to their views and to disagree with” experts’ reports, they should still “engage with the UN’s human rights architecture.” In a press conference, Albanese said she believed that the sanctions were calculated to weaken her mission, and questioned why they had even been introduced: “for having exposed a genocide? For having denounced the system? They never challenged me on the facts.”

The United States must reverse these sanctions and respect human rights for all—not just for the people it considers worthy of having them. Read our full civil society letter here.

California Lawmakers: Support S.B. 524 to Rein in AI Written Police Reports

EFF urges California state lawmakers to pass S.B. 524, authored by Sen. Jesse Arreguín. This bill is an important first step in regaining control over police using generative AI to write their narrative police reports.

This bill does several important things: It mandates that police reports written by AI include disclaimers on every page or within the body of the text that make it clear that the report was written in part or in total by a computer. It also says that any reports written by AI must retain their first draft. That way, it should be easier for defense attorneys, judges, police supervisors, or any other auditing entity to see which portions of the final report were written by AI and which parts were written by the officer. Further, the bill requires officers to sign and verify that they read the report and that its facts are correct. And it bans AI vendors from selling or sharing the information a police agency provided to the AI.

These common-sense, first-step reforms are important: watchdogs are struggling to figure out where and how AI is being used in a police context. In fact, a popular AI police report writing tool, Axon’s Draft One, would be out of compliance with this bill, which would require Axon to redesign the tool to make it more transparent.

Draft One takes audio from an officer’s body-worn camera and uses AI to turn that dialogue into a narrative police report. Because independent researchers have been unable to test it, there are important questions about how the system handles things like sarcasm, out-of-context comments, or interactions with members of the public who speak languages other than English. Another major concern is Draft One’s inability to keep track of which parts of a report were written by people and which parts were written by AI. By design, the product does not retain different iterations of the draft—making it easy for an officer to say, “I didn’t lie in my police report, the AI wrote that part.”

All lawmakers should pass regulations of AI-written police reports. This technology could be nearly everywhere, and soon. Axon is a top supplier of body-worn cameras in the United States, which means it has a massive ready-made customer base. Through the bundling of products, AI-written police reports could reach a vast percentage of police departments.

AI-written police reports are unproven in terms of their accuracy and their overall effects on the criminal justice system. Vendors still have a long way to go to prove this technology can be transparent and auditable. While it would not solve all of the many problems of AI encroaching on the criminal justice system, S.B. 524 is a good first step to rein in an unaccountable piece of technology.

We urge California lawmakers to pass S.B. 524.
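To make the bill's retention and attestation requirements more concrete, here is a minimal sketch of what a compliant record format could look like. This is purely illustrative: the class names, fields, and disclaimer text are hypothetical, not language from S.B. 524 and not any vendor's actual schema.

# Illustrative sketch only: one way to model the kinds of requirements S.B. 524 describes
# (AI disclaimer, retained first draft, officer attestation). All names are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

AI_DISCLAIMER = "NOTICE: Portions of this report were generated by artificial intelligence."

@dataclass(frozen=True)
class AIDraft:
    text: str            # the verbatim first draft produced by the AI, kept unchanged
    model_name: str      # which system generated it
    created_at: datetime

@dataclass
class PoliceReport:
    incident_id: str
    ai_first_draft: AIDraft              # retained so auditors can compare draft vs. final
    final_text: str = ""
    attested_by: Optional[str] = None    # officer who verified the report's facts
    attested_at: Optional[datetime] = None

    def attest(self, officer_name: str, final_text: str) -> None:
        # The officer's edits land in final_text; the AI first draft is never overwritten,
        # so a defense attorney or supervisor can diff the two versions later.
        self.final_text = final_text
        self.attested_by = officer_name
        self.attested_at = datetime.now(timezone.utc)

    def render(self) -> str:
        # The disclaimer travels with every rendered copy of the report.
        return f"{AI_DISCLAIMER}\n\n{self.final_text}"

The point of the sketch is the audit trail: because the AI first draft is stored immutably alongside the officer's attested final text, a reviewer can see exactly what the software wrote and what the officer changed, which is the kind of transparency the bill is after.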

EFF Awards Spotlight ✨ Erie Meyer

In 1992 EFF presented our very first awards recognizing key leaders and organizations advancing innovation and championing civil liberties and human rights online. Now in 2025 we're continuing to celebrate the accomplishments of people working toward a better future for everyone with the EFF Awards!

All are invited to attend the EFF Awards on Wednesday, September 10 at the San Francisco Design Center. Whether you're an activist, an EFF supporter, a student interested in cyberlaw, or someone who wants to munch on a strolling dinner with other likeminded individuals, anyone can enjoy the ceremony!

REGISTER TODAY!

GENERAL ADMISSION: $55 | CURRENT EFF MEMBERS: $45 | STUDENTS: $35

If you're not able to make it, we'll also be hosting a livestream of the event on Friday, September 12 at 12:00 PM PT. The event will also be recorded, and posted to YouTube and the Internet Archive after the livestream.

We are honored to present the three winners of this year's EFF Awards: Just Futures Law, Erie Meyer, and Software Freedom Law Center, India. But, before we kick off the ceremony next week, let's take a closer look at each of the honorees. This time—Erie Meyer, winner of the EFF Award for Protecting Americans' Data:

Erie Meyer is a Senior Fellow at the Vanderbilt Policy Accelerator where she focuses on the intersection of technology, artificial intelligence, and regulation, and a Senior Fellow at the Georgetown Law Institute for Technology Law & Policy. Since January 20, Meyer has helped organize former government technologists to stand up for the privacy and integrity of governmental systems that hold Americans’ data. In addition to organizing others, she filed a declaration in federal court in February warning that 12 years of critical records could be irretrievably lost in the CFPB’s purge by the Trump Administration’s Department of Government Efficiency. In April, she filed a declaration in another case warning about using private-sector AI on government information. That same month, she testified to the House Oversight Subcommittee on Cybersecurity, Information Technology, and Government Innovation that DOGE is centralizing access to some of the most sensitive data the government holds—Social Security records, disability claims, even data tied to national security—without a clear plan or proper oversight, warning that “DOGE is burning the house down and calling it a renovation.”

We're excited to celebrate Erie Meyer and the other EFF Award winners in person in San Francisco on September 10! We hope that you'll join us there.

Thank you to Fastly, DuckDuckGo, Corellium, and No Starch Press for their year-round support of EFF's mission.

Want to show your team’s support for EFF? Sponsorships ensure we can continue hosting events like this to build community among digital rights supporters. Please visit eff.org/thanks or contact tierney@eff.org for more information on corporate giving and sponsorships.

EFF is dedicated to a harassment-free experience for everyone, and all participants are encouraged to view our full Event Expectations.

Questions? Email us at events@eff.org.