DeepLinks from the EFF

@deeplinks

Data Brokers are Selling Your Flight Information to CBP and ICE

For many years, data brokers have existed in the shadows, exploiting gaps in privacy laws to harvest our information—all for their own profit. They sell our precise movements without our knowledge or meaningful consent to a variety of private and state actors, including law enforcement agencies. And they show no sign of stopping. This incentivizes other bad actors. If companies collect any kind of personal data and want to make a quick buck, there’s a data broker willing to buy it and sell it to the highest bidder, often law enforcement and intelligence agencies.

One recent investigation by 404 Media revealed that the Airlines Reporting Corporation (ARC), a data broker owned and operated by at least eight major U.S. airlines, including United Airlines and American Airlines, collected travelers’ domestic flight records and secretly sold access to U.S. Customs and Border Protection (CBP). Despite selling passengers’ names, full flight itineraries, and financial details, the data broker prevented U.S. border forces from revealing it as the origin of the information. So, not only is the government doing an end run around the Fourth Amendment to get information for which it would otherwise need a warrant—it has also been trying to hide how it knows these things about us.

ARC’s Travel Intelligence Program (TIP) aggregates passenger data and contains more than one billion records spanning 39 months of past and future travel by both U.S. and non-U.S. citizens. CBP, which sits within the U.S. Department of Homeland Security (DHS), claims it needs this data to support local and state police keeping track of people of interest. But at a time of growing concern about increased immigration enforcement at U.S. ports of entry, including unjustified searches, law enforcement officials will use this additional surveillance tool to expand the web of suspicion to even larger numbers of innocent travelers.
More than 200 airlines settle tickets through ARC, which holds information on more than 54% of flights taken globally. ARC’s board of directors includes representatives from U.S. airlines like JetBlue and Delta, as well as international airlines like Lufthansa, Air France, and Air Canada. In selling law enforcement agencies bulk access to such sensitive information, these airlines—through their data broker—are putting their own profits over travelers' privacy. U.S. Immigration and Customs Enforcement (ICE) recently detailed its own purchase of personal data from ARC. In the current climate, this can have a detrimental impact on people’s lives.

Movement unrestricted by governments is a hallmark of a free society. In our current moment, when the federal government is threatening legal consequences based on people’s national, religious, and political affiliations, having air travel in and out of the United States tracked by any ARC customer is a recipe for state retribution.

Sadly, data brokers are doing even broader harm to our privacy. Sensitive location data is harvested from smartphones and sold to cops, internet backbone data is sold to federal counterintelligence agencies, and utility databases containing phone, water, and electricity records are shared with ICE officers.

At a time when immigration authorities are eroding fundamental freedoms through increased—and arbitrary—actions at the U.S. border, this news further exacerbates concerns that creeping authoritarianism can be fueled by the extraction of our most personal data—all without our knowledge or consent. The new revelations about ARC’s data sales to CBP and ICE are a fresh reminder of the need for “privacy first” legislation that imposes consent and minimization limits on corporate processing of our data. We also need to pass the Fourth Amendment Is Not For Sale Act to stop police from bypassing judicial review of their data seizures by purchasing data from brokers. And let’s enforce data broker registration laws.

Electronic Frontier Foundation to Present Annual EFF Awards to Just Futures Law, Erie Meyer, and Software Freedom Law Center, India

2025 Awards Will Be Presented in a Live Ceremony Wednesday, Sept. 10 in San Francisco

SAN FRANCISCO—The Electronic Frontier Foundation (EFF) is honored to announce that Just Futures Law, Erie Meyer, and Software Freedom Law Center, India will receive the 2025 EFF Awards for their vital work in ensuring that technology supports privacy, freedom, justice, and innovation for all people.

The EFF Awards recognize specific and substantial technical, social, economic, or cultural contributions in diverse fields including journalism, art, digital access, legislation, tech development, and law.

The EFF Awards ceremony will start at 6 p.m. PT on Wednesday, Sept. 10, 2025 at the San Francisco Design Center Galleria, 101 Henry Adams St. in San Francisco. Guests can register at http://www.eff.org/effawards. The ceremony will be recorded and shared online on Sept. 12.

For the past 30 years, the EFF Awards—previously known as the Pioneer Awards—have recognized and honored key leaders in the fight for freedom and innovation online. Started when the internet was new, the Awards now reflect the fact that the online world has become both a necessity in modern life and a continually evolving set of tools for communication, organizing, creativity, and increasing human potential.

“Whether fighting the technological abuses that abet criminalization, detention, and deportation of immigrants and people of color, or working and speaking out fearlessly to protect Americans’ data privacy, or standing up for digital rights in the world’s most populous country, all of our 2025 Awards winners contribute to creating a brighter tech future for humankind,” EFF Executive Director Cindy Cohn said.
“We hope that this recognition will bring even more support for each of these vital efforts.”

Just Futures Law: Leading Immigration and Surveillance Litigation

Just Futures Law is a women-of-color-led law project that recognizes how surveillance disproportionately impacts immigrants and people of color in the United States. It uses litigation to fight back as part of defending and building the power of immigrant rights and criminal justice activists, organizers, and community groups to prevent criminalization, detention, and deportation of immigrants and people of color. Just Futures was founded in 2019 using a movement lawyering and racial justice framework and seeks to transform how litigation and legal support serve communities and build movement power.

In the past year, Just Futures sued the Department of Homeland Security and its subagencies seeking a court order to compel the agencies to release records on their use of AI and other algorithms, and sued the Trump Administration for prematurely halting Haiti’s Temporary Protected Status, a humanitarian program that allows hundreds of thousands of Haitians to temporarily remain and work in the United States due to Haiti’s current conditions of extraordinary crisis. It has represented activists in their fight against tech giants like Clearview AI, it has worked with Mijente to launch the TakeBackTech fellowship to train new advocates on grassroots-directed research, and it has worked with Grassroots Leadership to fight for the release of detained individuals under Operation Lone Star.

Erie Meyer: Protecting Americans' Privacy

Erie Meyer is a Senior Fellow at the Vanderbilt Policy Accelerator, where she focuses on the intersection of technology, artificial intelligence, and regulation, and a Senior Fellow at the Georgetown Law Institute for Technology Law & Policy.
She is the former Chief Technologist at both the Consumer Financial Protection Bureau (CFPB) and the Federal Trade Commission. Earlier, she was senior advisor to the U.S. Chief Technology Officer at the White House, where she co-founded the United States Digital Service, a team of technologists and designers working to improve digital services for the public. Meyer also worked as senior director at Code for America, a nonprofit that promotes civic hacking to modernize government services, and in the Ohio Attorney General's office at the height of the financial crisis.

Since January 20, Meyer has helped organize former government technologists to stand up for the privacy and integrity of governmental systems that hold Americans’ data. In addition to organizing others, she filed a declaration in federal court in February warning that 12 years of critical records could be irretrievably lost in the CFPB’s purge by the Trump Administration’s Department of Government Efficiency. In April, she filed a declaration in another case warning about using private-sector AI on government information. That same month, she testified to the House Oversight Subcommittee on Cybersecurity, Information Technology, and Government Innovation that DOGE is centralizing access to some of the most sensitive data the government holds—Social Security records, disability claims, even data tied to national security—without a clear plan or proper oversight, warning that “DOGE is burning the house down and calling it a renovation.”

Software Freedom Law Center, India: Defending Digital Freedoms

Software Freedom Law Center, India is a donor-supported legal services organization based in India that brings together lawyers, policy analysts, students, and technologists to protect freedom in the digital world.
It promotes innovation and open access to knowledge by helping developers make great free and open-source software, protects privacy and civil liberties for Indians by educating and providing free legal advice, and helps policymakers make informed and just decisions about the use of technology.

Founded in 2010 by technology lawyer and online civil liberties activist Mishi Choudhary, SFLC.IN tracks and participates in litigation, AI regulation, and free speech issues that are defining Indian technology. It also tracks internet shutdowns and censorship incidents across India, provides digital security training, and has launched the Digital Defenders Network, a pan-Indian network of lawyers committed to protecting digital rights. It has conducted landmark litigation, petitioned the government of India on freedom of expression and internet issues, and campaigned for WhatsApp and Facebook to fix a feature of their platforms that has been used to harass women in India.

To register for this event: http://www.eff.org/effawards

For past honorees: https://www.eff.org/awards/past-winners

EFF to US Court of Appeals: Protect Taxpayer Privacy

EFF has filed an amicus brief in Trabajadores v. Bessent, a case concerning the Internal Revenue Service (IRS) sharing protected personal tax information with the Department of Homeland Security for the purposes of immigration enforcement. Our expertise in privacy and data sharing makes us the ideal organization to step in and inform the judge: government actions like this have real-world consequences. The IRS’s sharing, and especially bulk sharing, of data is improper and makes taxpayers vulnerable to inevitable mistakes. As a practical matter, sharing data that the IRS had previously claimed was protected undermines the trust that important civil institutions require in order to be effective.

You can read the entire brief here.

The brief makes two particular arguments. The first is that if the Tax Reform Act, the statute in which the IRS found the authority to share the data, is considered ambiguous, then it should be interpreted in light of its legislative intent and historical background, which disfavor disclosure. The brief reads:

Given the historical context, and decades of subsequent agency promises to protect taxpayer confidentiality and taxpayer reliance on those promises, the Administration’s abrupt decision to re-interpret §6103 to allow sharing with ICE whenever a potential “criminal proceeding” can be posited, is a textbook example of an arbitrary and capricious action even if the statute can be read to be ambiguous.

The other argument we make to the court is that data scientists agree: when you try to corroborate information between two databases in which records are only partially identifiable, mistakes happen. We argue:

Those errors result from such mundane issues as outdated information, data entry errors, and taxpayers’ or tax preparers’ submission of incorrect names or addresses.
If public reports are correct, and officials intend to share information regarding 700,000 or even 7 million taxpayers, the errors will multiply, leading to mistaken targeting, detention, and deportation of, and potentially even physical harm to, ordinary taxpayers.

Information silos in the government exist for a reason. Here, the silo was designed to protect individual privacy and prevent the executive abuse that can come with unfettered access to properly collected information. The concern motivating Congress to pass the Tax Reform Act was the same as that behind the Privacy Act of 1974 and the 1978 Right to Financial Privacy Act. These laws were part of a wave of reforms Congress considered necessary to address the misuse of tax data to spy on and harass political opponents, dissidents, civil rights activists, and anti-war protestors in the 1960s and early 1970s. Congress saw the need to ensure that data collected for one purpose is used only for that purpose, with very narrow exceptions, or else it is prone to abuse. Yet the IRS is currently sharing information to allow ICE to enforce immigration law.

Taxation in the United States operates through a very simple agreement: the government requires taxes from people working inside the United States in order to function. To get people to pay their taxes, including undocumented immigrants living and working in the United States, the IRS has previously promised that the data it collects will not be used against a person for punitive reasons. This encourages people to pay taxes and alleviates concerns that might otherwise keep them from interacting with the government. But the IRS’s reversal has greatly harmed that trust and has the potential for far-reaching negative ramifications, including decreasing future tax revenue.
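The matching-error problem described above is easy to demonstrate. The sketch below (a hypothetical illustration with invented data and field names, not any agency's actual pipeline) joins two datasets on partial identifiers, last name plus ZIP code, the way a naive cross-database match might, and returns a false positive alongside the true match.

```python
def match_records(tax_rows, target_rows):
    """Naive join of two datasets on partial identifiers (last name + ZIP)."""
    index = {}
    for row in tax_rows:
        key = (row["last"].lower(), row["zip"])
        index.setdefault(key, []).append(row)
    matches = []
    for target in target_rows:
        key = (target["last"].lower(), target["zip"])
        for candidate in index.get(key, []):
            matches.append((target["name"], candidate["name"]))
    return matches

# Two different taxpayers happen to share a surname and a ZIP code.
taxpayers = [
    {"name": "Ana Garcia", "last": "Garcia", "zip": "78501"},
    {"name": "Luis Garcia", "last": "Garcia", "zip": "78501"},
]
targets = [{"name": "Ana Garcia", "last": "Garcia", "zip": "78501"}]

print(match_records(taxpayers, targets))
# One lookup yields two hits: the true match and a false positive.
```

Real matching pipelines use more fields, but outdated addresses and data-entry errors degrade those fields too, which is exactly the failure mode the brief describes.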
Consolidating government information so that the agencies responsible for healthcare, taxes, or financial support are linked to agencies that police, surveil, and fine people is a recipe for disaster. For that reason, EFF is proud to submit this amicus brief in Trabajadores v. Bessent in support of taxpayer privacy.  Related Cases: American Federation of Government Employees v. U.S. Office of Personnel Management

How to Build on Washington’s “My Health, My Data” Act

In 2023, the State of Washington enacted one of the strongest consumer data privacy laws in recent years: the “my health my data” act (HB 1155). EFF commends the civil rights, data privacy, and reproductive justice advocates who worked to pass this law. This post suggests ways for legislators and advocates in other states to build on the Washington law and draft one with even stronger protections. It will separately address the law’s scope (such as who is protected); its safeguards (such as consent and minimization); and its enforcement (such as a private right of action). While the law only applies to one category of personal data – our health information – its structure could be used to protect all manner of data.

Scope of Protection

Authors of every consumer data privacy law must make three decisions about scope: What kind of data is protected? Whose data is protected? And who is regulated?

The Washington law protects “consumer health data,” defined as information linkable to a consumer that identifies their “physical or mental health status.” This includes all manner of conditions and treatments, such as gender-affirming and reproductive care. While EFF’s ultimate goal is protection of all types of personal information, bills that protect at least some types can be a great start.

The Washington law protects “consumers,” defined as all natural persons who reside in the state or had their health data collected there. It is best, as here, to protect all people. If a data privacy law protects just some people, that can incentivize a regulated entity to collect even more data, in order to distinguish protected from unprotected people. Notably, Washington’s definition of “consumers” applies only in “an individual or household context,” but not “an employment context”; thus, Washingtonians will need a different health privacy law to protect them from their snooping bosses.
The Washington law defines a “regulated entity” as “any legal entity” that both: “conducts business” in the state or targets residents for products or services; and “determines the purpose and means” of processing consumer health data. This appears to include many non-profit groups, which is good, because such groups can harmfully process a lot of personal data. The law excludes government from regulation, which is not unusual for data privacy bills focused on non-governmental actors. State and local government will likely need to be regulated by another data privacy law. Unfortunately, the Washington law also excludes “contracted service providers when processing data on behalf of government.” A data broker or other surveillance-oriented business should not be free from regulation just because it is working for the police.

Consent or Minimization to Collect or Share Health Data

The most important part of Washington’s law requires either consent or minimization for a regulated entity to collect or share a consumer’s health data.

The law has a strong definition of “consent.” It must be “a clear affirmative act that signifies a consumer’s freely given, specific, informed, opt-in, voluntary, and unambiguous agreement.” Consent cannot be obtained with “broad terms of use” or “deceptive design.”

Absent consent, a regulated entity cannot collect or share a consumer’s health data except as necessary to provide a good or service that the consumer requested. Such rules are often called “data minimization.” Their virtue is that a consumer does not need to do anything to enjoy their statutory privacy rights; the burden is on the regulated entity to process less data.

As to data “sale,” the Washington law requires enhanced consent (which the law calls “valid authorization”). Sale is the most dangerous form of sharing, because it incentivizes businesses to collect the most possible data in hopes of later selling it.
For this reason, some laws flatly ban the sale of sensitive data, like the Illinois biometric information privacy act (BIPA).

For context, there are four ways for a bill or law to configure consent and/or minimization. Some require just consent, like BIPA’s provisions on data collection. Others require just minimization, like the federal “my body my data” bill. Still others require both, like the Massachusetts location data privacy bill. And some require either one or the other. In various times and places, EFF has supported all four configurations. “Either/or” is weakest, because it allows regulated entities to choose whether to minimize or to seek consent – a choice they will make based on their profit and not our privacy.

Two Protections of Location Data Privacy

Data brokers harvest our location information and sell it to anyone who will pay, including advertisers, police, and other adversaries. Legislators are stepping forward to address this threat. The Washington law does so in two ways.

First, the “consumer health data” protected by the consent-or-minimization rule is defined to include “precise location information that could reasonably indicate a consumer’s attempt to acquire or receive health services or supplies.” In turn, “precise location” is defined as within 1,750 feet of a person.

Second, the Washington law bans a “geofence” around an “in-person health care service,” if “used” for one of three forbidden purposes (to track consumers, to collect their data, or to send them messages or ads). A “geofence” is defined as technology that uses GPS or the like “to establish a virtual boundary” of 2,000 feet around the perimeter of a physical location.

This is a good start. It is also much better than weaker rules that only apply to the immediate vicinity of sensitive locations.
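To make the geofence concept concrete, here is a minimal sketch (with hypothetical coordinates; real geofencing stacks are far more elaborate) of the kind of radius test such technology performs: compute the distance from a device to a facility and compare it against a 2,000-foot boundary.

```python
import math

EARTH_RADIUS_FT = 20_902_231  # Earth's mean radius, in feet

def distance_ft(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in feet between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_FT * math.asin(math.sqrt(a))

def inside_geofence(point, center, radius_ft=2000):
    """True if the point falls within the virtual boundary around the center."""
    return distance_ft(*point, *center) <= radius_ft

# Hypothetical facility and device positions.
clinic = (47.6062, -122.3321)
nearby_device = (47.6090, -122.3321)   # roughly 1,000 ft north
distant_device = (47.6200, -122.3321)  # roughly 5,000 ft north

print(inside_geofence(nearby_device, clinic))   # True
print(inside_geofence(distant_device, clinic))  # False
```

Note that the law targets how such a boundary is used (tracking, data collection, ad targeting), not the geometry itself; the same arithmetic serves both a benign store-hours notification and a surveillance dragnet.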
Such rules allow adversaries to use location data to track us as we move toward sensitive locations, observe us enter the small no-data bubble around those locations, and infer what we may have done there. On the other hand, Washington’s rules apply to sizeable areas. Also, its consent-or-minimization rule applies to all locations that could indicate pursuit of health care (not just health facilities). And its geofence rule forbids use of location data to track people.

Still, the better approach, as in several recent bills, is to simply protect all location data. Protecting just one kind of sensitive location, like houses of worship, will leave out others, like courthouses. More fundamentally, all locations are sensitive, given the risk that others will use our location data to determine where – and with whom – we live, work, and socialize.

More Data Privacy Protections

Other safeguards in the Washington law deserve attention from legislators in other states:

- Regulated entities must publish a privacy policy that discloses, for example, the categories of data collected and shared, and the purposes of collection.
- Regulated entities must not collect, use, or share additional categories of data, or process them for additional purposes, without consent.
- Regulated entities must provide consumers the rights to access and delete their data.
- Regulated entities must restrict data access to just those employees who need it, and maintain industry-standard data security.

Enforcement

A law is only as strong as its teeth. The best way to ensure enforcement is to empower people to sue regulated entities that violate their privacy; this is often called a “private right of action.” The Washington law provides that its violation is “an unfair or deceptive act” under the state’s separate consumer protection act. That law, in turn, bans unfair or deceptive acts in the conduct of trade or commerce.
Upon a violation of the ban, that law provides a civil action to “any person who is injured in [their] business or property,” with the remedies of injunction, actual damages, treble damages up to $25,000, and legal fees and costs. It remains to be seen how Washington’s courts will apply this older civil action to the new “my health my data” act. Washington legislators are demonstrating that privacy is important to public policy, but it would be cleaner for the statute to name the injury explicitly: invasion of the fundamental human right to data privacy.

Sadly, there is a nationwide debate about whether injury to data privacy, by itself, should be enough to go to court, without also proving a more tangible injury like identity theft. The best legislative models ensure full access to the courts in two ways. First, they provide: “A violation of this law regarding an individual’s data constitutes an injury to that individual, and any individual alleging a violation of this law may bring a civil action.” Second, they provide a baseline amount of damages (often called “liquidated” or “statutory” damages), because it is often difficult to prove actual damages arising from a data privacy injury.

Finally, data privacy laws must protect people from “pay for privacy” schemes, where a business charges a higher price or delivers an inferior product if a consumer exercises their statutory data privacy rights. Such schemes will lead to a society of privacy “haves” and “have nots.” The Washington law has two helpful provisions.
First, a regulated entity “may not unlawfully discriminate against a consumer for exercising any rights included in this chapter.” Second, there can be no data sale without a “statement” from the regulated entity to the consumer that “the provision of goods or services may not be conditioned on the consumer signing the valid authorization.” Some privacy bills contain more-specific language, for example along these lines: “a regulated entity cannot take an adverse action against a consumer (such as refusal to provide a good or service, charging a higher price, or providing a lower quality) because the consumer exercised their data privacy rights, unless the data at issue is essential to the good or service they requested and then only to the extent the data is essential.”

What About Congress?

We still desperately need a comprehensive federal consumer data privacy law built on “privacy first” principles. In the meantime, states are taking the lead. The very worst thing Congress could do now is preempt states from protecting their residents’ data privacy. Advocates and legislators from across the country, seeking to take up this mantle, would benefit from looking at – and building on – Washington’s “my health my data” law.

Meta's Secret Spying Scheme | EFFector 37.7

Keeping up on the latest digital rights news has never been easier. With a new look, EFF's EFFector newsletter covers the latest details on our work defending your rights to privacy and free expression online.

EFFector 37.7 covers some of the very sneaky tactics that Meta has been using to track you online, and how you can mitigate some of this tracking. In this issue, we're also explaining the legal processes police use to obtain your private online data, and providing an update on the NO FAKES Act—a U.S. Senate bill that takes a flawed approach to concerns about AI-generated "replicas."

And, in case you missed it in the previous newsletter, we're debuting a new audio companion to EFFector as well! This time, Lena Cohen breaks down the ways that Meta tracks you online and what you—and lawmakers—can do to prevent that tracking. You can listen now on YouTube or the Internet Archive.

Since 1990 EFF has published EFFector to help keep readers on the bleeding edge of their digital rights. We know that the intersection of technology, civil liberties, human rights, and the law can be complicated, so EFFector is a great way to stay on top of things. The newsletter is chock full of links to updates, announcements, blog posts, and other stories to help keep readers—and listeners—up to date on the movement to protect online privacy and free expression.

Thank you to the supporters around the world who make our work possible! If you're not a member yet, join EFF today to help us fight for a brighter digital future.