EFF: Updates

EFF's Deeplinks Blog: Noteworthy news from around the internet

Site Blocking Laws Will Always Be a Bad Idea: 2025 in Review

Tue, 12/30/2025 - 4:46pm

This year, we fought back against the return of a terrible idea that hasn’t improved with age: site blocking laws. 

More than a decade ago, Congress tried to pass SOPA and PIPA—two sweeping bills that would have allowed the government and copyright holders to quickly shut down entire websites based on allegations of piracy. The backlash was massive. Internet users, free speech advocates, and tech companies flooded lawmakers with protests, culminating in an “Internet Blackout” on January 18, 2012. Turns out, Americans don’t like government-run internet blacklists. The bills were ultimately shelved.  

But we’ve never believed they were gone for good. The major media and entertainment companies that backed site blocking in the US in 2012 turned to pushing for site-blocking laws in other countries. Rightsholders continued to ask US courts for site-blocking orders, often winning them without a new law. And sure enough, the Motion Picture Association (MPA) and its allies have asked Congress to try again. 

There were no fewer than three Congressional drafts of site-blocking legislation. Representative Zoe Lofgren kicked off the year with the Foreign Anti-Digital Piracy Act (FADPA). Fellow House member Darrell Issa also claimed to be working on a bill that would make it offensively easy for a studio to block your access to a website based solely on the belief that infringement is happening. Not to be left out, the Senate Judiciary Committee produced the terribly named Block BEARD Act.

None of these three attempts to fundamentally alter the way you experience the internet moved far beyond their press releases. But their number tells us that there is, once again, an appetite among major media conglomerates and politicians to resurrect SOPA/PIPA from the dead.

None of these proposals fixes the flaws of SOPA/PIPA, and none ever could. Site blocking is a flawed idea and a disaster for free expression that no amount of rewriting will fix. There is no way to create a fast lane for removing your access to a website that is not a major threat to the open web. Just as we opposed SOPA/PIPA over ten years ago, we oppose these efforts.  

Site blocking bills seek to build a new infrastructure of censorship into the heart of the internet. They would enable court orders directed to the organizations that make the internet work, like internet service providers, domain name resolvers, and reverse proxy services, compelling them to help block US internet users from visiting websites accused of copyright infringement. The technical means haven’t changed much since 2012: they involve blocking the Internet Protocol addresses or domain names of websites. These methods are blunt—sledgehammers rather than scalpels. Today, many websites are hosted on cloud infrastructure or use shared IP addresses. Blocking one target can mean blocking thousands of unrelated sites. That kind of digital collateral damage has already happened in Austria, Italy, South Korea, France, and in the US, to name just a few.  
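The collateral-damage problem can be sketched in a few lines. This toy example uses entirely made-up domains and reserved example IP addresses; it only illustrates the shape of the problem, not any real blocking order:

```python
# Toy illustration of IP-level blocking on shared hosting (hypothetical data).
# Many unrelated sites can sit behind the same cloud IP address, so an order
# that blocks one "infringing" IP also takes down every co-hosted site.
shared_hosting = {
    "203.0.113.7": ["target-site.example", "recipe-blog.example",
                    "nonprofit.example", "small-shop.example"],
    "198.51.100.9": ["news-site.example"],
}

def block_ip(ip, hosting_map):
    """Return every domain made unreachable by blocking a single IP."""
    return set(hosting_map.get(ip, []))

# The order names one site; three unrelated sites go dark with it.
collateral = block_ip("203.0.113.7", shared_hosting) - {"target-site.example"}
```

At real cloud scale, the ratio is far worse: a single shared address or reverse proxy can front thousands of unrelated sites.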

Given this downside, one would think the benefits of copyright enforcement from these bills ought to be significant. But site blocking is trivially easy to evade. Determined site owners can create the same content on a new domain within hours. Users who want to see blocked content can fire up a VPN or change a single DNS setting to get back online.  
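The evasion point is just as mechanical. In this toy sketch (hypothetical domain names and addresses), the "block" exists only in one resolver's answers, so pointing a device at a different resolver restores the lookup:

```python
# Toy sketch of resolver-level DNS blocking (hypothetical names/addresses).
# A blocking order applies to the ISP's resolver; the name is simply absent
# from its table. Any other resolver still answers normally.
ISP_RESOLVER = {"other-site.example": "192.0.2.10"}
PUBLIC_RESOLVER = {"blocked-site.example": "192.0.2.99",
                   "other-site.example": "192.0.2.10"}

def lookup(domain, resolver):
    """Return the IP a resolver answers with, or None if it refuses."""
    return resolver.get(domain)

via_isp = lookup("blocked-site.example", ISP_RESOLVER)        # None: blocked
via_public = lookup("blocked-site.example", PUBLIC_RESOLVER)  # still resolves
```

Changing the resolver setting is a one-line configuration edit on most devices, which is why DNS blocking stops only the least determined users.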

The limits that lawmakers have proposed to put on these laws are an illusion. While ostensibly aimed at “foreign” websites, they sweep in any website that doesn’t conspicuously display a US origin, putting anonymity at risk. And despite the rhetoric of MPA and others that new laws would be used only by responsible companies against the largest criminal syndicates, laws don’t work that way. Massive new censorship powers invite abuse by opportunists large and small, and the costs to the economy, security, and free expression are widely borne. 

It’s time for Big Media and its friends in Congress to drop this flawed idea. But as long as they keep bringing it up, we’ll keep on rallying internet users of all stripes to fight it. 

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.

EFF's Investigations Expose Flock Safety's Surveillance Abuses: 2025 in Review

Tue, 12/30/2025 - 2:03pm

Throughout 2025, EFF conducted groundbreaking investigations into Flock Safety's automated license plate reader (ALPR) network, revealing a system designed to enable mass surveillance and susceptible to grave abuses. Our research sparked state and federal investigations, drove landmark litigation, and exposed dangerous expansion into always-listening voice detection technology. We documented how Flock's surveillance infrastructure allowed law enforcement to track protesters exercising their First Amendment rights, target Romani people with discriminatory searches, and surveil women seeking reproductive healthcare.

Flock Enables Surveillance of Protesters

When we obtained datasets representing more than 12 million searches logged by more than 3,900 agencies between December 2024 and October 2025, the patterns were unmistakable. Agencies logged hundreds of searches related to political demonstrations—the 50501 protests in February, Hands Off protests in April, and No Kings protests in June and October. Nineteen agencies conducted dozens of searches specifically tied to No Kings protests alone. Sometimes searches explicitly referenced protest activity; other times, agencies used vague terminology to obscure surveillance of constitutionally protected speech.
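The analysis behind these findings amounts to a keyword pass over the "reason" field agencies log with each search. This sketch uses fabricated log entries, and the field names are assumptions rather than Flock's actual export format:

```python
# Toy sketch of surfacing protest-related lookups in an ALPR audit log.
# Entries are fabricated; field names ("agency", "reason") are assumptions.
audit_log = [
    {"agency": "Dept A", "reason": "no kings protest vehicle"},
    {"agency": "Dept B", "reason": "stolen vehicle"},
    {"agency": "Dept C", "reason": "50501 demo follow-up"},
]
PROTEST_TERMS = ("protest", "no kings", "50501", "hands off")

def protest_related(entries):
    """Keep entries whose stated reason matches any protest-related term."""
    return [e for e in entries
            if any(term in e["reason"].lower() for term in PROTEST_TERMS)]

hits = protest_related(audit_log)  # two of the three entries match
```

The same pass misses vague entries like "investigation," which is exactly how agencies can obscure surveillance of protected speech in their logs.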

The surveillance extended beyond mass demonstrations. Three agencies used Flock's system to target activists from Direct Action Everywhere, an animal-rights organization using civil disobedience to expose factory farm conditions. Delaware State Police queried the Flock network nine times in March 2025 related to Direct Action Everywhere actions—showing how ALPR surveillance targets groups engaged in activism challenging powerful industries.

Biased Policing and Discriminatory Searches

Our November analysis revealed deeply troubling patterns: more than 80 law enforcement agencies used language perpetuating harmful stereotypes against Romani people when searching the nationwide Flock Safety ALPR network. Between June 2024 and October 2025, police performed hundreds of searches using terms such as "roma" and racial slurs—often without mentioning any suspected crime.

Audit logs revealed searches including "roma traveler," "possible g*psy," and "g*psy ruse." Grand Prairie Police Department in Texas searched for the slur six times while using Flock's "Convoy" feature, which identifies vehicles traveling together—essentially targeting an entire traveling community without specifying any crime. According to a 2020 Harvard University survey, four out of 10 Romani Americans reported being subjected to racial profiling by police. Flock's system makes such discrimination faster and easier to execute at scale.

Weaponizing Surveillance Against Reproductive Rights

In October, we obtained documents showing that Texas deputies queried Flock Safety's surveillance data in what police characterized as a missing person investigation, but was actually an abortion case. Deputies initiated a "death investigation" of a "non-viable fetus," logged evidence of a woman's self-managed abortion, and consulted prosecutors about possible charges.

A Johnson County official ran two searches with the note "had an abortion, search for female." The second search probed 6,809 networks, accessing 83,345 cameras across nearly the entire country. This case revealed Flock's fundamental danger: a single query accesses more than 83,000 cameras spanning almost the entire nation, with minimal oversight and maximum potential for abuse—particularly when weaponized against people seeking reproductive healthcare.

Feature Updates Miss the Point

In June, EFF explained why Flock Safety's announced feature updates cannot make ALPRs safe. The company promised privacy-enhancing features like geofencing and retention limits in response to public pressure. But these tweaks don't address the core problem: Flock's business model depends on building a nationwide, interconnected surveillance network that creates risks no software update can eliminate. Our 2025 investigations proved that abuses stem from the architecture itself, not just how individual agencies use the technology.

Accountability and Community Action

EFF's work sparked significant accountability measures. U.S. Rep. Raja Krishnamoorthi and Rep. Robert Garcia launched a formal investigation into Flock's role in "enabling invasive surveillance practices that threaten the privacy, safety, and civil liberties of women, immigrants, and other vulnerable Americans."

Illinois Secretary of State Alexi Giannoulias launched an audit after EFF research showed Flock allowed U.S. Customs and Border Protection to access Illinois data in violation of state privacy laws. In November, EFF partnered with the ACLU of Northern California to file a lawsuit against San Jose and its police department, challenging warrantless searches of millions of ALPR records. Between June 5, 2024 and June 17, 2025, SJPD and other California law enforcement agencies searched San Jose's database 3,965,519 times—a staggering figure illustrating the vast scope of warrantless surveillance enabled by Flock's infrastructure.

Our investigations also fueled municipal resistance to Flock Safety. Communities from Austin to Evanston to Eugene successfully canceled or refused to renew their Flock contracts after organizing campaigns centered on our research documenting discriminatory policing, immigration enforcement, threats to reproductive rights, and chilling effects on protest. These victories demonstrate that communities—armed with evidence of Flock's harms—can challenge and reject surveillance infrastructure that threatens civil liberties.

Dangerous New Capabilities: Always-Listening Microphones

In October 2025, Flock announced plans to expand its gunshot detection microphones to listen for "human distress" including screaming. This dangerous expansion transforms audio sensors into powerful surveillance tools monitoring human voices on city streets. High-powered microphones above densely populated areas raise serious questions about wiretapping laws, false alerts, and potential for dangerous police responses to non-emergencies. After EFF exposed this feature, Flock quietly amended its marketing materials to remove explicit references to "screaming"—replacing them with vaguer language about "distress" detection—while continuing to develop and deploy the technology.

Looking Forward

Flock Safety's surveillance infrastructure is not a neutral public safety tool. It's a system that enables and amplifies racist policing, threatens reproductive rights, and chills constitutionally protected speech. Our 2025 investigations proved it beyond doubt. As we head into 2026, EFF will continue exposing these abuses, supporting communities fighting back, and litigating for the constitutional protections that surveillance technology has stripped away.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.

Fighting Renewed Attempts to Make ISPs Copyright Cops: 2025 in Review

Tue, 12/30/2025 - 12:39pm

You might not know it, given the many headlines focused on new questions about copyright and Generative AI, but the year’s biggest copyright case concerned an old-for-the-internet question: do ISPs have to be copyright cops? After years of litigation, that question is now squarely before the Supreme Court. And if the Supreme Court doesn’t reverse a lower court’s ruling, ISPs could be forced to terminate people’s internet access based on nothing more than mere accusations of copyright infringement. This would threaten innocent users who rely on broadband for essential aspects of daily life.

The Stakes: Turning ISPs into Copyright Police

This issue turns on what courts call “secondary liability,” which is the legal idea that someone can be held responsible not for what they did directly, but for what someone else did using their product or service. The case began when music companies sued Cox Communications, arguing that the ISP should be held liable for copyright infringement committed by some of its subscribers. The Court of Appeals for the Fourth Circuit agreed, adopting a “material contribution” standard for contributory copyright liability (a rule for when service providers can be held liable for the actions of users). Under that standard, providing a service that could be used for infringement is enough to create liability when a customer infringes.

The Fourth Circuit’s rule would have devastating consequences for the public. Given copyright law’s draconian penalties, ISPs would be under enormous pressure to terminate accounts whenever they get an infringement notice, whether or not the actual accountholder has infringed anything, cutting off entire households, schools, libraries, or businesses that share an internet connection. Those at risk include:

  • Public libraries, which provide internet access to millions of Americans who lack it at home, could lose essential service.
  • Universities, hospitals, and local governments could see internet access for whole communities disrupted.
  • Households—especially in low-income and communities of color, which disproportionately share broadband connections with other people—would face collective punishment for the alleged actions of a single user.

And with more than a third of Americans having only one or no broadband provider, many users would have no way to reconnect.

EFF—along with the American Library Association, the Association of Research Libraries, and Re:Create—filed an amicus brief urging the Court to reverse the Fourth Circuit’s decision, taking guidance from patent law. In the Patent Act, where Congress has explicitly defined secondary liability, there’s a different test: contributory infringement exists only where a product is incapable of substantial non-infringing use. Internet access, of course, is overwhelmingly used for lawful purposes, making it the very definition of a “staple article of commerce” that can’t give rise to liability under the patent framework.

The Supreme Court held a hearing in the case on December 1, and a majority of the justices seemed troubled by the implications of the Fourth Circuit’s ruling. One exchange was particularly telling: asked what should happen when the notices of infringement target a university account upon which thousands of people rely, Sony’s counsel suggested the university could resolve the issue by essentially slowing internet speeds so infringement might be less appealing. It’s hard to imagine the university community would agree that research, teaching, artmaking, library services, and the myriad other activities that rely on internet access should be throttled because of the actions of a few students. Hopefully the Supreme Court won’t either.

We expect a ruling in the case in the next few months. Fingers crossed that the Court rejects the Fourth Circuit’s draconian rule.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.

Operations Security (OPSEC) Trainings: 2025 in Review

Mon, 12/29/2025 - 11:34am

It's no secret that digital surveillance and other tech-enabled oppressions are acute dangers for liberation movement workers. The rising tides of tech-fueled authoritarianism and hyper-surveillance are universal themes across the various threat models we consider. EFF's Surveillance Self-Defense project is a vital antidote to these threats, but it's not all we do to help others address these concerns. Our team often fields questions, requests for security trainings and presentations on our research, and asks for general OPSEC advising (operations security: the process of applying digital privacy and information security strategies to an existing workflow or process). This year stood out for the sheer number and urgency of the requests we fielded. 

Combining efforts across our Public Interest Technology and Activism teams, we consulted with an estimated 66 groups and organizations, with at least 2,000 participants attending those sessions. These engagements typically look like OPSEC advising and training, usually merging aspects of threat modeling, cybersecurity 101, secure communications practices, doxxing self-defense, and more. The groups we work with are often focused on issue-spaces that are particularly embattled at the current moment, such as abortion access, advocacy for transgender rights, and climate justice. 

Our ability to offer realistic and community-focused OPSEC advice for these liberation movement workers is something we take great pride in. These groups are often under-resourced and unable to afford typical infosec consulting. Even if they could, traditional information security firms are designed to protect corporate infrastructure, not grassroots activism. Offering this assistance also allows us to stress-test the advice given in the aforementioned Surveillance Self-Defense project with real-world experience and update it when necessary. What we learn from these sessions also informs our blog posts, such as this piece on strategies for overcoming tech-enabled violence for transgender people, and this one surveying the landscape of digital threats in the abortion access movement post-Roe.

There is still much to be done. Maintaining effective privacy and security within one's work is an ongoing process. We are grateful to be included in the OPSEC process planning for so many other human-rights defenders and activists, and we look forward to continuing this work in the coming years. 

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.

EFF in the Press: 2025 in Review

Mon, 12/29/2025 - 11:34am

EFF’s attorneys, activists, and technologists don’t just do the hard, endless work of defending our digital civil liberties — they also spend a lot of time and effort explaining that work to the public via media interviews. 

EFF had thousands of media mentions in 2025, from the smallest hyperlocal outlets to international news behemoths. Our work on street-level surveillance — the technology that police use to spy on our communities — generated a great deal of press attention, particularly regarding automated license plate readers (ALPRs). But we also got a lot of ink and airtime for our three lawsuits against the federal government: one challenging the U.S. Office of Personnel Management's illegal data sharing, a second challenging the State Department's unconstitutional "catch and revoke" program, and the third demanding that the departments of State and Justice reveal what pressure they put on app stores to remove ICE-tracking apps.

Other hot media topics included how travelers can protect themselves against searches of their devices, how protestors can protect themselves from surveillance, and the misguided age-verification laws that are proliferating across the nation and around the world, which are an attack on privacy and free expression.

On national television, Matthew Guariglia spoke with NBC Nightly News to discuss how more and more police agencies are using private doorbell cameras to surveil neighborhoods. Tori Noble spoke with ABC’s Good Morning America about the dangers of digital price tags, as well as with ABC News Live Prime about privacy concerns over OpenAI’s new web browser.


In a sampling of mainstream national media, EFF was cited 33 times by the Washington Post, 16 times by CNN, 13 times by USA Today, 12 times by the Associated Press, 11 times by NBC News, 11 times by the New York Times, 10 times by Reuters, and eight times by National Public Radio. Among tech and legal media, EFF was cited 74 times by Privacy Daily, 35 times by The Verge, 32 times by 404 Media, 32 times by The Register, 26 times by Ars Technica, 25 times by WIRED, 21 times by Law360, 21 times by TechCrunch, 20 times by Gizmodo, and 14 times by Bloomberg Law.

Abroad, EFF was cited in coverage by media outlets in nations including Australia, Bangladesh, Belgium, Canada, Colombia, El Salvador, France, Germany, India, Ireland, New Zealand, Palestine, the Philippines, Slovakia, South Africa, Spain, Trinidad and Tobago, the United Arab Emirates, and the United Kingdom. 

EFF staffers spoke to the masses in their own words via op-eds such as: 

And we ruled the airwaves on podcasts including: 

We're grateful to all the intrepid journalists who keep doing the hard work of reporting accurately on tech and privacy policy, and we encourage them to keep reaching out to us at press@eff.org.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.

Drone as First Responder Programs: 2025 in Review

Mon, 12/29/2025 - 11:33am

Drone as first responder (DFR) adoption really took off in 2025. Though the concept has been around since 2018, this year saw more normalization of the technology, its integration into more real-time crime center structures, and the implementation of automated deployment of drones.

A DFR program features a fleet of camera-equipped drones, which can range from just a couple to dozens or more. These are deployed from a launch pad in response to 911 calls and other calls for service, sometimes operated by a drone pilot or, increasingly, autonomously directed to the call location. The appeal is the promise of increased “situational awareness” for officers headed to a call. This video offers a short explanation of DFR, and for a list of all of the cities we know use drones, including DFR programs, check out EFF’s Atlas of Surveillance.

Major Moves from the FAA and Forthcoming Federal Issues

In order to deploy a drone beyond where it can be seen, operators need to receive a waiver from the Federal Aviation Administration (FAA), and all DFR programs require this. Police departments and technology vendors have complained that the process takes too long, and in May, the FAA finalized reworked requirements, leading to a flood of waiver requests. An FAA spokesperson reported that in the first two months of the new waiver process, it had approved 410 such waivers, already accounting for almost a third of the approximately 1,400 DFR waivers that had ever been granted.

The federal government made other major moves on the drone front this year. A month after the new waivers went into effect, President Trump issued an Executive Order with aspirations for advancing the country’s drone industry. And at the end of the year, DJI, one of the largest drone manufacturers in the world and one of the biggest purveyors of law enforcement drones, will be banned from launching new products in the U.S. unless the federal government conducts a security audit mandated by the National Defense Authorization Act. However, at the moment, it doesn’t seem like that audit will happen, and if it doesn’t, it won’t be surprising to see other drone manufacturers leveraging the ban to boost their own products. 

Automated Drone Deployment and Tech Integrations

Early iterations of drone use required a human operator, but this year, police drone companies began releasing automated flying machines that don’t require much human intervention at all. New models can rely on AI and automated directions to launch and direct a drone. 


This was the year we saw DFR integrated with other tools as tech companies teamed up to bring even more powerful surveillance. Flock Safety added automated license plate readers (ALPR) to their drones. Axon and Skydio built on the partnership they launched in 2024. Drone manufacturer Brinc teamed up with Motorola Solutions on a DFR program. Drone company Paladin teamed up with a company called SkyeBrowse to add 3-D mapping of the environment to their list of features. 

DFR also is increasingly part of the police plans for real-time crime centers, meaning that the footage being captured by these flying cameras is being integrated into other streams and analyzed in ways that we’re still learning about. 

Transparency Around DFR Deployments

Transparency around adoption, use, and oversight is always crucial, particularly when it comes to police surveillance, and EFF has been tracking the growth of DFR programs across the country. We encourage you to use your local public records laws to investigate them further. Examples of the kinds of requests and the responsive documents people have already received — including flight logs, policies, and other information — can be found on MuckRock.

The Problem with Drones

Flying cameras are bad enough. From their elevated vantage point, they can capture video of your home, your backyard, and your movements, which should require clear policies around retention, audits, and use, including when the cameras shouldn’t be recording. We’re also seeing that additional camera analytics and physical add-ons (so-called “payloads”), like thermal cameras and even tear gas, can make drones even more powerful, and that police technology companies are encouraging DFR as part of larger surveillance packages.

It's important that next year we all advocate for, and enforce, standards for adopting and using these DFR programs. Check the Atlas to see if they are used where you live and learn more about drones and other surveillance tools on EFF’s Street-Level Surveillance Hub.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.

EFFector Audio Speaks Up for Our Rights: 2025 Year in Review

Sun, 12/28/2025 - 5:57pm

This year, you may have heard EFF sounding off about our civil liberties on NPR, BBC Radio, or any number of podcasts. But we also started sharing our voices directly with listeners in 2025. In June, we revamped EFFector, our long-running electronic newsletter, and launched a new audio edition to accompany it.

Providing a recap of the week's most important digital rights news, EFFector's audio companion features exclusive interviews where EFF's lawyers, activists, and technologists can dig deeper into the biggest stories in privacy, free speech, and innovation. Here are just some of the best interviews from EFFector Audio in 2025.

Unpacking a Social Media Spying Scheme

Earlier this year, the Trump administration launched a sprawling surveillance program to spy on the social media activity of millions of noncitizens—and punish those who express views it doesn't like. This fall, EFF's Lisa Femia came onto EFFector Audio to explain how this scheme works, its impact on free speech, and, importantly, why EFF is suing to stop it.

"We think all of this is coming together as a way to chill people's speech and make it so they do not feel comfortable expressing core political viewpoints protected by the First Amendment," Femia said.


Challenging the Mass Surveillance of Drivers

But Lisa was hardly the only guest talking about surveillance. In November, EFF's Andrew Crocker spoke to EFFector about Automated License Plate Readers (ALPRs), a particularly invasive and widespread form of surveillance. ALPR camera networks take pictures of every passing vehicle and upload the location information of millions of drivers into central databases. Police can then search these databases—typically without any judicial approval—to instantly reconstruct driver movements over weeks, months, or even years at a time.

"It really is going to be a very detailed picture of your habits over the course of a long period of time," said Crocker, explaining how ALPR location data can reveal where you work, worship, and many other intimate details about your life. Crocker also talked about a new lawsuit, filed by two nonprofits represented by EFF and the ACLU of Northern California, challenging the city of San Jose's use of ALPR searches without a warrant.
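Crocker's point about reconstructing habits is a straightforward database query. This sketch uses invented reads and assumed field names to show how a single plate lookup becomes an ordered location history:

```python
# Toy sketch (invented reads, assumed field names): how one plate query
# against a central ALPR database reconstructs a driver's movements.
from datetime import datetime

reads = [
    {"plate": "ABC123", "camera": "clinic_lot", "ts": "2025-06-02T09:15"},
    {"plate": "XYZ999", "camera": "highway_12", "ts": "2025-06-01T08:00"},
    {"plate": "ABC123", "camera": "church_st",  "ts": "2025-06-01T10:30"},
]

def movements(plate, db):
    """All reads for one plate, ordered in time: a ready-made travel log."""
    hits = [r for r in db if r["plate"] == plate]
    return sorted(hits, key=lambda r: datetime.fromisoformat(r["ts"]))

timeline = [r["camera"] for r in movements("ABC123", reads)]
```

With millions of reads retained for months or years, that same one-line query yields exactly the "very detailed picture of your habits" described above, with no warrant standing in between.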

Similarly, EFF's Mario Trujillo joined EFFector in early November to discuss the legal issues and mass surveillance risks around face recognition in consumer devices.

Simple Tips to Take Control of Your Privacy

Online privacy isn’t dead. But tech giants have tried to make protecting it as annoying as possible. To help users take back control, we celebrated Opt Out October, sharing daily privacy tips all month long on our blog. In addition to laying down some privacy basics, EFF's Thorin Klosowski talked to EFFector about how small steps to protect your data can build up into big differences.

"This is a way to kind of break it down into small tasks that you can do every day and accomplish a lot," said Klosowski. "By the end of it, you will have taken back a considerable amount of your privacy."

User privacy was the focus of a number of EFFector interviews. In July, EFF's Lena Cohen spoke about what lawmakers, tech companies, and individuals can do to fight online tracking. That same month, Matthew Guariglia talked about precautions consumers can take before bringing surveillance devices like smart doorbells into their homes.

Digging Into the Next Wave of Internet Censorship

One of the most troubling trends of 2025 was the proliferation of age verification laws, which require online services to check, estimate, or verify users’ ages. Though these mandates claim to protect children, they ultimately create harmful censorship and surveillance regimes that put everyone—adults and young people alike—at risk.

This summer, EFF's Rin Alajaji came onto EFFector Audio to explain how these laws work and why we need to speak out against them.

"Every person listening here can push back against these laws that expand censorship," she said. "We like to say that if you care about internet freedom, this fight is yours."

This was just one of several interviews about free speech online. This year, EFFector also hosted Paige Collings to talk about the chaotic rollout of the UK's Online Safety Act and Lisa Femia (again!) to discuss the abortion censorship crisis on social media.

You can hear all these episodes and future installments of EFFector's audio companion on YouTube or the Internet Archive. Or check out our revamped EFFector newsletter by subscribing at eff.org/effector!

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.

Procurement Power—When Cities Realized They Can Just Say No: 2025 in Review

Sun, 12/28/2025 - 2:22pm

In 2025, elected officials across the country began treating surveillance technology purchases differently: not as inevitable administrative procurements handled by police departments, but as political decisions subject to council oversight and constituent pressure. This shift proved to be the most effective anti-surveillance strategy of the year.

Since February, at least 23 jurisdictions fully ended, cancelled, or rejected Flock Safety ALPR programs (including Austin, Oak Park, Evanston, Hays County, San Marcos, Eugene, Springfield, and Denver) by recognizing surveillance procurement as political power, not administrative routine.

Legacy Practices & Obfuscation

For decades, cities have been caught in what researchers call "legacy procurement practices": administrative norms that prioritize "efficiency" and "cost thresholds" over democratic review. 

Vendors exploit this inertia through the "pilot loophole." As Taraaz and the Collaborative Research Center for Resilience (CRCR) note in a recent report, "no-cost offers" and free trials allow police departments to bypass formal procurement channels entirely. By the time the bill comes due, the surveillance is already normalized in the community, turning a purchase decision into a "continuation of service" that is politically difficult to stop.

This bureaucracy obscures the power that surveillance vendors have over municipal procurement decisions. As Arti Walker-Peddakotla details, this is a deliberate strategy: vendors secure "acquiescence" by hiding the political nature of surveillance behind administrative veils, framing tools as "force multipliers" and burying contracts in consent agendas. For local electeds, the pressure to "outsource" government decision-making makes vendor marketing compelling. Vendors use "cooperative purchasing" agreements to bypass competitive bidding, effectively privatizing the policy-making process.

The result is a dangerous "information asymmetry" where cities become dependent on vendors for critical data governance decisions. The 2025 cancellations finally broke that dynamic.

The Procurement Moment

This year, cities stopped accepting this "administrative" frame. The shift came from three converging forces: audit findings that exposed Flock's lack of safeguards, growing community organizing pressure, and elected officials finally recognizing that saying "no" to a renewal was not just an option—it was the responsible choice.

When Austin let its Flock pilot expire on July 1, the decision reflected a political judgment: constituents rejected a nationwide network used for immigration enforcement. It wasn't a debate about retention rates; it was a refusal to renew.

These cancellations were also acts of fiscal stewardship. By demanding evidence of efficacy (and receiving none), officials in Hays County, Texas and San Marcos, Texas rejected the "force multiplier" myth. They treated the refusal of unproven technology not just as activism, but as a basic fiduciary duty. In Oak Park, Illinois, trustees cancelled eight cameras after an audit found Flock lacked safeguards, while Evanston terminated its 19-camera network shortly after. Eugene and Springfield, Oregon terminated a combined 82 cameras in December. City electeds have also realized that every renewal is a vote for "vendor lock-in." As EPIC warns, once proprietary systems are entrenched, cities lose ownership of their own public safety data, making it nearly impossible to switch providers or enforce transparency later.

The shift was not universal. Denver illustrated the tension when Mayor Mike Johnston overrode a unanimous council rejection to extend Flock's contract. Council Member Sarah Parady rightly identified this as "mass surveillance" imposed "with no public process." This is exactly why procurement must be reclaimed: when treated as technical, surveillance vendors control the conversation; when recognized as political, constituents gain leverage.

Cities Hold the Line Against Mass Surveillance

EFF has spent years documenting how procurement functions as a lever for surveillance expansion, from our work documenting Flock Safety's troubling data-sharing practices with ICE and federal law enforcement to our broader advocacy on surveillance technology procurement reform. The 2025 victories show that when cities understand procurement as political rather than technical, they can say no. Procurement power can be the most direct route to stopping mass surveillance. 

As cities move into 2026, the lesson is clear: surveillance is a choice, not a mandate, and your community has the power to refuse it. The question isn't whether technology can police more effectively; it's whether your community wants to be policed this way. That decision belongs to constituents, not vendors.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.

Defending Encryption in the U.S. and Abroad: 2025 in Review

Sun, 12/28/2025 - 12:24pm

Defending encryption has long been a bedrock of our work. Without encryption, it's impossible to have private conversations or private data storage. This year, we’ve seen attacks on these rights from all around the world. 

Europe Goes All in On Breaking Encryption, Mostly Fails (For Now)

The European Union Council has repeatedly tried to pass a controversial message scanning proposal, known as “Chat Control,” that would require secure messaging providers to scan the contents of messages. Every time this has come up since it was first introduced in 2022, it got batted down—because no matter how you slice it, client-side scanning breaks end-to-end encryption. The Danish presidency seemed poised to succeed in passing Chat Control this year, but strong pushback from across the EU caused them to reconsider and rework their stance. In its current state, Chat Control isn’t perfect, but it at least includes strong language to protect encryption, which is good news for users. 

Meanwhile, France tried to pass its own encryption-breaking legislation. Unlike Chat Control, which pushed for client-side scanning, France took a different approach: allowing so-called “ghost participants,” where law enforcement could silently join encrypted chats. Thankfully, the French National Assembly did the right thing and rejected this dangerous proposal.

It wasn’t all wins, though.

Perhaps the most concerning encryption issue is still ongoing in the United Kingdom, where the British government reportedly ordered Apple to backdoor its optional end-to-end encryption in iCloud. In response, Apple disabled one of its strongest security features, Advanced Data Protection, for U.K. users. After some back and forth with the U.S., the U.K. allegedly rewrote the demand to clarify that it applies only to British users. That doesn’t make it any better. Tribunal hearings are planned for 2026, and we’ll continue to monitor developments.

Speaking of developments to keep an eye on, the European Commission released its “Technology Roadmap on Encryption,” which discusses new ways for law enforcement to access encrypted data. There’s a lot that could happen with this roadmap, but let’s be clear here: EU officials should scrap any roadmap focused on encryption circumvention and instead invest in stronger, more widespread use of end-to-end encryption.

U.S. Attempts Fall Flat

The U.S. had its share of battles, too. The Senate re-introduced the STOP CSAM Act, which threatened to compromise encryption by requiring encrypted communication providers to have knowledge about what sorts of content their services are being used to send. The bill allows encrypted services to raise a legal defense—but only after they’ve been sued. That's not good enough. STOP CSAM would force encryption providers to defend against costly lawsuits over content they can't see or control. And a jury could still consider the use of encryption to be evidence of wrongdoing. 

In Florida, a bill ostensibly about minors' social media use also just so happened to demand a backdoor into encryption services—already an incredible overreach. It went further, attempting to ban disappearing messages and grant parents unrestricted access to their kids’ messages as well. Thankfully, the Florida Legislature ended without passing it.

It is unlikely these sorts of attempts to undermine encryption will suddenly stop. But whatever comes next, EFF will continue to stand up for everyone's right to use encryption to have secure and private online communications. 

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.

EFF’s ‘How to Fix the Internet’ Podcast: 2025 in Review

Wed, 12/24/2025 - 11:45am

2025 was a stellar year for EFF’s award-winning podcast, “How to Fix the Internet,” as our sixth season focused on the tools and technology of freedom. 

It seems like everywhere we turn we see dystopian stories about technology’s impact on our lives and our futures—from tracking-based surveillance capitalism, to street-level government surveillance, to the dominance of a few large platforms choking innovation, to the growing efforts by authoritarian governments to control what we see and say—the landscape can feel bleak. Exposing and articulating these problems is important, but so is envisioning and then building solutions. That’s where our podcast comes in.

EFF's How to Fix the Internet podcast offers a better way forward. Through curious conversations with some of the leading minds in law and technology, EFF Executive Director Cindy Cohn and Activism Director Jason Kelley explore creative solutions to some of today’s biggest tech challenges. Our sixth season, which ran from May through September, featured: 

  • “Digital Autonomy for Bodily Autonomy” – We all leave digital trails as we navigate the internet—records of what we searched for, what we bought, who we talked to, where we went or want to go in the real world—and those trails usually are owned by the big corporations behind the platforms we use. But what if we valued our digital autonomy the way that we do our bodily autonomy? Digital Defense Fund Director Kate Bertash joined Cindy and Jason to discuss how creativity and community can align to center people in the digital world and make us freer both online and offline. 
  • “Love the Internet Before You Hate On It” – There’s a weird belief out there that tech critics hate technology. But do movie critics hate movies? Do food critics hate food? No! The most effective, insightful critics do what they do because they love something so deeply that they want to see it made even better. Molly White—a researcher, software engineer, and writer who focuses on the cryptocurrency industry, blockchains, web3, and other tech—joined Cindy and Jason to discuss working toward a human-centered internet that gives everyone a sense of control and interaction; open to all in the way that Wikipedia was (and still is) for her and so many others: not just as a static knowledge resource, but as something in which we can all participate. 
  • “Why Three is Tor's Magic Number” – Many in Silicon Valley, and in U.S. business at large, seem to believe innovation springs only from competition, a race to build the next big thing first, cheaper, better, best. But what if collaboration and community breeds innovation just as well as adversarial competition? Tor Project Executive Director Isabela Fernandes joined Cindy and Jason to discuss the importance of not just accepting technology as it’s given to us, but collaboratively breaking it, tinkering with it, and rebuilding it together until it becomes the technology that we really need to make our world a better place. 
  • “Securing Journalism on the ‘Data-Greedy’ Internet” – Public-interest journalism speaks truth to power, so protecting press freedom is part of protecting democracy. But what does it take to digitally secure journalists’ work in an environment where critics, hackers, oppressive regimes, and others seem to have the free press in their crosshairs? Freedom of the Press Foundation Digital Security Director Harlo Holmes joined Cindy and Jason to discuss the tools and techniques that help journalists protect themselves and their sources while keeping the world informed. 
  • “Cryptography Makes a Post-Quantum Leap” – The cryptography that protects our privacy and security online relies on the fact that even the strongest computers will take essentially forever to do certain tasks, like factoring large numbers and finding discrete logarithms, which are important for RSA encryption, Diffie-Hellman key exchanges, and elliptic curve encryption. But what happens when those problems—and the cryptography they underpin—are no longer infeasible for computers to solve? Will our online defenses collapse? Research and applied cryptographer Deirdre Connolly joined Cindy and Jason to discuss not only how post-quantum cryptography can shore up those existing walls but also help us find entirely new methods of protecting our information. 
  • “Finding the Joy in Digital Security” – Many people approach digital security training with furrowed brows, as an obstacle to overcome. But what if learning to keep your tech safe and secure was consistently playful and fun? People react better to learning and retain more knowledge when they're having a good time. It doesn’t mean the topic isn’t serious—it’s just about intentionally approaching a serious topic with joy. East Africa digital security trainer Helen Andromedon joined Cindy and Jason to discuss making digital security less complicated, more relevant, and more joyful to real users, and encouraging all women and girls to take online safety into their own hands so that they can feel fully present and invested in the digital world. 
  • “Smashing the Tech Oligarchy” – Many of the internet’s thorniest problems can be attributed to the concentration of power in a few corporate hands: the surveillance capitalism that makes it profitable to invade our privacy, the lack of algorithmic transparency that turns artificial intelligence and other tech into impenetrable black boxes, the rent-seeking behavior that seeks to monopolize and mega-monetize an existing market instead of creating new products or markets, and much more. Tech journalist and critic Kara Swisher joined Cindy and Jason to discuss regulation that can keep people safe online without stifling innovation, creating an internet that’s transparent and beneficial for all, not just a collection of fiefdoms run by a handful of homogenous oligarchs. 
  • “Separating AI Hope from AI Hype” – If you believe the hype, artificial intelligence will soon take all our jobs, or solve all our problems, or destroy all boundaries between reality and lies, or help us live forever, or take over the world and exterminate humanity. That’s a pretty wide spectrum, and leaves a lot of people very confused about what exactly AI can and can’t do. Princeton Professor and “AI Snake Oil” publisher Arvind Narayanan joined Cindy and Jason to discuss how we get to a world in which AI can improve aspects of our lives from education to transportation—if we make some system improvements first—and how AI will likely work in ways that we barely notice but that help us grow and thrive. 
  • “Protecting Privacy in Your Brain” – Rapidly advancing "neurotechnology" could offer new ways for people with brain trauma or degenerative diseases to communicate, as the New York Times reported this month, but it also could open the door to abusing the privacy of the most personal data of all: our thoughts. Worse yet, it could allow manipulating how people perceive and process reality, as well as their responses to it—a Pandora’s box of epic proportions. Neuroscientist Rafael Yuste and human rights lawyer Jared Genser, co-founders of The Neurorights Foundation, joined Cindy and Jason to discuss how technology is advancing our understanding of what it means to be human, and the solid legal guardrails they're building to protect the privacy of the mind. 
  • “Building and Preserving the Library of Everything” – Access to knowledge not only creates an informed populace that democracy requires but also gives people the tools they need to thrive. And the internet has radically expanded access to knowledge in ways that earlier generations could only have dreamed of—so long as that knowledge is allowed to flow freely. Internet Archive founder and digital librarian Brewster Kahle joined Cindy and Jason to discuss how the free flow of knowledge makes all of us more free.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.

Politicians Rushed Through An Online Speech “Solution.” Victims Deserve Better.

Wed, 12/24/2025 - 11:44am

Earlier this year, both chambers of Congress passed the TAKE IT DOWN Act. This bill, while well-intentioned, gives powerful people a new legal tool to force online platforms to remove lawful speech that they simply don't like. 

The bill, sponsored by Senate Commerce Chair Ted Cruz (R-TX) and Rep. Maria Salazar (R-FL), sought to speed up the removal of troubling online content: non-consensual intimate imagery (NCII). The spread of NCII is a serious problem, as is digitally altered NCII, sometimes called “deepfakes.” That’s why 48 states have specific laws criminalizing the distribution of NCII, in addition to the long-existing defamation, harassment, and extortion statutes—all of which can be brought to bear against those who abuse NCII. Congress can and should protect victims of NCII by enforcing and improving these laws. 

Unfortunately, TAKE IT DOWN takes another approach: it creates an unneeded notice-and-takedown system that threatens free expression, user privacy, and due process, without meaningfully addressing the problem it seeks to solve. 

While Congress was still debating the bill, EFF, along with the Center for Democracy & Technology (CDT), Authors Guild, Demand Progress Action, Fight for the Future, Freedom of the Press Foundation, New America’s Open Technology Institute, Public Knowledge, Restore The Fourth, SIECUS: Sex Ed for Social Change, TechFreedom, and Woodhull Freedom Foundation, sent a letter to the Senate outlining our concerns with the proposal. 

First, TAKE IT DOWN’s removal provision applies to a much broader category of content—potentially any images involving intimate or sexual content—than the narrower NCII definitions found elsewhere in the law. We worry that bad-faith actors will use the law’s expansive definition to remove lawful speech that is not NCII and may not even contain sexual content. 

Worse, the law contains no protections against frivolous or bad-faith takedown requests. Lawful content—including satire, journalism, and political speech—could be wrongly censored. The law requires that apps and websites remove content within 48 hours or face significant legal risk. That ultra-tight deadline means that small apps and websites will have to comply so quickly that they won’t be able to investigate or verify claims.

Finally, there are no legal protections for providers when they believe a takedown request was sent in bad faith to target lawful speech. TAKE IT DOWN is a one-way censorship ratchet, and its fast timeline discourages providers from standing up for their users’ free speech rights. 

This new law could lead to the use of automated filters that tend to flag legal content, from commentary to news reporting. Communications providers that offer users end-to-end encrypted messaging, meanwhile, may be served with notices they simply cannot comply with, given the fact that these providers can’t view the contents of messages on their platforms. Platforms could respond by abandoning encryption entirely in order to be able to monitor content, turning private conversations into surveilled spaces.

We asked for several changes to protect legitimate speech that is not NCII, and to include common-sense safeguards for encryption. Thousands of EFF members joined us by writing similar messages to their Senators and Representatives. That resulted in several attempts to offer common-sense amendments during the Committee process. 

However, Congress passed the bill without those needed changes, and it was signed into law in May 2025. The main takedown provisions of the bill will take effect in 2026. We’ll be pushing online platforms to be transparent about the content they take down because of this law, and will be on the watch for takedowns that overreach and censor lawful speech. 

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.

How to Sustain Privacy & Free Speech

Tue, 12/23/2025 - 1:06pm

The world has been forced to bear the weight of billionaires and politicians who salivate over making tech more invasive, more controlling, and more hostile. That's why EFF’s mission for your digital rights is crucial, and why your support matters more than ever. You can fuel the fight for privacy and free speech with as little as $5 or $10 a month:

Join EFF

Become a Monthly Sustaining Donor

When you donate by December 31, your monthly support goes even further by unlocking bonus Year-End Challenge grants! With your help, EFF can receive up to seven grants that increase in size as the number of supporters grows (check our progress on the counter). Many thanks to EFF’s Board of Directors for creating the 2025 challenge fund.

The EFF team makes every dollar count. EFF members giving just $10 or less each month raised $400,000 for digital rights in the last year. That funds court motions, software development, educational campaigns, and investigations for the public good every day. EFF member support matters, and we need you.

📣 Stand Together: That’s How We Win 📣

You can help EFF hold corporations and authoritarians to account. We fight for tech users in the courts and we lobby and educate lawmakers, all while developing free privacy-enhancing tech and educational resources so people can protect themselves now. Your monthly donation will keep us going strong in this pivotal moment.

Get your choice of free gear when you join EFF!

Your privacy online and the right to express yourself are powerful—and it’s the reason authoritarians work so viciously to take them away. But together, we can make sure technology remains a tool for the people. Become a monthly Sustaining Donor or give a one-time donation of any size by December 31 and unlock additional Year-End Challenge grants!

Give Today

Unlock Year-End Challenge Grants

Already an EFF Member? Help Us Spread the Word!

EFF Members have carried the movement for privacy and free expression for decades. You can help move the mission even further! Here’s some sample language that you can share with your networks:


We need to stand together and ensure technology works for us, not against us. Donate any amount to EFF by Dec 31, and you'll help unlock challenge grants! https://eff.org/yec
Bluesky | Facebook | LinkedIn | Mastodon
(more at eff.org/social)

_________________

EFF is a member-supported U.S. 501(c)(3) organization. We’re celebrating TWELVE YEARS of top ratings from the nonprofit watchdog Charity Navigator! Your donation is tax-deductible as allowed by law.

AI Police Reports: Year In Review

Tue, 12/23/2025 - 12:00pm

In 2024, EFF wrote our initial blog about what could go wrong when police let AI write police reports. Since then, the technology has proliferated at a disturbing rate. Why? The most popular generative AI tool for writing police reports is Axon’s Draft One, and Axon also happens to be the largest provider of body-worn cameras to police departments in the United States. As we’ve written, companies are increasingly bundling their products to make it easier for police to buy more technology than they may need or that the public feels comfortable with. 

We have good news and bad news. 

Here’s the bad news: AI-written police reports are still unproven, non-transparent, and downright irresponsible—especially when the criminal justice system, informed by police reports, is deciding people’s freedom. The King County prosecuting attorney’s office in Washington state barred police from using AI to write police reports. As their memo read, “We do not fear advances in technology – but we do have legitimate concerns about some of the products on the market now... AI continues to develop and we are hopeful that we will reach a point in the near future where these reports can be relied on. For now, our office has made the decision not to accept any police narratives that were produced with the assistance of AI.” 

In July of this year, EFF published a two-part report on how Axon designed Draft One to defy transparency. Police upload their body-worn camera’s audio into the system, the system generates a report that the officer is expected to edit, and then the officer exports the report. But when they do that, Draft One erases the initial draft, and with it any evidence of what portions of the report were written by AI and what portions were written by an officer. That means that if an officer is caught lying on the stand – as shown by a contradiction between their courtroom testimony and their earlier police report – they could point to the contradictory parts of their report and say, “the AI wrote that.” Draft One is designed to make it hard to disprove that. 

In this video of a roundtable discussion about Draft One, Axon’s senior principal product manager for generative AI is asked (at the 49:47 mark) whether it’s possible to see after the fact which parts of the report were suggested by the AI and which were edited by the officer. His response (definition of RMS added): 

“So we don’t store the original draft and that’s by design and that’s really because the last thing we want to do is create more disclosure headaches for our customers and our attorney’s offices—so basically the officer generates that draft, they make their edits, if they submit it into our Axon records system then that’s the only place we store it, if they copy and paste it into their third-party RMS [records management system] system as soon as they’re done with that and close their browser tab, it’s gone. It’s actually never stored in the cloud at all so you don’t have to worry about extra copies floating around.”

Yikes! 

All of this obfuscation also makes it incredibly hard for people outside police departments to figure out if their city’s officers are using AI to write reports–and even harder to use public records requests to audit just those reports. That’s why this year EFF also put out a comprehensive guide to help the public make their records requests as tailored as possible to learn about AI-generated reports. 

Ok, now here’s the good news: People who believe AI-written police reports are irresponsible and potentially harmful to the public are fighting back. 

This year, two states passed bills that are an important first step in reining in AI police reports. Utah’s SB 180 mandates that police reports created in whole or in part by generative AI carry a disclaimer that the report contains content generated by AI. It also requires officers to certify that they checked the report for accuracy. California’s SB 524 went even further. It requires police to disclose, on the report itself, whether AI was used to author the report in whole or in part. Further, it bans vendors from selling or sharing the information a police agency provided to the AI. The bill also requires departments to retain the first draft of the report so that judges, defense attorneys, or auditors can readily see which portions of the final report were written by the officer and which were written by the computer.

In the coming year, anticipate many more states joining California and Utah in regulating, or perhaps even banning, police from using AI to write their reports. 

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.

The Fight Against Presidential Targeting of Law Firms: 2025 in Review

Tue, 12/23/2025 - 11:51am

The US legal profession was just one of the pillars of American democracy that was targeted in the early days of the second Trump administration. At EFF, we were proud to publicly and loudly support the legal profession and, most importantly, continue to do our work challenging the government’s erosion of digital rights—work that became even more critical as many law firms shied away from pro bono work.

For those that don’t know: pro bono work is work that for-profit law firms undertake for the public good. This usually means providing legal counsel to clients who desperately need but cannot afford it. It’s a vital practice, since non-profits like EFF don’t have the same capacity, resources, or expertise as a classic white-shoe law firm. It’s mutually beneficial, actually, since law firms and non-profits have different experience and areas of expertise that can supplement each other’s work.

A little more than a month into the new administration, President Trump began retaliating against large law firms that had supported investigations against him or litigated against his interests, representing clients either challenging his policies during his first term or defending the outcome of the 2020 election, among other cases. The retaliation quickly spread to other firms: firms lost government contracts and had security clearances stripped from their lawyers. Twenty large law firms were threatened by the Equal Employment Opportunity Commission over their DEI policies. Individual lawyers were also targeted. The attack on the legal profession was memorialized as official policy in the March 22, 2025 presidential memo “Preventing Abuses of the Legal System and the Federal Court.”

Although many of the targeted firms shockingly and regrettably capitulated, a few law firms sued to undo the actions against them. EFF was eager to support them, joining amicus briefs in each case. Over 500 law firms across the country joined supportive amicus briefs as well.

We also thought it critically important to publicly state our support for the targeted law firms and to call out the administration’s actions as violating the rule of law. So we did. We actually expected numerous law firms and legal organizations to also issue statements. But no one else did. EFF was thus the very first non-targeted legal organization in the country, either law firm or nonprofit, to publicly oppose the administration’s attack on the independence of the legal profession. Fortunately, within the week, firms started to speak up as well. As did the American Bar Association.

In the meantime, EFF’s legal work has become even more critical as law firms have reportedly pulled back on their pro bono hours since the administration’s attacks. Indeed, recognizing the extraordinary need, we ramped up our litigation, including cases against the federal government: suing DOGE for stealing Americans’ data, suing the State Department for chilling visa holders’ speech by surveilling and threatening to surveil their social media posts, and seeking records of the administration’s demands that online platforms remove ICE oversight apps.

And we’re going to keep on going in 2026 and beyond.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.

2025 in Review

Tue, 12/23/2025 - 11:50am

Each December we take a moment to publish a series of blog posts that look back at the things we’ve accomplished in fighting for your rights and privacy over the past 12 months. But this year I’ve been thinking not just about the past 12 months, but also about the past 25 years I’ve spent at EFF. As many folks know, I’ve decided to pass the leadership torch and will leave EFF in 2026, so this will be the last time I write one of these annual reviews. It’s bittersweet, but I’m filled with pride, especially about how we stick with fights over the long run.

EFF has come a long way since I joined in 2000. In so many ways, the work and reputation we have built laid the groundwork for years like 2025, when freedom, justice, and innovation were under attack from many directions at once, with technology unfortunately at the center of many of those attacks. As a result, we launched our Take Back CTRL campaign to put the focus on fighting back.

In addition to the specific issues we address in this year-end series of blog posts, EFF brought our legal expertise to several challenges to the Trump Administration’s attacks on privacy, free speech, and security, including directly bringing two cases against the government and filing multiple amicus briefs in others. In some ways, that’s not new: we’ve worked in the courts to hold the government accountable all the way back to our founding in 1990.

In this introductory blog post, however, I want to highlight two topics that attest to our long history of advocacy. The first is our battle against the censorship and privacy nightmares that come from requirements that internet users submit to age verification. We’ve long known that age verification technologies, which aim to block young people from viewing or sharing information that the government deems “harmful” or “offensive,” end up becoming tools of censorship. They often rely on facial recognition and other techniques that have unacceptable levels of inaccuracy and that create security risks. Ultimately, they are surveillance systems that chill access to vital online communities and resources, and burden the expressive rights of adults and young people alike.

The second is automated license plate readers (ALPR), which serve as a mass surveillance network tracking our locations as we go about our day. We sued over this technology in 2013, demanding public access to records about its use, and ultimately won at the California Supreme Court. But 2025 is the year that the general public began to understand just how much information is being collected and used by governments and private entities alike, and to recognize the dangers that creates. Our investigations team filed more public records requests, revealing racist searches done by police. And 12 years after our first lawsuit, our lawyers filed another case, this time directly challenging the ALPR policies of San Jose, California. In addition, our activists have been working with people in municipalities across the country who want to stop their city’s use of ALPR in their communities. Groups in Austin, Texas, for example, worked hard to get their city to reject a new contract for these cameras.

These are just two issues of many that have engaged our lawyers, activists, and technologists this year. But they show how we dig in for the long run and are ready when small issues become bigger ones.   

The more than 100 people who work at EFF spent this last year proving their mettle in battles, many of which are nowhere near finished. But we will push on, and when those issues break into public consciousness, we’ll be ready.

We can only keep doggedly working on these issues year after year because of you, our members and supporters. You engage on these issues, you tell us when something is happening in your town, and your donations power everything we do. This may be my last end-of-the-year blog post, but thanks to you, EFF is here to stay.  We’re strong, we’re ready, and we know how to stick with things for the long run. Thanks for holding us up.    

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.

Online Gaming’s Final Boss: The Copyright Bully

Fri, 12/19/2025 - 1:14pm

Since the earliest days of computer games, people have tinkered with the software to customize their own experiences or share their vision with others. From the dad who changed the game’s male protagonist to a girl so his daughter could see herself in it, to the developers who got their start in modding, games have been a medium where you don’t just consume a product; you participate and interact with culture.

For decades, that participatory experience was a key part of one of the longest-running video games still in operation: EverQuest. Players had the official client, acquired lawfully from EverQuest’s developers, and modders figured out how to enable those clients to communicate with their own servers and modify their play experience – creating new communities along the way.

EverQuest’s copyright owners implicitly blessed all this. But the current owner, a private equity firm called Daybreak, wants to end that independent creativity. It is using copyright claims to threaten modders who customized the EverQuest experience to suit a different playstyle, running their own servers where things worked the way they wanted.

One project in particular is in Daybreak’s crosshairs: “The Hero’s Journey” (THJ). Daybreak claims THJ has infringed its copyrights in EverQuest’s visuals and characters, cutting into its bottom line.

Ordinarily, when a company wants to remedy some actual harm, its lawyers will start with a cease-and-desist letter and potentially pursue a settlement. But if the goal is intimidation, a rightsholder is free to go directly to federal court and file a complaint. That’s exactly what Daybreak did, using that shock-and-awe approach to cow not only The Hero’s Journey team, but unrelated modders as well.

Daybreak’s complaint seems to have dazzled the judge in the case by presenting side-by-side images of dragons and characters that look identical in the base game and when using the mod, without explaining that these images are the ones provided by EverQuest’s official client, which players have lawfully downloaded from the official source. The judge wound up short-cutting the copyright analysis and issuing a ruling that has proven devastating to the thousands of players who are part of EverQuest modding communities.

Daybreak and the developers of The Hero’s Journey are now in private arbitration, and Daybreak has wasted no time in sending that initial ruling to other modders. The order doesn’t bind anyone who’s unaffiliated with The Hero’s Journey, but it’s understandable that modders who are in it for fun and community would cave to the implied threat that they could be next.

As a result, dozens of fan servers have stopped operating. Daybreak has also persuaded the maintainers of the shared server emulation software that most fan servers rely upon, EQEmulator, to adopt terms of service that essentially ban any but the most negligible modding. The terms also provide that “your operation of an EQEmulator server is subject to Daybreak’s permission, which it may revoke for any reason or no reason at any time, without any liability to you or any other person or entity. You agree to fully and immediately comply with any demand from Daybreak to modify, restrict, or shut down any EQEmulator server.” 

This is sadly not even an uncommon story in fanspaces—from the dustup over changes to the Dungeons and Dragons open gaming license to the “guidelines” issued by CBS for Star Trek fan films, we see new generations of owners deciding to alienate their most avid fans in exchange for more control over their new property. It often seems counterintuitive—fans are creating new experiences, for free, that encourage others to get interested in the original work.

Daybreak can claim a shameful victory: it has imposed unilateral terms on the modding community that are far more restrictive than what fair use and other user rights would allow. In the process, it is alienating the very people it should want to cultivate as customers: hardcore EverQuest fans. If it wants fans to continue to invest in making its games appeal to broader audiences and serve as testbeds for game development and sources of goodwill, it needs to give the game’s fans room to breathe and to play.

If you’ve been a target of Daybreak’s legal bullying, we’d love to hear from you; email us at info@eff.org.

Speaking Freely: Sami Ben Gharbia

Fri, 12/19/2025 - 12:28pm

Interviewer: Jillian York

Sami Ben Gharbia is a Tunisian human rights campaigner, blogger, writer, and freedom of expression advocate. He founded Global Voices Advocacy, and is the co-founder and current publisher of the collective media organization Nawaat, which won the EFF Award in 2011.

Jillian York: So first, what is your personal definition, or how do you conceptualize freedom of expression?

Sami Ben Gharbia: So for me, freedom of expression is mainly about being human. I love the definition that Arab philosophers gave to human beings: we call it the “speaking animal.” That’s the definition from logic, the science of logic inherited from the Greeks, which defines a human being as a speaking animal. Later on, Descartes, the French philosopher, expressed it as the cogito: I think, therefore I am. So the act of speaking is an act of thinking, and it’s what makes us human. So this is the definition that I love about freedom of expression, because it’s the condition, the bottom line, of our being human.

JY: I love that. Is that something that you learned about growing up?

SBG: You mean, like, reading it or living?

JY: Yeah, how did you come to this knowledge?

SBG: I read a little bit of logic, the science of logic, and this is the definition that the Arabs gave to define what a human being is; to differentiate us from plants or animals, or, I don't know, rocks, et cetera. So humans are speaking animals.

JY: Oh, that's beautiful. 

SBG: And by speaking, it's in the Arabic definition of the word speaking, it's thinking. It's equal to thinking. 

JY: At what point, growing up, did you realize…what was the turning point for you growing up in Tunisia and realizing that protecting freedom of expression was important?

SBG: Oh, I think, I was born in 1967 and I grew up under the authoritarian regime of the “father” of the Tunisian nation, Bourguiba, the first president of Tunisia, who got us independence from France. And during the 80s, it was very hard to find even books that speak about philosophy, ideology, nationalism, Islamism, Marxism, etc. To us, almost everything was forbidden. You needed to hide the books that you smuggled from France or from libraries in other cities, et cetera. You always hid what you were reading because you did not want to expose your identity as someone who was politically engaged or an activist. So, from that point, I realized how important freedom of expression is, because if you are not allowed even to read or to buy or to exchange books that are deemed controversial or politically unacceptable under an authoritarian regime, that's where the fight for freedom of expression should be at the forefront of any other fight. That's the fight that we need to engage in in order to secure other rights and freedoms.

JY: You speak a number of languages. At what point did you start reading and exploring languages other than the one you grew up speaking?

SBG: Oh, I think, well, we learn Arabic, French, and English in school, in primary school and secondary school, so these are the languages that we take from school and from our readings, etc., and interaction with other people in Tunisia. But my first experience living in a country that speaks another language that I didn't know was in Iran. I spent, in total, one and a half years in Iran, where I started to learn a fourth language that I really intended to use. It's not a Latin language. It is a distinct language, although it uses almost the same letters and alphabet as Arabic, with some differences in pronunciation and writing. But it was easy for a native Arabic-speaking Tunisian to learn Farsi, due to the familiarity with the alphabet and with the pronunciation of most of the alphabet itself. So that's the first case where I was confronted with a foreign language; it was in Iran. And then during my exile in the Netherlands, I was confronted by another family of languages: Dutch, from the family of Germanic languages. That's the fifth language that I learned, in the Netherlands.

JY: Wow. And how do you feel that language relates to expression? For you?

SBG: I mean…language is another world. It's another universe. Because language carries culture, carries knowledge, carries history, customs. So it's a universe that is living. And once you learn to speak a new language, you actually embrace another culture. You are more open in the way of understanding and accepting differences between cultures, and I think that's how it makes your openness much more elastic. You accept other cultures more, other identities, and then you are not afraid anymore. You're not scared anymore of other identities, let's say, because I think the problem of civilizational crisis or conflict starts from ignorance: we don't know the others, we don't know the language, we don't know the customs, the culture, the heritage, the history. That's why we are scared of other people. So language is the first, let's say, window to other identities and acceptance of other people.

JY: And how many languages do you speak now?

SBG: Oh, well, I don't know. Five for sure, but since I moved into exile a second time, now to Spain, I started learning Spanish, and since I've been traveling a lot in Italy, I started learning some Italian. But it is confusing, because both are Latin languages and they share a lot of words. So it is confusing, but it is funny. I'm not that young anymore, to learn quickly; I'm 58 years old, so it's not easy for someone my age to learn a new language quickly, especially when you are confused between languages from the same Latin family.

JY: Oh, that's beautiful, though. I love that. All right, now I want to dig into the history of [2011 EFF Award winner] Nawaat. How did it start?

SBG: So Nawaat started as a forum, in the early 2000s, even before the phenomenon of blogs. Blogs started later on, maybe 2003-4, when they became the main tools for expression. Before that, we had forums where people debated ideas, anything. So it started as a forum, multiple forums hosted on the same domain name, which is Nawaat.org, and little by little, we adopted new technology. We migrated the database from the forum to a CMS, built a new website, and then we started building the website as a collective blog where people could express themselves freely, in a political context where, similar to many other countries, a lot of people express themselves through online platforms because they are not allowed to express themselves freely through television or radio or newspapers or magazines in their own country.

So it started mainly as an exiled media outlet. It wasn't journalistically oriented or rooted in journalism. It was more of a platform to give voice to the diaspora, mainly the exiled Tunisian diaspora living in France and in England and elsewhere. We published human rights reports, released news about the situation in Tunisia, supported the opposition in Tunisia, and produced videos to counter the propaganda machine of the former President Ben Ali, etc. So that's how it started, and it evolved little by little through the changes in the tech industry, from forums to blogs and then to a CMS, and then later on to social media accounts and pages. And why we created it? That was not my decision alone. A friend of mine and I were living in exile, and we said, “why not start a new platform to support the opposition and this movement in Tunisia?” And that's how we did it. At first, it was fun; it was a hobby. It wasn't our work. I was working somewhere else, and he was working on something else. It was our, let's say, hobby or pastime. And little by little, it became our only job, actually.

JY: And then, okay, so let's come to 2011. I want to hear now your perspective 14 years later. What role do you really feel that the internet played in Tunisia in 2011?

SBG: Well, it was a hybrid tool for liberation, etc. We know the context of the internet freedom policy coming from the US; we know the evolution of Western interference within the digital sphere to topple governments that are deemed unfriendly, et cetera. But Tunisia was a friend of the West, very friendly with France and the United States and Europe. They loved the dictatorship in Tunisia, in a way, because it secured the border and secured the country from, by then, the Islamist movement, et cetera. So the internet did play a role as a platform to spread information, to highlight the human rights abuses that were taking place in Tunisia, and to counter the narrative that was being manipulated by the state agencies, the public broadcast channels, the television news agency, etc.

And I think we managed that. The big impact of the internet, the blogs of that time, and platforms like Nawaat was that we adopted English. It was the first time that the Tunisian opposition used English in its discourse, with the objective of bridging the gap between the traditional support for the opposition and human rights in Tunisia, which was mainly coming from French NGOs and human rights organizations, and international support that was not only coming from the traditional usual suspects of Human Rights Watch, Amnesty International, Freedom House, et cetera. We wanted to broaden the spectrum of support and to reach researchers, to reach activists, to reach people who were writing about freedom elsewhere. So we managed to break the traditional chain of support between human rights movements or organizations and human rights activists in Tunisia, and we managed to broaden that and to reach other audiences that were not really in touch with what was going on in Tunisia. And I think that's how the internet helped in the field of international support for the struggle in Tunisia and within Tunisia.

The impact was, I think, important in raising awareness about human rights abuses in the country. For people who were not really politically knowledgeable about the situation, due to the censorship and the lack of access to information in Tunisia, the internet helped spread knowledge about the situation and helped speed the process of the unrest, actually. So I think these are the two most important impacts within the country: to broaden the spectrum of people reached and targeted by the discourse of political engagement and activism, and to speed the process of consciousness and then the action in the street. So this is how I think the internet helped. That's great, but it wasn't the main tool. I mean, the main tool was really people on the ground, maybe people who didn't have access to the internet at all.

JY: That makes sense. So what about the other work that you were doing around that time with the Arabloggers meetings and Global Voices and the Arab Techies network. Tell us about that.

SBG: Okay, so my position was founding director of Global Voices Advocacy; I was hired to found this arm of advocacy within Global Voices. And that gave me the opportunity to understand other spheres, linguistic spheres, cultural spheres. So it was beyond Tunisia, beyond the Arab world and the region. I was in touch with activists from all over the world. By activists, I mean digital activists, bloggers living in Latin America or in Asia or in Eastern Europe, et cetera, because one of the projects that I worked on was Threatened Voices, which was a map of all the people who were targeted because of their online activities. That gave me the opportunity to get in touch with a lot of activists.

And then we organized the first advocacy meeting. It was in Budapest, and we managed to invite like 40 or 50 activists from all over the world, from China, Hong Kong, Latin America, the Arab world, Eastern Europe, and Africa. And that broadened my understanding of the freedom of expression movement and how technology is being used to foster human rights online, and then the development of blog aggregators in the world, and mainly in the Arab world, like, each country had its own blog aggregator. That helped me understand those worlds, as did Global Voices. Because Global Voices was bridging the gap between what is being written elsewhere, through the translation effort of Global Voices to the English speaking world and vice versa, and the role played by Global Voices and Global Voices Advocacy made the space and the distance between all those blogospheres feel very diminished. We were very close to the blogosphere movement in Egypt or in Morocco or in Syria and elsewhere. 

And that's how Alaa Abd El Fattah and Manal Bahey El-Din Hassan and myself started thinking about how to establish the Arab Techies collective, because of the needs that we identified. There was a gap, a lack of communication between pure techies, people who were writing code, building software, translating tools and even online language into Arabic, and the people who were using those tools: the bloggers, freedom of expression advocates, et cetera. And because there were some needs that were not really met in terms of technology, we thought that bringing these two worlds together, techies and activists, would help us build new tools, translate new tools, and make tools available to the broader community of internet activists. And that's how the Arab Techies collective was born in Cairo, followed by the Arabloggers meetings, organized twice in Beirut and then a third time in Tunisia, after the revolution.

It was a moment of momentum for us, because I think it was the first time, in Beirut, that we brought together bloggers from all Arab countries. It was like a dream that had been unimaginable at a certain point, but we made it happen. And then what they call the Arab revolution happened, and we lost contact with each other, because everybody was really busy with his or her own country's affairs. Alaa was fully engaged in Egypt; myself, I came back to Tunisia and was fully engaged in Tunisia. So we lost contact, because all of us were having a lot of trouble in our own countries. Of the bloggers who attended the Arab bloggers meetings, a few were arrested, a few were killed; Bassel was in prison, people were in exile. So we lost that connection and those conferences that brought us together, but then we've seen SMEX filling that gap and taking over the work that was started by the Arab Techies and the Arab bloggers conference.

JY: We did have the fourth one in 2014 in Amman. But it was not the same. Okay, moving forward, EFF recently published this blog post reflecting on what had just happened to Nawaat, when you and I were in Beirut together a few weeks ago. Can you tell me what happened?

SBG: What happened is that they froze the work of Nawaat, nominally legally, although the move wasn't legal, because we were respecting the law in Tunisia. They stopped the activity of Nawaat for one month, according to an article of the NGO legal framework under which the government can stop the work of an NGO if the NGO doesn't respect certain legal conditions. For them, Nawaat didn't provide enough documentation requested by the government, which is a total lie, because we always submitted all documentation to the government on time. So they stopped us from doing our job, at what we call in Tunisia an associated media.

It's not a company, it's not a business, it's not a startup. It is an NGO that manages the website and the media, and now it has other activities. We have the online website, the main website, but we also have a festival, a three-day festival at our headquarters. We have offline debates; we bring actors, civil society, activists, politicians, to discuss important issues in Tunisia. We have a quality print magazine that is distributed and sold in Tunisia. We have a media innovation incubation program where we support people building projects through journalism and technology. So we have a set of offline projects that stopped for a month, and we also stopped publishing anything on the website and all our social media accounts. And Nawaat is not the only one. They also froze the work of other NGOs, like the Tunisian Association of Democratic Women, which gives real support to women in Tunisia; the Tunisian Forum for Social and Economic Rights, a very important NGO supporting grassroots movements in Tunisia; and Aswat Nissa, another NGO supporting women in Tunisia. So they targeted impactful NGOs.

So Nawaat is not an exception, and we are very grateful for the wave of support that we got from fellow Tunisian citizens, and also from friendly NGOs like EFF and others who wrote about the case. This is the context in which we are living, and we are afraid that they will go for an outright ban of the network in the future. This is the worst-case scenario that we are preparing ourselves for; we might face the fate of seeing Nawaat close its doors and stop all offline activities that take place in Tunisia. Of course, the website will remain. We need to find a way to keep on producing, although it will be really risky for our on-the-ground journalists and video reporters and newsroom team, but we need to find a solution to keep the website alive. Returning to exiled media is a very probable scenario and approach in the future, so we might go back to our exile media model, and we will keep on fighting.

JY: Yes, of course. I'm going to ask the final question. We always ask who someone’s free speech hero is, but I’m going to frame it differently for you, because you're somebody who influenced a lot of the way that I think about these topics. And so who's someone that has inspired you or influenced your work?

SBG: Although I started before the launch of WikiLeaks, for me Julian Assange was the concretization of the radical transparency movement that we saw. And for me, he is one of the heroes who really shaped a decade of transparency journalism and impacted not only the journalism industry itself, but even the established and mainstream media, such as the New York Times, Washington Post, Der Spiegel, et cetera. WikiLeaks partnered with big media, but not only with big media; also with small, independent newsrooms in the Global South. So for me, Julian Assange is an icon that we shouldn't forget. And he is an inspiration in the way he used technology to fight against big tech and state and spy agencies and war crimes.

Fair Use is a Right. Ignoring It Has Consequences.

Thu, 12/18/2025 - 3:54pm

Fair use is not just an excuse to copy—it’s a pillar of online speech protection, and disregarding it in order to lash out at a critic should have serious consequences. That’s what we told a federal court in Channel 781 News v. Waltham Community Access Corporation, our case fighting copyright abuse on behalf of citizen journalists.

Waltham Community Access Corporation (WCAC), a public access cable station in Waltham, Massachusetts, records city council meetings on video. Channel 781 News (Channel 781), a group of volunteers who report on the city council, curates clips from those recordings for its YouTube channel, along with original programming, to spark debate on issues like housing and transportation. WCAC sent a series of takedown notices under the Digital Millennium Copyright Act (DMCA), accusing Channel 781 of copyright infringement. That led to YouTube deactivating Channel 781’s channel just days before a critical municipal election. Represented by EFF and the law firm Brown Rudnick LLP, Channel 781 sued WCAC for misrepresentations in its takedown notices under an important but underutilized provision of the DMCA.

The DMCA gives copyright holders a powerful tool to take down other people’s content from platforms like YouTube. The “notice and takedown” process requires only an email, or filling out a web form, in order to accuse another user of copyright infringement and have their content taken down. And multiple notices typically lead to the target’s account being suspended, because doing so helps the platform avoid liability. There’s no court or referee involved, so anyone can bring an accusation and get a nearly instantaneous takedown.

Of course, that power invites abuse. Because filing a DMCA infringement notice is so easy, there’s a temptation to use it at the drop of a hat to take down speech that someone doesn’t like. To prevent that, before sending a takedown notice, a copyright holder has to consider whether the use they’re complaining about is a fair use. Specifically, the copyright holder needs to form a “good faith belief” that the use is not “authorized by the law,” such as through fair use.

WCAC didn’t do that. They didn’t like Channel 781 posting short clips from city council meetings recorded by WCAC as a way of educating Waltham voters about their elected officials. So WCAC fired off DMCA takedown notices at many of Channel 781’s clips that were posted on YouTube.

WCAC claims they considered fair use, because a staff member watched a video about it and discussed it internally. But WCAC ignored three of the four fair use factors. WCAC ignored that their videos had no creativity, being nothing more than records of public meetings. They ignored that the clips were short, generally including one or two officials’ comments on a single issue. They ignored that the clips caused WCAC no monetary or other harm, beyond wounded pride. And they ignored facts they already knew, and that are central to the remaining fair use factor: by excerpting and posting the clips with new titles, Channel 781 was putting its own “spin” on the material – in other words, transforming it. All of these facts support fair use.

Instead, WCAC focused only on the fact that the clips they targeted were not altered further or put into a larger program. Looking at just that one aspect of fair use isn’t enough, and changing the fair use inquiry to reach the result they wanted is hardly the way to reach a “good faith belief.”

That’s why we’re asking the court to rule that WCAC’s conduct violated the law and that they should pay damages. Copyright holders need to use the powerful DMCA takedown process with care, and when they don’t, there need to be consequences.

Stand Together to Protect Democracy

Thu, 12/18/2025 - 11:12am

What a year it’s been. We’ve seen technology unfortunately misused to supercharge the threats facing democracy: dystopian surveillance, attacks on encryption, and government censorship. These aren’t abstract dangers. They’re happening now, to real people, in real time.

EFF’s lawyers, technologists, and activists are pushing back. But we need you in this fight.

JOIN EFF TODAY!

MAKE A YEAR END DONATION—HELP EFF UNLOCK CHALLENGE GRANTS!

If you donate to EFF before the end of 2025, you’ll help fuel the legal battles that defend encryption, the tools that protect privacy, and the advocacy that stops dangerous laws—and you’ll help unlock up to $26,200 in challenge grants. 

📣 Stand Together: That's How We Win 📣

The past year confirmed how urgently we need technologies that protect us, not surveil us. EFF has been in the fight every step of the way, thanks to support from people like you.

Get free gear when you join EFF!

This year alone EFF:

  • Launched a resource hub to help users understand and fight back against age verification laws.
  • Challenged San Jose's unconstitutional license plate reader database in court.
  • Sued to demand answers when ICE-spotting apps were mysteriously taken offline.
  • Launched Rayhunter to detect cell site simulators.
  • Pushed back hard against the EU's "Chat Control" proposal that would break encryption for millions.

After 35 years of defending digital freedoms, we know what's at stake: we must protect your ability to speak freely, organize safely, and use technology without surveillance.

We have opportunities to win these fights, and you make each victory possible. Donate to EFF by December 31 and help us unlock additional grants this year!

Already an EFF Member? Help Us Spread the Word!

EFF Members have carried the movement for privacy and free expression for decades. You can help move the mission even further! Here’s some sample language that you can share with your networks:


We need to stand together and ensure technology works for us, not against us. Donate any amount to EFF by Dec 31, and you'll help unlock challenge grants! https://eff.org/yec
Bluesky | Facebook | LinkedIn | Mastodon
(more at eff.org/social)

_________________

EFF is a member-supported U.S. 501(c)(3) organization. We’re celebrating TWELVE YEARS of top ratings from the nonprofit watchdog Charity Navigator! Your donation is tax-deductible as allowed by law.

Local Communities Are Winning Against ALPR Surveillance—Here’s How: 2025 in Review

Wed, 12/17/2025 - 2:28pm

Across ideologically diverse communities, 2025 campaigns against automated license plate reader (ALPR) surveillance kept winning. From Austin, Texas to Cambridge, Massachusetts to Eugene, Oregon, successful campaigns combined three practical elements: a motivated political champion on city council, organized grassroots pressure from affected communities, and technical assistance at critical decision moments.

The 2025 Formula for Refusal

  • Institutional Authority: Council members leveraging "procurement power"—local democracy's most underutilized tool—to say no. 
  • Community Mobilization: A base that refuses to debate "better policy" and demands "no cameras." 
  • Shared Intelligence: Local coalitions using shared research on contract timelines and vendor breaches.

Practical Wins Over Perfect Policies

In 2025, organizers embraced the "ugly" win: prioritizing immediate contract cancellations over the "political purity" of perfect privacy laws. Procurement fights are often messy, bureaucratic battles rather than high-minded legislative debates, but they stop surveillance where it starts—at the checkbook. In Austin, more than 30 community groups built a coalition that forced a contract cancellation, achieving via purchasing power what policy reform often delays. 

In Hays County, Texas, the victory wasn't about a new law, but a contract termination. Commissioner Michelle Cohen grounded her vote in vendor accountability, explaining: "It's more about the company's practices versus the technology." These victories might lack the permanence of a statute, but every camera turned off built a culture of refusal that made the next rejection easier. This was the organizing principle: take the practical win and build on it.

Start with the Harm

Winning campaigns didn't debate technical specifications or abstract privacy principles. They started with documented harms that surveillance enabled. EFF's research showing police used Flock's network to track Romani people with discriminatory search terms, surveil women seeking abortion care, and monitor protesters exercising First Amendment rights became the evidence organizers used to build power.

In Olympia, Washington, nearly 200 community members attended a counter-information rally outside city hall on Dec. 2. The DeFlock Olympia movement countered police department claims point-by-point with detailed citations about data breaches and discriminatory policing. By Dec. 3, cameras had been covered pending removal.

In Cambridge, the city council voted unanimously in October to pause Flock cameras after residents, the ACLU of Massachusetts, and Digital Fourth raised concerns. When Flock later installed two cameras "without the city's awareness," a city spokesperson called it a "material breach of our trust" and terminated the contract entirely. The unexpected camera installation itself became an organizing moment.

The Inside-Outside Game

The winning formula worked because it aligned different actors around refusing vehicular mass surveillance systems without requiring everyone to become experts. Community members organized neighbors and testified at hearings, creating political conditions where elected officials could refuse surveillance and survive politically. Council champions used their institutional authority to exercise "procurement power": the ability to categorically refuse surveillance technology.

To fuel these fights, organizers leveraged technical assets like investigation guides and contract timeline analysis. This technical capacity allowed community members to lead effectively without needing to become policy experts. In Eugene and Springfield, Oregon, Eyes Off Eugene organized sustained opposition over months while providing city council members political cover to refuse. "This is [a] very wonderful and exciting victory," organizer Kamryn Stringfield said. "This only happened due to the organized campaign led by Eyes Off Eugene and other local groups."

Refusal Crosses Political Divides

A common misconception collapsed in 2025: that surveillance technology can only be resisted in progressive jurisdictions. San Marcos, Texas, let its contract lapse after a 3-3 deadlock, with Council Member Amanda Rodriguez questioning whether the system showed "return on investment." Hays County commissioners in Texas voted to terminate. Small towns like Gig Harbor, Washington, rejected proposals before deployment.

As community partners like the Rural Privacy Coalition emphasize, "privacy is a rural value." These victories came from communities with different political cultures but shared recognition that mass surveillance systems weren't worth the cost or risk regardless of zip code.

Communities Learning From Each Other

In 2025, communities no longer needed to build expertise from scratch—they could access shared investigation guides, learn from victories in neighboring jurisdictions, and connect with organizers who had won similar fights. When Austin canceled its contract, it inspired organizing across Texas. When Illinois Secretary of State's audit revealed illegal data sharing with federal immigration enforcement, Evanston used those findings to terminate 19 cameras.

The combination of different forms of power—institutional authority, community mobilization, and shared intelligence—was a defining feature of this year's most effective campaigns. By bringing these elements together, community coalitions have secured cancellations or rejections in nearly two dozen jurisdictions since February, building the infrastructure to make the next refusal easier and the movement unstoppable.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.
