EFF in the Press: 2025 in Review
EFF’s attorneys, activists, and technologists don’t just do the hard, endless work of defending our digital civil liberties — they also spend a lot of time and effort explaining that work to the public via media interviews.
EFF had thousands of media mentions in 2025, from the smallest hyperlocal outlets to international news behemoths. Our work on street-level surveillance — the technology that police use to spy on our communities — generated a great deal of press attention, particularly regarding automated license plate readers (ALPRs). But we also got a lot of ink and airtime for our three lawsuits against the federal government: one challenging the U.S. Office of Personnel Management's illegal data sharing, a second challenging the State Department's unconstitutional "catch and revoke" program, and the third demanding that the departments of State and Justice reveal what pressure they put on app stores to remove ICE-tracking apps.
Other hot media topics included how travelers can protect themselves against searches of their devices, how protestors can protect themselves from surveillance, and the misguided age-verification laws that are proliferating across the nation and around the world, which are an attack on privacy and free expression.
On national television, Matthew Guariglia spoke with NBC Nightly News to discuss how more and more police agencies are using private doorbell cameras to surveil neighborhoods. Tori Noble spoke with ABC’s Good Morning America about the dangers of digital price tags, as well as with ABC News Live Prime about privacy concerns over OpenAI’s new web browser.
[Embedded YouTube video: https://www.youtube.com/embed/UrFD-JVHmp4]
[Embedded YouTube video: https://www.youtube.com/embed/1hEgPLRmgxo]
In a sampling of mainstream national media, EFF was cited 33 times by the Washington Post, 16 times by CNN, 13 times by USA Today, 12 times by the Associated Press, 11 times by NBC News, 11 times by the New York Times, 10 times by Reuters, and eight times by National Public Radio. Among tech and legal media, EFF was cited 74 times by Privacy Daily, 35 times by The Verge, 32 times by 404 Media, 32 times by The Register, 26 times by Ars Technica, 25 times by WIRED, 21 times by Law360, 21 times by TechCrunch, 20 times by Gizmodo, and 14 times by Bloomberg Law.
Abroad, EFF was cited in coverage by media outlets in nations including Australia, Bangladesh, Belgium, Canada, Colombia, El Salvador, France, Germany, India, Ireland, New Zealand, Palestine, the Philippines, Slovakia, South Africa, Spain, Trinidad and Tobago, the United Arab Emirates, and the United Kingdom.
EFF staffers spoke to the masses in their own words via op-eds such as:
- The Well News, Feb. 6: “Net Neutrality Needs to Be Preserved” (Corynne McSherry)
- Ms. Magazine, Feb. 25: “Age-Verification Laws Seek to Erase LGBTQ+ Identity from the Internet” (Rin Alajaji & Paige Collings)
- Teen Vogue, April 25: “How to Protect Your Online Privacy: 3 Simple Steps to Stay Safe on the Internet” (Paige Collings)
- La Silla Vacía (Colombia), April 25: “Big Tech y financiación del periodismo: dependencia, trampas y caminos viables / Big Tech and journalism funding: dependence, traps and viable roads” (Veridiana Alimonti)
- The Register, Aug. 21: “The UK Online Safety Act is about censorship, not safety” (Paige Collings)
- Bay Area News Group, Aug. 21: "Trump is building ‘one interface to rule them all.’ It’s terrifying.” (Cindy Cohn)
- Bay Area News Group, Dec. 6: "San Jose’s vast surveillance network is watching you. Be afraid." (Lisa Femia)
And we ruled the airwaves on podcasts including:
- Firewalls Don’t Stop Dragons, Jan. 6: “ALPRs Are Everywhere” (Adam Schwartz & Gowri Nayar)
- CNN Terms of Service, Jan. 7: “If TikTok is Banned, What Happens to Creators and Fans?” (Eva Galperin)
- Richie & John, Jan. 9: “Meta's Content Changes: What It Means for LGBTQ+ Rights” (Jillian York)
- The Privacy Insider, Feb. 14: “Signal and Noise: The New Administration, Privacy, and Our Digital Rights” (Cindy Cohn)
- Tech Policy Press Podcast, Feb. 23: “Evaluating the First Systemic Risk and Audit Reports Under the Digital Services Act” (Svea Windwehr)
- Tech Policy Press Podcast, March 27: “About that Signal Chat” (Cooper Quintin)
- CNN Terms of Service, April 1: “Think Before You Ring: Keeping Home Surveillance Safe” (Matthew Guariglia)
- Malwarebytes Lock & Code, April 6: “Is your phone listening to you?” (Lena Cohen)
- Adult Site Broker, April 22: Age verification discussion (Lisa Femia)
- Plutopia News Network, May 19: “Settling the Digital Frontier” (Cindy Cohn)
- Guy Kawasaki’s Remarkable People, July 2: “Who Defends Your Digital Rights?” (Cindy Cohn)
- Tech Policy Press Podcast, July 13: “How US States Are Shaping AI Policy Amid Federal Debate and Industry Pushback” (Hayley Tsukayama)
- KALW Your Legal Rights, July 23: "Privacy in the Digital Age" (Sophia Cope & Tori Noble)
- StateScoop Priorities Podcast, July 30: “Cop or AI? This tech makes it hard to tell” (Matthew Guariglia)
- Malwarebytes' Lock and Code, Aug. 11: “‘The worst thing’ for online rights: An age-restricted grey web” (Jason Kelley)
- Firewalls Don’t Stop Dragons, Sept. 1: “Meet Rayhunter” (Cooper Quintin)
- It Could Happen Here, Sept. 9: “ICE Partners with Israeli Phone Hacking Spyware” (Cooper Quintin)
We're grateful to all the intrepid journalists who keep doing the hard work of reporting accurately on tech and privacy policy, and we encourage them to keep reaching out to us at press@eff.org.
This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.
Drone as First Responder Programs: 2025 in Review
Drone as first responder (DFR) adoption really took off in 2025. Though the concept has been around since 2018, this year saw more normalization of the technology, its integration into more real-time crime center structures, and the implementation of automated deployment of drones.
A DFR program features a fleet of camera-equipped drones, which can range from just a couple to dozens or more. These are deployed from a launch pad in response to 911 calls and other calls for service, sometimes operated by a drone pilot or, increasingly, autonomously directed to the call location. The appeal is the promise of increased “situational awareness” for officers headed to a call. This video offers a short explanation of DFR, and for a list of all of the cities we know use drones, including DFR programs, check out EFF’s Atlas of Surveillance.
Major Moves from the FAA and Forthcoming Federal Issues
In order to deploy a drone beyond where it can be seen, operators need to receive a waiver from the Federal Aviation Administration (FAA), and all DFR programs require this. Police departments and technology vendors have complained that the process takes too long, and in May, the FAA finalized reworked requirements, leading to a flood of waiver requests. An FAA spokesperson reported that in the first two months of the new waiver process, it had approved 410 such waivers, already accounting for almost a third of the approximately 1,400 DFR waivers that had ever been granted.
The federal government made other major moves on the drone front this year. A month after the new waivers went into effect, President Trump issued an Executive Order with aspirations for advancing the country’s drone industry. And at the end of the year, DJI, one of the largest drone manufacturers in the world and one of the biggest purveyors of law enforcement drones, will be banned from launching new products in the U.S. unless the federal government conducts a security audit that was mandated by the National Defense Authorization Act. However, at the moment, it doesn’t seem like that audit will happen, and if it doesn’t, it won’t be surprising to see other drone manufacturers leveraging the ban to boost their own products.
Automated Drone Deployment and Tech Integrations
Early iterations of drone use required a human operator, but this year, police drone companies began releasing automated flying machines that don’t require much human intervention at all. New models can rely on AI and automated directions to launch and direct a drone.
This was the year we saw DFR integrated with other tools, and tech companies teamed up to bring even more powerful surveillance. Flock Safety added automated license plate readers (ALPRs) to their drones. Axon and Skydio built on the partnership they launched in 2024. Drone manufacturer Brinc teamed up with Motorola Solutions on a DFR program. Drone company Paladin teamed up with a company called SkyeBrowse to add 3D mapping of the environment to their list of features.
DFR also is increasingly part of the police plans for real-time crime centers, meaning that the footage being captured by these flying cameras is being integrated into other streams and analyzed in ways that we’re still learning about.
Transparency Around DFR Deployments
Transparency around adoption, use, and oversight is always crucial, particularly when it comes to police surveillance, and EFF has been tracking the growth of DFR programs across the country. We encourage you to use your local public records laws to investigate them further. Examples of the kinds of requests and the responsive documents people have already received — including flight logs, policies, and other information — can be found on MuckRock.
The Problem with Drones
Flying cameras are bad enough. They can see and record footage from a special vantage point, capturing video of your home, your backyard, and your movements that should require clear policies around retention, audits, and use, including when the cameras shouldn’t be recording. We’re also seeing that add-on camera analytics and other physical attachments (so-called “payloads”), like thermal cameras and even tear gas, can make drones even more powerful, and that police technology companies are encouraging DFR as part of broader surveillance packages.
It's important that next year we all advocate for, and enforce, standards for adopting and using these DFR programs. Check the Atlas to see if they are used where you live and learn more about drones and other surveillance tools on EFF’s Street-Level Surveillance Hub.
This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.
Are We Ready to Be Governed by Artificial Intelligence?
Artificial Intelligence (AI) overlords are a common trope in science-fiction dystopias, but the reality looks much more prosaic. The technologies of artificial intelligence are already pervading many aspects of democratic government, affecting our lives in ways both large and small. This has occurred largely without our notice or consent. The result is a government incrementally transformed by AI rather than the singular technological overlord of the big screen.
Let us begin with the executive branch. One of the most important functions of this branch of government is to administer the law, including the human services on which so many Americans rely. Many of these programs have long been operated by a mix of humans and machines, even if not previously using modern AI tools such as ...
Greening schools for climate-resilient, inclusive and liveable cities
Nature Climate Change, Published online: 29 December 2025; doi:10.1038/s41558-025-02519-3
Transforming school environments into nature-based climate shelters not only promotes cooling and greening under extreme heat, but also fosters quality education, ecological restoration, empowerment and reconnection with nature, and provides children with healthier, safer, more playful, equitable and climate-proof spaces.
EFFector Audio Speaks Up for Our Rights: 2025 Year in Review
This year, you may have heard EFF sounding off about our civil liberties on NPR, BBC Radio, or any number of podcasts. But we also started sharing our voices directly with listeners in 2025. In June, we revamped EFFector, our long-running electronic newsletter, and launched a new audio edition to accompany it.
Providing a recap of the week's most important digital rights news, EFFector's audio companion features exclusive interviews where EFF's lawyers, activists, and technologists can dig deeper into the biggest stories in privacy, free speech, and innovation. Here are just some of the best interviews from EFFector Audio in 2025.
Unpacking a Social Media Spying Scheme
Earlier this year, the Trump administration launched a sprawling surveillance program to spy on the social media activity of millions of noncitizens—and punish those who express views it doesn't like. This fall, EFF's Lisa Femia came onto EFFector Audio to explain how this scheme works, its impact on free speech, and, importantly, why EFF is suing to stop it.
"We think all of this is coming together as a way to chill people's speech and make it so they do not feel comfortable expressing core political viewpoints protected by the First Amendment," Femia said.
Challenging the Mass Surveillance of Drivers
But Lisa was hardly the only guest talking about surveillance. In November, EFF's Andrew Crocker spoke to EFFector about Automated License Plate Readers (ALPRs), a particularly invasive and widespread form of surveillance. ALPR camera networks take pictures of every passing vehicle and upload the location information of millions of drivers into central databases. Police can then search these databases—typically without any judicial approval—to instantly reconstruct driver movements over weeks, months, or even years at a time.
"It really is going to be a very detailed picture of your habits over the course of a long period of time," said Crocker, explaining how ALPR location data can reveal where you work, worship, and many other intimate details about your life. Crocker also talked about a new lawsuit, filed by two nonprofits represented by EFF and the ACLU of Northern California, challenging the city of San Jose's use of ALPR searches without a warrant.
Similarly, EFF's Mario Trujillo joined EFFector in early November to discuss the legal issues and mass surveillance risks around face recognition in consumer devices.
Simple Tips to Take Control of Your Privacy
Online privacy isn’t dead. But tech giants have tried to make protecting it as annoying as possible. To help users take back control, we celebrated Opt Out October, sharing daily privacy tips all month long on our blog. In addition to laying down some privacy basics, EFF's Thorin Klosowski talked to EFFector about how small steps to protect your data can build up into big differences.
"This is a way to kind of break it down into small tasks that you can do every day and accomplish a lot," said Klosowski. "By the end of it, you will have taken back a considerable amount of your privacy."
User privacy was the focus of a number of EFFector interviews. In July, EFF's Lena Cohen spoke about what lawmakers, tech companies, and individuals can do to fight online tracking. That same month, Matthew Guariglia talked about precautions consumers can take before bringing surveillance devices like smart doorbells into their homes.
Digging Into the Next Wave of Internet Censorship
One of the most troubling trends of 2025 was the proliferation of age verification laws, which require online services to check, estimate, or verify users’ ages. Though these mandates claim to protect children, they ultimately create harmful censorship and surveillance regimes that put everyone—adults and young people alike—at risk.
This summer, EFF's Rin Alajaji came onto EFFector Audio to explain how these laws work and why we need to speak out against them.
"Every person listening here can push back against these laws that expand censorship," she said. "We like to say that if you care about internet freedom, this fight is yours."
This was just one of several interviews about free speech online. This year, EFFector also hosted Paige Collings to talk about the chaotic rollout of the UK's Online Safety Act and Lisa Femia (again!) to discuss the abortion censorship crisis on social media.
You can hear all these episodes and future installments of EFFector's audio companion on YouTube or the Internet Archive. Or check out our revamped EFFector newsletter by subscribing at eff.org/effector!
This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.
Procurement Power—When Cities Realized They Can Just Say No: 2025 in Review
In 2025, elected officials across the country began treating surveillance technology purchases differently: not as inevitable administrative procurements handled by police departments, but as political decisions subject to council oversight and constituent pressure. This shift proved to be the most effective anti-surveillance strategy of the year.
Since February, at least 23 jurisdictions fully ended, cancelled, or rejected Flock Safety ALPR programs (including Austin, Oak Park, Evanston, Hays County, San Marcos, Eugene, Springfield, and Denver) by recognizing surveillance procurement as political power, not administrative routine.
Legacy Practices & Obfuscation
For decades, cities have been caught in what researchers call "legacy procurement practices": administrative norms that prioritize "efficiency" and "cost thresholds" over democratic review.
Vendors exploit this inertia through the "pilot loophole." As Taraaz and the Collaborative Research Center for Resilience (CRCR) note in a recent report, "no-cost offers" and free trials allow police departments to bypass formal procurement channels entirely. By the time the bill comes due, the surveillance is already normalized in the community, turning a purchase decision into a "continuation of service" that is politically difficult to stop.
This bureaucracy obscures the power that surveillance vendors have over municipal procurement decisions. As Arti Walker-Peddakotla details, this is a deliberate strategy: vendors secure "acquiescence" by hiding the political nature of surveillance behind administrative veils, framing tools as "force multipliers" and burying contracts in consent agendas. For local electeds, the pressure to "outsource" government decision-making makes vendor marketing compelling. Vendors use "cooperative purchasing" agreements to bypass competitive bidding, effectively privatizing the policy-making process.
The result is a dangerous "information asymmetry" where cities become dependent on vendors for critical data governance decisions. The 2025 cancellations finally broke that dynamic.
The Procurement Moment
This year, cities stopped accepting this "administrative" frame. The shift came from three converging forces: audit findings that exposed Flock's lack of safeguards, growing community organizing pressure, and elected officials finally recognizing that saying "no" to a renewal was not just an option—it was the responsible choice.
When Austin let its Flock pilot expire on July 1, the decision reflected a political judgment: constituents rejected a nationwide network used for immigration enforcement. It wasn't a debate about retention rates; it was a refusal to renew.
These cancellations were also acts of fiscal stewardship. By demanding evidence of efficacy (and receiving none), officials in Hays County, Texas and San Marcos, Texas rejected the "force multiplier" myth. They treated the refusal of unproven technology not just as activism, but as a basic fiduciary duty. In Oak Park, Illinois, trustees cancelled eight cameras after an audit found Flock lacked safeguards, while Evanston terminated its 19-camera network shortly after. Eugene and Springfield, Oregon terminated a combined 82 cameras in December. City electeds have also realized that every renewal is a vote for "vendor lock-in." As EPIC warns, once proprietary systems are entrenched, cities lose ownership of their own public safety data, making it nearly impossible to switch providers or enforce transparency later.
The shift was not universal. Denver illustrated the tension when Mayor Mike Johnston overrode a unanimous council rejection to extend Flock's contract. Council Member Sarah Parady rightly identified this as "mass surveillance" imposed "with no public process." This is exactly why procurement must be reclaimed: when treated as technical, surveillance vendors control the conversation; when recognized as political, constituents gain leverage.
Cities Hold the Line Against Mass Surveillance
EFF has spent years documenting how procurement functions as a lever for surveillance expansion, from our work documenting Flock Safety's troubling data-sharing practices with ICE and federal law enforcement to our broader advocacy on surveillance technology procurement reform. The 2025 victories show that when cities understand procurement as political rather than technical, they can say no. Procurement power can be the most direct route to stopping mass surveillance.
As cities move into 2026, the lesson is clear: surveillance is a choice, not a mandate, and your community has the power to refuse it. The question isn't whether technology can police more effectively; it's whether your community wants to be policed this way. That decision belongs to constituents, not vendors.
This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.
Defending Encryption in the U.S. and Abroad: 2025 in Review
Defending encryption has long been a bedrock of our work. Without encryption, it's impossible to have private conversations or private data storage. This year, we’ve seen attacks on these rights from all around the world.
Europe Goes All in On Breaking Encryption, Mostly Fails (For Now)
The European Union Council has repeatedly tried to pass a controversial message-scanning proposal, known as “Chat Control,” that would require secure messaging providers to scan the contents of messages. Every time this has come up since it was first introduced in 2022, it has been batted down—because no matter how you slice it, client-side scanning breaks end-to-end encryption. The Danish presidency seemed poised to succeed in passing Chat Control this year, but strong pushback from across the EU caused them to reconsider and rework their stance. In its current state, Chat Control isn’t perfect, but it at least includes strong language to protect encryption, which is good news for users.
Meanwhile, France tried to pass its own encryption-breaking legislation. Unlike Chat Control, which pushed for client-side scanning, France took a different approach: allowing so-called “ghost participants,” where law enforcement could silently join encrypted chats. Thankfully, the French National Assembly did the right thing and rejected this dangerous proposal.
It wasn’t all wins, though.
Perhaps the most concerning encryption issue is still ongoing in the United Kingdom, where the British government reportedly ordered Apple to backdoor its optional end-to-end encryption in iCloud. In response, Apple disabled one of its strongest security features, Advanced Data Protection, for U.K. users. After some back and forth with the U.S., the U.K. allegedly rewrote the demand to clarify that it applies only to British users. That doesn’t make it any better. Tribunal hearings are planned for 2026, and we’ll continue to monitor developments.
Speaking of developments to keep an eye on, the European Commission released its “Technology Roadmap on Encryption,” which discusses new ways for law enforcement to access encrypted data. There’s a lot that could happen with this roadmap, but let’s be clear here: EU officials should scrap any roadmap focused on encryption circumvention and instead invest in stronger, more widespread use of end-to-end encryption.
U.S. Attempts Fall Flat
The U.S. had its share of battles, too. The Senate re-introduced the STOP CSAM Act, which threatened to compromise encryption by requiring encrypted communication providers to have knowledge about what sorts of content their services are being used to send. The bill allows encrypted services to raise a legal defense—but only after they’ve been sued. That's not good enough. STOP CSAM would force encryption providers to defend against costly lawsuits over content they can't see or control. And a jury could still consider the use of encryption to be evidence of wrongdoing.
In Florida, a bill ostensibly about minors' social media use also just so happened to demand a backdoor into encryption services—already an incredible overreach. It went further, attempting to ban disappearing messages and grant parents unrestricted access to their kids’ messages as well. Thankfully, the Florida Legislature ended without passing it.
It is unlikely these sorts of attempts to undermine encryption will suddenly stop. But whatever comes next, EFF will continue to stand up for everyone's right to use encryption to have secure and private online communications.
This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.
Friday Squid Blogging: Squid Camouflage
New research:
Abstract: Coleoid cephalopods have the most elaborate camouflage system in the animal kingdom. This enables them to hide from or deceive both predators and prey. Most studies have focused on benthic species of octopus and cuttlefish, while studies on squid focused mainly on the chromatophore system for communication. Camouflage adaptations to the substrate while moving have recently been described in the semi-pelagic oval squid (Sepioteuthis lessoniana). Our current study focuses on the same squid’s complex camouflage to substrate in a stationary, motionless position. We observed disruptive, uniform, and mottled chromatic body patterns, and we identified a threshold of contrast between dark and light chromatic components that simplifies the identification of disruptive chromatic body pattern. We found that arm postural components are related to the squid position in the environment, either sitting directly on the substrate or hovering just a few centimeters above the substrate. Several of these context-dependent body patterns have not yet been observed in ...
IoT Hack
Someone hacked an Italian ferry.
It looks like the malware was installed by someone on the ferry, and not remotely.
EFF’s ‘How to Fix the Internet’ Podcast: 2025 in Review
2025 was a stellar year for EFF’s award-winning podcast, “How to Fix the Internet,” as our sixth season focused on the tools and technology of freedom.
Everywhere we turn, we see dystopian stories about technology’s impact on our lives and our futures—from tracking-based surveillance capitalism, to street-level government surveillance, to the dominance of a few large platforms choking innovation, to the growing efforts by authoritarian governments to control what we see and say. The landscape can feel bleak. Exposing and articulating these problems is important, but so is envisioning and then building solutions. That’s where our podcast comes in.
EFF's How to Fix the Internet podcast offers a better way forward. Through curious conversations with some of the leading minds in law and technology, EFF Executive Director Cindy Cohn and Activism Director Jason Kelley explore creative solutions to some of today’s biggest tech challenges. Our sixth season, which ran from May through September, featured:
- “Digital Autonomy for Bodily Autonomy” – We all leave digital trails as we navigate the internet—records of what we searched for, what we bought, who we talked to, where we went or want to go in the real world—and those trails usually are owned by the big corporations behind the platforms we use. But what if we valued our digital autonomy the way that we do our bodily autonomy? Digital Defense Fund Director Kate Bertash joined Cindy and Jason to discuss how creativity and community can align to center people in the digital world and make us freer both online and offline.
- “Love the Internet Before You Hate On It” – There’s a weird belief out there that tech critics hate technology. But do movie critics hate movies? Do food critics hate food? No! The most effective, insightful critics do what they do because they love something so deeply that they want to see it made even better. Molly White—a researcher, software engineer, and writer who focuses on the cryptocurrency industry, blockchains, web3, and other tech—joined Cindy and Jason to discuss working toward a human-centered internet that gives everyone a sense of control and interaction; open to all in the way that Wikipedia was (and still is) for her and so many others: not just as a static knowledge resource, but as something in which we can all participate.
- 2025-htfi-isabela-episode.png “Why Three is Tor's Magic Number” – Many in Silicon Valley, and in U.S. business at large, seem to believe innovation springs only from competition, a race to build the next big thing first, cheaper, better, best. But what if collaboration and community breeds innovation just as well as adversarial competition? Tor Project Executive Director Isabela Fernandes joined Cindy and Jason to discuss the importance of not just accepting technology as it’s given to us, but collaboratively breaking it, tinkering with it, and rebuilding it together until it becomes the technology that we really need to make our world a better place.
- 2025-htfi-harlo-episode.png “Securing Journalism on the ‘Data-Greedy’ Internet” – Public-interest journalism speaks truth to power, so protecting press freedom is part of protecting democracy. But what does it take to digitally secure journalists’ work in an environment where critics, hackers, oppressive regimes, and others seem to have the free press in their crosshairs? Freedom of the Press Foundation Digital Security Director Harlo Holmes joined Cindy and Jason to discuss the tools and techniques that help journalists protect themselves and their sources while keeping the world informed.
- 2025-htfi-deirdre-episode.png “Cryptography Makes a Post-Quantum Leap” – The cryptography that protects our privacy and security online relies on the fact that even the strongest computers will take essentially forever to do certain tasks, like factoring prime numbers and finding discrete logarithms which are important for RSA encryption, Diffie-Hellman key exchanges, and elliptic curve encryption. But what happens when those problems—and the cryptography they underpin—are no longer infeasible for computers to solve? Will our online defenses collapse? Research and applied cryptographer Deirdre Connolly joined Cindy and Jason to discuss not only how post-quantum cryptography can shore up those existing walls but also help us find entirely new methods of protecting our information.
- “Finding the Joy in Digital Security” – Many people approach digital security training with furrowed brows, as an obstacle to overcome. But what if learning to keep your tech safe and secure was consistently playful and fun? People react better to learning and retain more knowledge when they're having a good time. It doesn’t mean the topic isn’t serious—it’s just about intentionally approaching a serious topic with joy. East Africa digital security trainer Helen Andromedon joined Cindy and Jason to discuss making digital security less complicated, more relevant, and more joyful to real users, and encouraging all women and girls to take online safety into their own hands so that they can feel fully present and invested in the digital world.
- “Smashing the Tech Oligarchy” – Many of the internet’s thorniest problems can be attributed to the concentration of power in a few corporate hands: the surveillance capitalism that makes it profitable to invade our privacy, the lack of algorithmic transparency that turns artificial intelligence and other tech into impenetrable black boxes, the rent-seeking behavior that seeks to monopolize and mega-monetize an existing market instead of creating new products or markets, and much more. Tech journalist and critic Kara Swisher joined Cindy and Jason to discuss regulation that can keep people safe online without stifling innovation, creating an internet that’s transparent and beneficial for all, not just a collection of fiefdoms run by a handful of homogenous oligarchs.
- “Separating AI Hope from AI Hype” – If you believe the hype, artificial intelligence will soon take all our jobs, or solve all our problems, or destroy all boundaries between reality and lies, or help us live forever, or take over the world and exterminate humanity. That’s a pretty wide spectrum, and leaves a lot of people very confused about what exactly AI can and can’t do. Princeton Professor and “AI Snake Oil” publisher Arvind Narayanan joined Cindy and Jason to discuss how we get to a world in which AI can improve aspects of our lives from education to transportation—if we make some system improvements first—and how AI will likely work in ways that we barely notice but that help us grow and thrive.
- “Protecting Privacy in Your Brain” – Rapidly advancing "neurotechnology" could offer new ways for people with brain trauma or degenerative diseases to communicate, as the New York Times reported this month, but it also could open the door to abusing the privacy of the most personal data of all: our thoughts. Worse yet, it could allow manipulating how people perceive and process reality, as well as their responses to it—a Pandora’s box of epic proportions. Neuroscientist Rafael Yuste and human rights lawyer Jared Genser, co-founders of The Neurorights Foundation, joined Cindy and Jason to discuss how technology is advancing our understanding of what it means to be human, and the solid legal guardrails they're building to protect the privacy of the mind.
- “Building and Preserving the Library of Everything” – Access to knowledge not only creates an informed populace that democracy requires but also gives people the tools they need to thrive. And the internet has radically expanded access to knowledge in ways that earlier generations could only have dreamed of—so long as that knowledge is allowed to flow freely. Internet Archive founder and digital librarian Brewster Kahle joined Cindy and Jason to discuss how the free flow of knowledge makes all of us more free.
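The hardness gap at the heart of the post-quantum episode can be made concrete with a toy sketch. The numbers below are tiny textbook values (p = 61, q = 53), chosen only to illustrate why encrypting is cheap while breaking the key is not; this is not real cryptography.

```typescript
// Toy RSA-style demo (illustrative only, NOT real cryptography): security
// rests on the gap between fast modular exponentiation and slow factoring.
const p = 61n, q = 53n;          // secret primes (tiny, textbook values)
const n = p * q;                 // public modulus: 3233n
const e = 17n;                   // public exponent

// Fast: square-and-multiply modular exponentiation, O(log exp) multiplications.
function powMod(base: bigint, exp: bigint, mod: bigint): bigint {
  let result = 1n;
  base %= mod;
  while (exp > 0n) {
    if (exp & 1n) result = (result * base) % mod;
    base = (base * base) % mod;
    exp >>= 1n;
  }
  return result;
}

// Slow: recovering a prime factor by trial division scales with sqrt(n).
// For 2048-bit moduli this is classically infeasible, which is exactly the
// wall that Shor's algorithm on a large quantum computer would knock down.
function factor(n: bigint): bigint {
  for (let d = 2n; d * d <= n; d++) if (n % d === 0n) return d;
  return n;
}

const m = 65n;                   // message
console.log(powMod(m, e, n));    // 2790n: encrypting is cheap even for huge n
console.log(factor(n));          // 53n: easy only because n is tiny
```

With real 2048-bit keys, the first call still finishes in microseconds while the second would outlast the universe on classical hardware; post-quantum schemes replace factoring with problems believed hard even for quantum computers.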
This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.
Politicians Rushed Through An Online Speech “Solution.” Victims Deserve Better.
Earlier this year, both chambers of Congress passed the TAKE IT DOWN Act. This bill, while well-intentioned, gives powerful people a new legal tool to force online platforms to remove lawful speech that they simply don't like.
The bill, sponsored by Senate Commerce Chair Ted Cruz (R-TX) and Rep. Maria Salazar (R-FL), sought to speed up the removal of troubling online content: non-consensual intimate imagery (NCII). The spread of NCII is a serious problem, as is digitally altered NCII, sometimes called “deepfakes.” That’s why 48 states have specific laws criminalizing the distribution of NCII, in addition to the long-existing defamation, harassment, and extortion statutes—all of which can be brought to bear against those who abuse NCII. Congress can and should protect victims of NCII by enforcing and improving these laws.
Unfortunately, TAKE IT DOWN takes another approach: it creates an unneeded notice-and-takedown system that threatens free expression, user privacy, and due process, without meaningfully addressing the problem it seeks to solve.
While Congress was still debating the bill, EFF, along with the Center for Democracy & Technology (CDT), Authors Guild, Demand Progress Action, Fight for the Future, Freedom of the Press Foundation, New America’s Open Technology Institute, Public Knowledge, Restore The Fourth, SIECUS: Sex Ed for Social Change, TechFreedom, and Woodhull Freedom Foundation, sent a letter to the Senate outlining our concerns with the proposal.
First, TAKE IT DOWN’s removal provision applies to a much broader category of content—potentially any images involving intimate or sexual content—than the narrower NCII definitions found elsewhere in the law. We worry that bad-faith actors will use the law’s expansive definition to remove lawful speech that is not NCII and may not even contain sexual content.
Worse, the law contains no protections against frivolous or bad-faith takedown requests. Lawful content—including satire, journalism, and political speech—could be wrongly censored. The law requires apps and websites to remove content within 48 hours or face significant legal risk. That ultra-tight deadline means small apps and websites will have to comply so quickly that they won’t be able to investigate or verify claims.
Finally, there are no legal protections for providers when they believe a takedown request was sent in bad faith to target lawful speech. TAKE IT DOWN is a one-way censorship ratchet, and its fast timeline discourages providers from standing up for their users’ free speech rights.
This new law could lead to the use of automated filters that tend to flag legal content, from commentary to news reporting. Communications providers that offer users end-to-end encrypted messaging, meanwhile, may be served with notices they simply cannot comply with, since they cannot view the contents of messages on their platforms. Platforms could respond by abandoning encryption entirely so that they can monitor content, turning private conversations into surveilled spaces.
We asked for several changes to protect legitimate speech that is not NCII, and to include common-sense safeguards for encryption. Thousands of EFF members joined us by writing similar messages to their Senators and Representatives. Those efforts prompted several attempts to offer amendments during the committee process.
However, Congress passed the bill without those needed changes, and it was signed into law in May 2025. The main takedown provisions of the bill will take effect in 2026. We’ll be pushing online platforms to be transparent about the content they take down because of this law, and will be on the watch for takedowns that overreach and censor lawful speech.
This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2025.
Urban VPN Proxy Surreptitiously Intercepts AI Chats
This is pretty scary:
Urban VPN Proxy targets conversations across ten AI platforms: ChatGPT, Claude, Gemini, Microsoft Copilot, Perplexity, DeepSeek, Grok (xAI), Meta AI.
For each platform, the extension includes a dedicated “executor” script designed to intercept and capture conversations. The harvesting is enabled by default through hardcoded flags in the extension’s configuration.
There is no user-facing toggle to disable this. The only way to stop the data collection is to uninstall the extension entirely.
[…]
The data collection operates independently of the VPN functionality. Whether the VPN is connected or not, the harvesting runs continuously in the background...
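For readers wondering how an extension pulls this off: a content script only needs to wrap the page's fetch so that chat requests are copied out before they are sent. The sketch below is a hypothetical illustration of that pattern, not code from Urban VPN; every name, endpoint, and URL pattern in it is invented.

```typescript
// Hypothetical sketch of the "executor" pattern described above: wrap a
// fetch-like function so that request bodies headed for a chat endpoint are
// silently copied to a collector before the real request proceeds.
// Every name here is illustrative; none of this is Urban VPN's actual code.
type FetchLike = (url: string, init?: { body?: string }) => Promise<string>;

function makeInterceptingFetch(
  realFetch: FetchLike,
  exfiltrate: (data: string) => void, // in a real extension: e.g. a beacon to a collector
): FetchLike {
  return async (url, init) => {
    // Match the platform's conversation API (this pattern is an assumption).
    if (url.includes("/conversation") && init?.body) {
      exfiltrate(init.body); // silent copy of the user's prompt
    }
    return realFetch(url, init); // the original request is unaffected
  };
}

// Demo with a stub fetch: the user sees a normal reply while the
// collector receives the prompt.
const captured: string[] = [];
const stubFetch: FetchLike = async () => "ok";
const spyFetch = makeInterceptingFetch(stubFetch, (d) => captured.push(d));

spyFetch("https://chat.example.com/backend/conversation", { body: '{"prompt":"hi"}' })
  .then((reply) => console.log(reply, captured)); // reply is "ok"; captured holds the prompt
```

Because the copy happens before the request is forwarded unchanged, the chat works exactly as the user expects, which is why harvesting like this can run unnoticed whether or not the VPN is even connected.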
