EFF: Updates

EFF's Deeplinks Blog: Noteworthy news from around the internet

Congress Is Dropping the Ball with a Clean Extension of FISA

Fri, 03/20/2026 - 6:20pm

Two years ago, Congress passed the “Reforming Intelligence and Securing America” Act (RISAA) that included nominal reforms to Section 702 of the Foreign Intelligence Surveillance Act (FISA). The bill unfortunately included some problematic expansions of the law—but it also included a relatively big victory for civil liberties advocates: Section 702 authorities were only extended for two years, allowing Congress to continue the important work of negotiating a warrant requirement for Americans as well as some other critical reforms.

However, Congress clearly did not continue this work. In fact, it now appears that Congress is poised to consider another extension of this program without even attempting to include necessary and common sense reforms. Most notably, Congress is not considering a requirement to obtain a warrant before looking at data on U.S. persons that was indiscriminately and warrantlessly collected. House Speaker Mike Johnson confirmed that “the plan is to move a clean extension of FISA … for at least 18 months.” 

Even more disappointing, House Judiciary Chair Jim Jordan, who has previously been a champion of both the warrant requirement and closing the data broker loophole, told the press he would vote for a clean extension of FISA, claiming that RISAA included enough reforms for the moment.

It’s important to note RISAA was just a reauthorization of this mass surveillance program with a long history of abuse. Prior to the 2024 reauthorization, Section 702 was already misused to run improper queries on peaceful protesters, federal and state lawmakers, Congressional staff, thousands of campaign donors, journalists, and a judge reporting civil rights violations by local police. RISAA further expanded the government’s authority by allowing it to compel a much larger group of people and providers into assisting with this surveillance. As we said when it passed, overall, RISAA is a travesty for Americans who deserve basic constitutional rights and privacy whether they are communicating with people and services inside or outside of the US.

Section 702 should not be reauthorized without any additional safeguards or oversight. Fortunately, there are currently three reform bills for Congress to consider: SAFE, PLEWSA, and GSRA. While none of these bills are perfect, they are all significantly better than the status quo, and should be considered instead of a bill that attempts no reform at all. 

Mass spying—accessing a massive amount of communications by and with Americans first and sorting out targets second, and secretly—has always been a problem for our rights. It was a problem when President George W. Bush first authorized it in secret, without Congressional or court oversight. And it remained a problem even after the passage of Section 702 in 2008 created the possibility of some oversight. Congress was right that this surveillance is dangerous, and that's why it set Section 702 up for regular reconsideration. That reconsideration has not occurred, even as the leadership of the NSA, Justice Department, and FBI has radically changed. Reform is long overdue, and now it's urgent.

FCC Chair Carr’s Threats to Punish Broadcasters Are Unconstitutional

Fri, 03/20/2026 - 11:08am

EFF joined other digital rights and civil liberties organizations in calling out the unconstitutionality of Federal Communications Commission chair Brendan Carr’s recent threats to punish broadcasters for airing statements he disagrees with. 

Carr’s recent threats, like his past threats, are unconstitutional efforts to coerce news coverage that favors President Donald Trump. His claim that the FCC’s “public interest” standard allows him and the commission to revoke the licenses of broadcasters who publish news that is unflattering to the government is wrong, and anathema to our country’s core constitutional values.

The First Amendment constrains the FCC’s authority to force broadcasters to toe the government’s line, even though broadcast licensees are required to operate in the “public interest, convenience, and necessity.” Restrictions on licensees’ speech, especially viewpoint-based limitations, are still subject to First Amendment scrutiny even if, in some circumstances, that scrutiny differs somewhat from that applied to non-broadcast media. And the “public interest” requirement has never been interpreted to allow the type of viewpoint-based punishment that Carr has threatened here.

Everyone agrees that news reporting should strive for accuracy, but Carr’s threats have little to do with that. Instead, his allegations of “falsity” are a proxy for retaliation based on (1) Carr’s subjective policy disagreements; (2) any criticism of Trump and the administration broadly; and (3) treatment of anything that is not the official US government line about the Iran War as “false.”

We join the call for Carr to withdraw these threats.

 

Bonus Podcast Episode: Privacy’s Defender - Cindy Cohn with Cory Doctorow

Tue, 03/17/2026 - 4:03am

While How to Fix the Internet is on hiatus, we wanted to share a great conversation with you from last week. EFF Executive Director Cindy Cohn spoke with bestselling novelist, journalist, and EFF Special Advisor Cory Doctorow about Cindy’s new book, “Privacy’s Defender: My Thirty-Year Fight Against Digital Surveillance” (MIT Press).


You can also listen to this episode on the Internet Archive or watch the video on YouTube.

Part memoir, part battle cry, “Privacy’s Defender” is the story of Cindy’s fights alongside the visionaries who looked at the early internet and understood that the legal and political battles over this new technology - the Crypto Wars, the NSA’s dragnet, the FBI gag orders - were really over the future of free speech, privacy, and power for all. 

This conversation was recorded on Tuesday, March 10 in front of a packed house at San Francisco’s iconic City Lights Bookstore. For more about the book and Cindy’s national book tour - with stops in places including Seattle, Silicon Valley, Denver, Boston, Ann Arbor, Iowa City, Washington DC and New York City - check out https://www.eff.org/Privacys-Defender

And finally, stay tuned to this feed; we’re working on a special podcast series featuring key players and moments from the book! 


Blocking the Internet Archive Won’t Stop AI, But It Will Erase the Web’s Historical Record

Mon, 03/16/2026 - 3:26pm

Imagine a newspaper publisher announcing it will no longer allow libraries to keep copies of its paper. 

That’s effectively what’s begun happening online in the last few months. The Internet Archive—the world’s largest digital library—has preserved newspapers since it went online in the mid-1990s. The Archive’s mission is to preserve the web and make it accessible to the public. To that end, the organization operates the Wayback Machine, which now contains more than one trillion archived web pages and is used daily by journalists, researchers, and courts.

But in recent months The New York Times began blocking the Archive from crawling its website, using technical measures that go beyond the web’s traditional robots.txt rules. That risks cutting off a record that historians and journalists have relied on for decades. Other newspapers, including The Guardian, seem to be following suit. 
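For context, the web’s traditional opt-out mechanism is robots.txt: a plain-text file served at a site’s root that asks compliant crawlers to stay away. A publisher that merely wanted to limit archiving could single out the Wayback Machine’s crawler, historically identified by the `ia_archiver` user agent. This sketch is illustrative only, not the Times’ actual configuration:

```
# Served at https://example-news-site.com/robots.txt
# Ask the Internet Archive's crawler not to fetch any pages
User-agent: ia_archiver
Disallow: /

# All other crawlers may proceed (an empty Disallow permits everything)
User-agent: *
Disallow:
```

Going “beyond robots.txt” means refusing archival crawlers at the network level rather than merely asking them, via a file like this, to stay away.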

For nearly three decades, historians, journalists, and the public have relied on the Internet Archive to preserve news sites as they appeared online. Those archived pages are often the only reliable record of how stories were originally published. In many cases, articles get edited, changed, or removed—sometimes openly, sometimes not. The Internet Archive often becomes the only source for seeing those changes. When major publishers block the Archive’s crawlers, that historical record starts to disappear.

The Times says the move is driven by concerns about AI companies scraping news content. Publishers seek control over how their work is used, and several—including the Times—are now suing AI companies over whether training models on copyrighted material violates the law. There’s a strong case that such training is fair use.

Whatever the outcome of those lawsuits, blocking nonprofit archivists is the wrong response. Organizations like the Internet Archive are not building commercial AI systems. They are preserving a record of our history. Turning off that preservation in an effort to control AI access could essentially torch decades of historical documentation over a fight that libraries like the Archive didn’t start, and didn’t ask for. 

If publishers shut the Archive out, they aren’t just limiting bots. They’re erasing the historical record. 

Archiving and Search Are Legal 

Making material searchable is a well-established fair use. Courts have long recognized it’s often impossible to build a searchable index without making copies of the underlying material. That’s why when Google copied entire books in order to make a searchable database, courts rightly recognized it as a clear fair use. The copying served a transformative purpose: enabling discovery, research, and new insights about creative works. 
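The mechanics make this point concrete: a search index is, at bottom, a lookup table that can only be built by reading (and therefore copying) every document it covers. A minimal sketch in Python, using invented documents:

```python
# A minimal inverted index: to answer "which pages mention this word?",
# the indexer must first hold a copy of each page's text.
from collections import defaultdict

def build_index(pages):
    """Map each word to the set of page IDs whose text contains it."""
    index = defaultdict(set)
    for page_id, text in pages.items():  # the indexer reads a copy of `text`
        for word in text.lower().split():
            index[word].add(page_id)
    return index

# Invented example documents, for illustration only
pages = {
    "a": "archived news article about privacy",
    "b": "news report on surveillance",
}
index = build_index(pages)
print(sorted(index["news"]))  # → ['a', 'b']
```

The index answers queries without republishing the originals, which is why courts treated the copying step as serving a transformative purpose.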

The Internet Archive operates on the same principle. Just as physical libraries preserve newspapers for future readers, the Archive preserves the web’s historical record. Researchers and journalists rely on it every day. According to Archive staff, Wikipedia alone links to more than 2.6 million news articles preserved at the Archive, spanning 249 languages. And that’s only one example. Countless bloggers, researchers, and reporters depend on the Archive as a stable, authoritative record of what was published online.

The same legal principles that protect search engines must also protect archives and libraries. Even if courts place limits on AI training, the law protecting search and web archiving is already well established.

The Internet Archive has preserved the web’s historical record for nearly thirty years. If major publishers begin blocking that mission, future researchers may find that huge portions of that historical record have simply vanished. There are real disputes over AI training that must be resolved in courts. But sacrificing the public record to fight those battles would be a profound, and possibly irreversible, mistake. 

The Foilies 2026

Sun, 03/15/2026 - 11:41am
Recognizing the Worst in Government Transparency 

The Foilies were written by EFF's Beryl Lipton, Dave Maass and Aaron Mackey and MuckRock's Dillon Bergin, Kelly Kauffman and Anna Massoglia.

For the last six years, a class of journalism students at the University of Nevada, Reno, has kicked off each semester by filing their first Freedom of Information Act (FOIA) requests.

The assignment: Request copies of complaints sent to the Federal Communications Commission (FCC) about their favorite TV show, a local radio station, or a major broadcast event, such as the Grammys or the Super Bowl halftime show. The students are learning that the federal government and every state have laws establishing the public's right to request and receive public records. It's a bedrock principle of democracy: If a government belongs to the people, so do its documents. 

In the past, the FCC always provided records within a few weeks, if not days. But that changed in September when students requested consumer complaints filed against NPR and PBS stations to see if there was absolutely anything at all to merit defunding public media. Seven months later — crickets. 

Now the students are learning to persevere even when public officials demonstrate an utter disdain for transparency. And The Foilies are here for it. 

Established in 2015, The Foilies are an annual project by the Electronic Frontier Foundation and MuckRock to recognize the agencies, officials and contractors that thwart the public's right to know. We give out these tongue-in-cheek "awards" during Sunshine Week (March 15-21), a collective effort by media and advocacy organizations to highlight the importance of open government.  

This year, we've got a few "winners" whose behavior defies belief. 

But it's not all negative. Those same Reno students are also assigned to file public records requests for restaurant health inspections. This semester, the records started to show up in their inboxes within 20 minutes. 

If every agency followed Northern Nevada Public Health's example, we could sunset this Sunshine Week project. 


The Love Letters Award - Gov. Greg Abbott 

Last spring, the office of Texas Gov. Greg Abbott withheld communications between the governor and one of the state’s most powerful business figures, Elon Musk. The office claimed that the communications were exempt from public records law because they would reveal confidential legal and policy discussions, including how the state entices private companies to do business in Texas, or “intimate and embarrassing” information.

The claims were unelaborated boilerplate language based on exemptions in Texas’ public records law. But if you’re wondering what “intimate” and “embarrassing” exchanges Abbott and Elon Musk shared over email, you may be waiting a while.

Last fall, the Office of the Texas Attorney General ordered Texas Gov. Greg Abbott’s office to release nearly 1,400 pages of communications between Abbott and Musk. About 1,200 of those pages were fully redacted–just sheets of gray obscuration. The records that were released don’t reveal much more than an invitation to a happy hour or a reminder of the next SpaceX launch.

The Surcharge, Eh? Award - Vancouver, B.C. 

Vancouver residents must now pay twice for public records. Despite taxes already funding the creation and storage of government records, the City Council approved charging people $10 Canadian (about $7.33 in the United States) every time they ask for “non-personal” public records.

Officials claim the fee is necessary to deter misuse and cover some administrative costs. The only people abusing anything, however, are the officials who imposed this tax on the public. The message Vancouver is sending is as crisp as a newly minted $10 note: Secrecy is a higher priority than public accountability.

The Shady Screenshot Award - Department of Homeland Security 

The Department of Homeland Security’s banner year of lawlessness included backsliding on its transparency obligations.

In response to a request from the nonprofit American Oversight, DHS stated that it was no longer automatically archiving text messages sent between officials. The department clarified that it had a new, and much worse, records retention policy. Instead of archiving officials’ text messages as the agency had done before, DHS now asks officials to take screenshots of any text messages about government business sent on their work phones.

It’s hard to see the change as anything more than a giant middle finger to the public, especially because the Federal Records Act requires agencies to retain all records officials create while conducting their public duties, regardless of format. We won’t hold our breath waiting on DHS officials to dutifully press the volume and power button on their phones to record every text message they send and receive. 

The Discardment of Government Efficiency Award - DOGE 

As the Trump administration took over last year, there was a looming threat over government transparency: the so-called Department of Government Efficiency, also known as DOGE. 

Billionaire Elon Musk, soon to be the de facto leader of DOGE, proudly claimed “there should be no need for FOIA requests” and “all government data should be default public for maximum transparency.” What quickly became apparent was there may be no need for FOIA requests, because there may be no FOIA officers to fulfill those requests.

DOGE quickly went to work slashing through the federal government, including seizing control of the U.S. Institute of Peace. Part of the takeover included restricting access to the agency’s FOIA system and firing the employees responsible for fulfilling FOIA requests, according to a letter sent to Bloomberg reporter Jason Leopold. Meanwhile, when CNN filed a FOIA request with the Office of Personnel Management (OPM) for information about Musk and DOGE's security clearance, they were told: "Good luck with that," because the FOIA officers had been fired. 

DOGE also argued that its own records are exempt from FOIA under the Presidential Records Act, meaning records cannot be accessed until five years after President Donald Trump is out of office. 

While DOGE “doesn’t exist” anymore according to the OPM, there remains a lasting dark mark on the state of FOIA and records management. 

The Secret Eyes in the Sky Award - Chula Vista Police Department, Calif.

In 2021, Arturo Castañares at La Prensa San Diego filed a request with the Chula Vista Police Department for copies of videos taken by drones responding to 911 calls as part of the city's "drone as first responder" program. One of the goals was to evaluate the technology’s efficacy and risks to civil liberties. 

The city worked overtime to maintain the secrecy of the footage at the same time officials publicly touted the drones as a revolution in policing. That’s some impressive trust-us-but-don’t-verify chutzpah.

The city argued that every second of every video recorded by its drones was categorically off limits because they were law enforcement investigative records. They even got a trial court to initially buy the argument.

But an appellate court ruled that the investigatory records exemption is more limited, shielding only drone footage that is part of a criminal investigation or evidence of a suspected crime. Footage of wildfires, car wrecks, wild animal sightings and the like are not criminal investigations and must be disclosed.

The California Supreme Court rejected both of CVPD's appeals and a trial court bench slapped the city for inaccurate and incomplete court filings. In the end, the city had to shell out north of $400,000 to its outside lawyers, and then paid Castañares’ lawyers more than $500,000 when he prevailed. 

So what were Chula Vista police hiding? A bunch of routine service calls, such as unverified reports of a vehicle fire and a vehicle collision.

Now, according to La Prensa's reporting, officials are trying to raid a public safety fund created by voters to reimburse the city for the cost of its ill-advised secrecy. 

The City of Darkness Award - Richmond, Va. 

Richmond’s creation of a new FOIA Library may seem like a step toward transparency, but the city’s commitment is in question after it put the same officials who are subject to records requests in charge of curating which records might be released.

Faced with a plan to post all of the city’s eligible public records released under Virginia’s “sunshine” law, the Richmond City Council instead opted to go with the mayor’s alternative proposal. That plan lets the mayor’s administration — the same one that might be the subject of those records — decide what’s worth posting to the library.

Instead of providing access to all public records that the city released under the Virginia Freedom of Information Act, the library will only contain a subset that officials believe meet certain criteria, including records that the administration deems “relevant” to city business or that would aid “accountability.” The city cites concerns that “transparency without context” might be too confusing for the average citizen. Forgive us for having more faith in Richmond residents than its leaders do.

The city’s secrecy shenanigans extend beyond the FOIA library.

In an ongoing legal battle, attorneys representing Richmond asked a judge to prohibit former city FOIA officer Connie Clay from filing FOIA requests seeking information about her firing, and sought a gag order to prevent her from talking about the case. Clay alleges she was fired for insisting the city comply with public records law, describing what she calls a “chaotic and mismanaged” and illegal FOIA request process. Rather than agree to a $250,000 settlement, Richmond has spent more than $633,000 in taxpayer funds on legal costs. The trial and the FOIA library launch are both slated for the summer of 2026. 

The Flock You Awards - Multiple Winners

If you live in one of the 5,000 cities where surveillance vendor Flock Safety claims to have established relationships with local cops, you may have noticed the sudden installation of little black cameras on poles by the side of the road or at intersections. These are automated license plate readers (ALPRs), which document every vehicle that passes within view, including the license plate, color, make, model and other distinguishing characteristics. The images are fed to Flock's servers, and the company encourages police to share the images collected locally with law enforcement throughout the country. Each year, law enforcement agencies across the country conduct tens of millions of searches of each other's databases. 

In 2025, journalists and privacy advocates started filing public records requests with agencies to get spreadsheets called a "Network Audit," which shows every search, including who ran it and why. Accessing these audits uncovered abuses of the system, including: investigating a woman who received an abortion, targeting immigrants, surveilling protesters, and running racist searches targeting Roma people.

In response, some cities have terminated their contracts with Flock Safety. Other law enforcement agencies, and Flock itself, have gone a different direction: 

Taunton Police Department, Mass.: The police department told the ACLU of Massachusetts to cough up $1.8 million if the organization wanted its network audit logs–the highest public records fee we documented this year. The civil liberties group filed requests with agencies throughout the state for the audits, and most agencies handed over the spreadsheets for free and with little fanfare. Taunton, however, said it would take 20,000 hours to process the request, at $86.57 an hour. 

Orange County Sheriff's Department, Calif.: The Orange County Sheriff gave a number of reasons it wouldn't release the network audit logs in response to a public records request. The most inane (and misspelled) one: It would "disincentive law enforcement from conducting such research." Aren't cops the ones who say if you’re not doing anything wrong, you've got nothing to hide? Well, well, well, how the tables have turned.

Flock Safety: The company responded to criticisms of its ALPR network by sending legal threats aimed at trying to silence its critics. First, the company used a bogus trademark claim to threaten DeFlock.me–a crowdsourced map of ALPRs. (EFF represented its creator.) Then it hired a company to try to get the hosts of HaveIBeenFlocked.com, which provides an interface for searching these network audits, to remove the site from the internet.

The Database Deletion Award - Muneeb and Sohaib Akhter, formerly of Opexus

Brothers Muneeb and Sohaib Akhter are accused of essentially hitting delete on government data, destroying access to information contained in millions of records. 

The government hired a federal contractor called Opexus, which hosts data and provides services to dozens of federal agencies. The company employed the Akhter siblings, though in February 2025, Opexus learned about the brothers’ previous convictions for wire fraud and obstructing justice. Soon after, the company fired the pair. But, according to prosecutors, the two decided to double down on being wildly unsuited for administrative access to government records systems. 

The Akhters immediately turned around and retaliated “by accessing computers without authorization, issuing commands to prevent others from modifying the databases before deletion, deleting databases, stealing information, and destroying evidence of their unlawful activities," according to the U.S. Department of Justice.

The two have been accused of deleting 96 government databases, many of which contained FOIA records and sensitive investigative files. Their indictment alleges that, a minute after the deletions, one brother queried an artificial intelligence tool for “how to clear system logs following the deletion of databases.” The brothers are also charged with stealing government records and conspiracy to commit computer fraud.

The Brothers Akhter allegedly took mere moments to destroy untold amounts of information that belonged to the public. Though they could face decades in prison, the public may never know the extent of the damage.

Want more FOIA horror stories? Check out The Foilies archives!

EFF Launches New Fight to Free the Law

Fri, 03/13/2026 - 3:02pm

EFF is filing a lawsuit against the Consumer Product Safety Commission (CPSC) to ensure that the public has full access to the laws that govern us.

Our client Public.Resource.Org (Public Resource), a tiny non-profit founded by open records advocate Carl Malamud, has a mission that’s both simple and powerful: to make government information more accessible. Public Resource acquires and makes available online a wide variety of public documents such as tax filings, government-produced videos, and federal rules about safety and product designs. Those rules are initially created through private standards organizations and later incorporated into federal law. Such documents are often difficult to access otherwise, meaning the public cannot read, share, or comment on them. 

Working with Harvard Law School’s Cyberlaw Clinic, Public Resource has been submitting Freedom of Information Act requests to the CPSC requesting copies of the legally binding safety codes for children’s products—an area of law of intense interest to child safety advocates and consumer advocates, not to mention the families who use those products. But CPSC says it can’t release the codes, because the private association that coordinated their initial development insists that it retains copyright in them even after they have been adopted into law. That’s like saying a lobbyist who drafted a new tax law gets to control who reads it or shares it, even after it becomes a legal mandate.

Faced with similar claims, some courts, including the Court of Appeals for the Fifth Circuit, have held that the safety codes lose copyright protection when they are incorporated into law. Others, like the D.C. Circuit (in a case EFF defended on Public Resource’s behalf), have held that even if the standards lose copyright once they are incorporated into law, making them fully accessible and usable online is a lawful fair use. 

Now EFF has teamed up with the Cyberlaw Clinic to continue the fight. We’re asking a court to rule that copyright is no barrier to accessing and sharing the rules that are supposed to ensure the safety of our built environment and the products we use every day. With the rule of law under assault around the nation, it is more important than ever to defend our ability to read and speak the law, without restrictions.

A.B. 1043’s Internet Age Gates Hurt Everyone

Thu, 03/12/2026 - 3:59pm

EFF has long warned against age-gating the internet. Such mandates strike at the foundation of the free and open internet. They create unnecessary and unconstitutional barriers for adults and young people to access information and express themselves online. They hurt small and open-source developers. And none of the available age verification options are perfect in terms of protecting private information, providing access to everyone, and safely handling sensitive data. 

Last year, EFF raised concerns about A.B. 1043 as one of several bills in the California legislature that took the wrong approach to protecting young people online—by focusing on censorship rather than privacy. Now that A.B. 1043 is set to go into effect in 2027, we've received a lot of questions about its possible effects. 

A.B. 1043’s Censorship Trap

Even proposals that may not explicitly mandate age verification, such as A.B. 1043, can still create many of the same censorship problems. A.B. 1043 requires all operating systems and app stores to create age bracketing systems that segment their users based on their ages. Users are then required to provide operating systems and apps with their birth date or age so that they can be placed in their respective age bracket. A.B. 1043 also requires application and software developers to collect this age bracket information when a user wants to use that software or application.

A.B. 1043 treats the age-bracket signal sent by a user as giving the application or service actual knowledge of users’ ages. Knowledge that the user is a minor could provide the basis for liability under other laws, such as the California Age-Appropriate Design Code.

The result is a recipe for censorship. Application and software developers may interpret A.B. 1043 and its potential enforcement by the California Attorney General as requiring them to exclude users who say they are minors, or who don’t fit into an age bracket the developers believe is acceptable for their application or software. But minors have a First Amendment right to access the vast majority of these apps and services. What California has done is essentially outsource censorship to developers, who are likely to err on the side of over-censorship.

Broad Language Undercuts Policy Goals

A.B. 1043’s one-size-fits-all approach is also problematic because it disregards the many ways in which we make and use digital tools. It assumes the internet and digital devices begin and end with the dominant technology companies and device makers, when we know that’s not the case. Additionally, many families share devices, especially in low-income households. These proposals do not account for situations where there is more than one user of a device.

Additionally, broad proposals that demand the implementation of such censorship tools under the guise of protecting young people's safety force developers to reach for imperfect solutions—or risk being found non-compliant and pushed out of markets. Many of these mandates imagine technology that does not currently exist. Such poorly thought-out mandates, in truth, cannot achieve the purported goal of age verification. Often, they are easy to circumvent and many also expose consumers to real data breach risk.

Squeezing Small and Open-Source Developers Hurts Everyone

A.B. 1043’s burdens fall particularly heavily on developers who aren’t at large, well-resourced companies, such as those developing open-source software. Not recognizing the diversity of software development when thinking about liability in these proposals effectively limits software choices—which is especially harmful at a time when computational power is being rapidly concentrated in the hands of the few. This harms users' and developers' right to free expression, their digital liberties, privacy, and ability to create and use open platforms. It also, perversely, entrenches the dominance of major operating system developers and device makers.

A.B. 1043 and similar proposals also raise considerable implementation issues because they cast a potentially wide net. A.B. 1043, for example, carves out “broadband internet access service,” “telecommunications service,” and the “use of a physical product,” while covering “mobile devices” and “computers.” But those covered categories are expansive: people consider smart watches to be computers, for example, and virtually every digital device that runs software built in the past three decades could qualify. This means that consumers may have to submit age information to more companies than ever, again increasing the possibility of data misuse and data breach.

There Is Still A Better Way

Legislators do not need to sacrifice their constituents' First Amendment rights and privacy to make a safer internet; they can still address many of the harms these proposals seek to mitigate. Many lawmakers have recognized better approaches, such as data minimization, in their proposals. Rather than creating age gates, a well-crafted privacy law that empowers all of us—young people and adults alike—to control how our data is collected and used would be a crucial step in the right direction.

Rep. Finke Was Right: Age-Gating Isn’t About Kids, It’s About Control

Thu, 03/12/2026 - 3:32pm

When Rep. Leigh Finke spoke last month before the Minnesota House Commerce Finance and Policy Committee to testify against HF1434, a sweeping proposal to age-gate the internet, she began with something disarming: agreement.

“I want to support the basic part of this,” she said, referring to the shared goal of protecting young people online. That goal is not controversial: everyone wants kids to be safe. But HF1434, Minnesota’s proposed age-verification bill, simply won’t “protect children.” It would mandate that websites hosting speech that is protected by the First Amendment for both adults and young people verify users’ identities, often through government IDs or biometric data. As we’ve discussed before, the bill’s definition of speech that lawmakers deem “harmful to minors” is notoriously broad—broad enough to sweep in lawful, non-pornographic speech about sexual orientation, sexual health, and gender identity.

Rep. Finke, an openly transgender lawmaker, next raised a point that her critics have since tried to distort: age-verification laws like the Minnesota bill are already being used to block young LGBTQ+ people from exercising their First Amendment rights to access information that may be educational, affirming, or life-saving. Referencing the Supreme Court case Free Speech Coalition v. Paxton, she noted that state attorneys general have been “almost jubilant” about the ability to use these laws to restrict queer youth from accessing content. “We know that ‘prurient interest’ could be for many people, the very existence of transgender kids,” she added, referring to the malleable legal standard that would govern what content must be age-gated under the law. 

But despite years’ worth of evidence to back her up, Finke has faced a wave of attacks from countless media outlets and religious advocacy groups for her statements. Rep. Finke’s testimony was repeatedly mischaracterized as not having young people’s best interests in mind, when really she was accurately describing the lived reality of LGBTQ+ youth and advocating in support of their access to vital resources and community.

In fact, this backlash proves her point. Beyond attempting to silence queer voices and to scare other legislators from speaking up against these laws, it reveals how age-verification mandates are part of a larger effort to give the government much greater control of what young people are allowed to say, read, or see online. 

Rep. Finke was also right that these proposals are bad policy—they prevent all young people from finding community online—and that they violate young people’s First Amendment rights.

Why FSC v. Paxton Matters

Rep. Finke was similarly right to bring up the Paxton case, because beyond the troubling Supreme Court precedent it produced, Texas’s age-verification law also drew eager support from an extraordinary number of amicus briefs from anti-LGBTQ organizations (some even designated hate groups by the Southern Poverty Law Center). 

In FSC v. Paxton, the Supreme Court gave Texas the green light to require age verification for sites where at least one-third of the content is sexual material deemed “harmful to minors,” which generally means explicit sexual content. This ruling, premised on the view that young people do not have a First Amendment right to access explicit sexual content, allows states to enact onerous age-verification rules that will block adults from accessing lawful speech, curtail their ability to be anonymous, and jeopardize their data security and privacy. These are real and immense burdens on adults, and the Court was wrong to ignore them in upholding Texas’ law.

But laws enacted by other states and Minnesota HF 1434 go further than the Texas statute. Rather than restricting minors from accessing sexual content, these proposals expand what the state deems “harmful to minors” to include any speech that may reference sex, sexuality, gender, and reproductive health. But young people have a First Amendment right both to speak on those topics and to access information online about them.

We will continue to fight against all online age restrictions, but bills like Minnesota’s HF 1434, which seek to restrict minors from accessing speech about their bodies, sexuality, and other truthful information, are especially pernicious.

EFF and Rep. Finke are on the same page here: age verification mandates create immense harm to our First Amendment rights, our right to privacy, as well as our online safety and security. These proposals also fully ignore the reality that LGBTQ young people often rely on the internet for information they cannot get elsewhere. 

But the Paxton case, and the coalition behind it, illustrates exactly how these laws can be weaponized. They weren’t there just to stand up for young people’s privacy online—they were there to argue that the state has a compelling interest in shielding minors from material that, in practice, often includes LGBTQ content. Ultimately, these groups would like to age-gate not just porn sites, but also any content that might discuss sex, sexuality, gender, reproductive health, abortion, and more.

Using Children as Props to Enact Censorship 

The coalition of organizations that filed amicus briefs in support of Texas’s age verification law tells us everything we need to know about the true intentions behind legislating access to information online: censorship, surveillance, and control. After all, if the race to age-gate the internet was purely about child safety, we would expect its strongest supporters to be child-development experts or privacy advocates. Instead, the loudest advocates are organizations dedicated to policing sexuality, attacking LGBTQ+ folks and reproductive rights, and censoring anything that doesn’t fit within their worldview.

Below are some of the harmful agendas that organizations supporting the age-gating movement are advancing, and how their arguments echo in the attacks on Rep. Finke today:

Policing sexuality, bodily autonomy, and reproductive rights

Many of the organizations backing age-verification laws have spent decades trying to restrict access to accurate sexual health information and reproductive care.

Groups like Exodus Cry, for example, which filed a brief in support of the Texas AG in the SCOTUS case, frame pornography as part of a broader moral crisis. Founded by a “Christian dominionist” activist, Exodus Cry advocates for the criminalization of porn and sex work, and promotes a worldview that defines “sexual immorality” as any sexual activity outside marriage between one man and one woman. Its leadership describes the internet as a battleground in a “pornified world” that has to be reclaimed.

Another brief in support of the age-verification law was filed by a group of organizations including the Public Advocate of the United States (an SPLC-designated hate group) and America’s Future, an organization that was formed to “revitalize the role of faith in our society” and fiercely advocates in favor of trans sports bans.

These groups see age-verification laws as attractive solutions because they create a legal mechanism to wall off large swaths of content that merely mentions sex from not only young people but millions of adults, too.

Attacking LGBTQ+ Rights

Several of the most prominent legal advocates behind age-verification laws have also led the crusade against LGBTQ+ equality. The internet that these groups envision is one that heavily censors critical and even life-saving LGBTQ+ resources, community, and information. 

The Alliance Defending Freedom (ADF), for instance (another SPLC-designated hate group), built its reputation on litigation aimed at rolling back LGBTQ+ protections—including allowing businesses to refuse service to same-sex couples, criminalizing same-sex relationships abroad, and restricting transgender rights.

Then there are other groups, like Them Before Us and the Women’s Liberation Front, both of which submitted amicus briefs in support of the Texas Attorney General and are devoted to upending LGBTQ+ rights in the United States. Them Before Us says it’s “committed to putting the rights and well-being of children ahead of the desires and agendas of adults.” But it’s also running a campaign to “End Obergefell,” the 2015 Supreme Court case that recognized the right to same-sex marriage, and has been on the cutting edge of transphobic campaigning and pseudoscientific fearmongering about IVF and surrogacy. The Women’s Liberation Front, meanwhile, has a long track record of supporting transphobic policies such as bathroom bills, bans on gender-affirming healthcare, and efforts to define “sex” strictly as the biological sex assigned at birth.

Through cases like FSC v. Paxton, groups like these three continue to advance a vision of society that creates government mandates to enforce their worldviews over personal freedom, while hiding behind a shroud of concern for children’s safety. But when they also describe LGBTQ+ people as “evil” threats to children and run countless campaigns against their human rights, they are being clear about their intentions. This is why we continue to say: the impact of age verification measures goes beyond porn sites.

Expanding censorship beyond the internet into real-life public spaces

As we’ve said for years now, the push to age-gate the internet is part of a broader campaign to control what information people can access in public life both on- and offline. Many of the same organizations advancing these proposals claim to be acting on behalf of young people, but their arguments consistently use children as props to justify giving the government more control over speech and information.

Many of the organizations advocating for online age verification have also supported book bans, attacks on DEI policies and education, and efforts to remove LGBTQ+ materials from schools and libraries. Two of the organizations that supported the Texas Attorney General, Citizens Defending Freedom and the Manhattan Institute, have led campaigns around the country to “abolish DEI” and to ban classic books like “The Bluest Eye” by Toni Morrison from school libraries. These efforts are not different from the efforts to restrict access to the internet—they reflect a broader strategy to restrict access to ideas or information that these groups find objectionable. And they discourage free thought, inquiry, and the ability for people to decide how to live their lives.

These campaigns rely on the same core argument: that certain ideas are inherently dangerous to young people and must therefore be restricted. But that framing ignores an important reality: if lawmakers genuinely want to address harms that young people experience online, they should start by listening to young people themselves. When EFF spoke directly with young people about their online experiences, they overwhelmingly rejected restrictions on their access to the internet and offered nuanced and diverse perspectives. Once the principle that certain ideas are inherently dangerous is accepted, the internet, once a symbol of free expression, connection, creativity, and innovation, becomes the next logical target.

This also wouldn’t be the first time a vulnerable group is used as a prop to advance internet censorship laws. We’ve seen this playbook during the debate over FOSTA/SESTA, where many of the same advocates claimed to speak for trafficking victims/survivors and sex workers, while pushing legislation that ultimately censored online speech and harmed the very communities it invoked. It’s a familiar pattern: invoke a vulnerable group, frame certain speech as a threat, and use that as a way to expand government control over the flow of information. And as we said in the fight against FOSTA: if lawmakers are serious about addressing harms to particular communities, they should start by talking to those communities. This means that lawmakers seeking to address online harms to young people should be talking to young people, not groups who claim their interests. 

Rep. Finke Was Not Radical. She Was Right.

The Paxton case, and the coalition backing age-verification laws in the U.S., show us what lies behind the messaging that draws superficial support from parents and lawmakers. And we’ve heard the quiet part said out loud before: Marsha Blackburn, a sponsor of the federal Kids Online Safety Act, has said that her goal with the legislation was to address what she called “the transgender” in society. When lawmakers and advocacy groups frame queer existence itself as a threat to young people, age-verification laws become ideological enforcement instead of regulatory policy.

In defending free speech, privacy, and the right of young people to access truthful information about themselves, Rep. Leigh Finke was not radical—she was right. She was warning that broad, ideologically driven laws will be used to erase, silence, and isolate young people under the banner of child protection. 

What’s at stake in the fight against age verification is not just a single bill in a single state, or even multiple states, for that matter. It’s about whether “protecting children” becomes a legal pretext for embedding into law government control over the internet to enforce specific moral and religious judgments—judgments that deny marginalized people access to speech, community, history, and truth.

And more people in public office need the courage of Rep. Finke to call this out.

Certbot and Let's Encrypt Now Support IP Address Certificates

Wed, 03/11/2026 - 6:32pm

(Note: This post is also cross-posted on the Let's Encrypt blog)

As announced earlier this year, Let's Encrypt now issues IP address and six-day certificates to the general public. The Certbot team here at the Electronic Frontier Foundation has been working on two improvements to support these features: the --preferred-profile flag, released last year in Certbot 4.0, and the --ip-address flag, new in Certbot 5.3. Together, these improvements mean you can now use Certbot to get IP address certificates!

If you want to try getting an IP address certificate using Certbot, install version 5.4 or higher (for webroot support with IP addresses), and run this command:

sudo certbot certonly --staging \
--preferred-profile shortlived \
--webroot \
--webroot-path <filesystem path to webserver root> \
--ip-address <your ip address>

Two things of note:

  • This will request a non-trusted certificate from the Let's Encrypt staging server. Once you've got things working the way you want, run without the --staging flag to get a publicly trusted certificate.
  • This requests a certificate with Let's Encrypt's "shortlived" profile, which will be good for 6 days. This is a Let's Encrypt requirement for IP address certificates.

As of right now, Certbot only supports getting IP address certificates, not yet installing them in your web server. There's work to come on that front. In the meantime, edit your webserver configuration to load the newly issued certificate from /etc/letsencrypt/live/<ip address>/fullchain.pem and /etc/letsencrypt/live/<ip address>/privkey.pem.
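For example, if your webserver is nginx (a hedged sketch; the server block is illustrative and the <ip address> directory name is the placeholder Certbot uses for your certificate), the relevant directives would look something like:

```nginx
# Sketch of an nginx TLS server block loading the Certbot-issued files.
# Replace <ip address> with the directory name Certbot actually created
# under /etc/letsencrypt/live/.
server {
    listen 443 ssl;
    ssl_certificate     /etc/letsencrypt/live/<ip address>/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/<ip address>/privkey.pem;
}
```

Apache and other servers have equivalent directives pointing at the same two files.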

The command line above uses Certbot's "webroot" mode, which places a challenge response file in a location where your already-running webserver can serve it. This is nice since you don't have to temporarily take down your server.
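To illustrate what webroot mode does under the hood, here is a rough sketch (the paths and token contents are made-up examples; Certbot creates and cleans up the real files itself):

```shell
# Webroot mode drops a challenge file under .well-known/acme-challenge/
# inside your webserver's document root, so your existing server can
# serve it over plain HTTP.
WEBROOT=/tmp/webroot-demo   # stand-in for your real webroot path
mkdir -p "$WEBROOT/.well-known/acme-challenge"

# Certbot writes a token file; the contents below are a fake example.
echo "example-token.example-thumbprint" \
  > "$WEBROOT/.well-known/acme-challenge/example-token"

# Let's Encrypt then validates by fetching:
#   http://<your ip address>/.well-known/acme-challenge/example-token
```

Because the validation request arrives over ordinary HTTP on port 80, your server keeps running throughout.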

There are two other plugins that support IP address certificates today: --manual and --standalone. The manual plugin is like webroot, except Certbot pauses while you place the challenge response file manually (or runs a user-provided hook to place the file). The standalone plugin runs a simple web server that serves a challenge response. It has the advantage of being very easy to configure, but has the disadvantage that any running webserver on port 80 has to be temporarily taken down so Certbot can listen on that port. The nginx and apache plugins don't yet support IP addresses.

You should also be sure that Certbot is set up for automatic renewal. Most installation methods for Certbot set up automatic renewal for you. However, since the webserver-specific installers don't yet support IP address certificates, you'll have to set a --deploy-hook that tells your webserver to load the most up-to-date certificates from disk. You can provide this --deploy-hook through the certbot reconfigure command using the rest of the flags above.
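A deploy hook is just a command or script that Certbot runs after each successful renewal. As a minimal sketch (the reload command and demo directory are assumptions; substitute whatever reloads your webserver, and in practice such a script would live in Certbot's /etc/letsencrypt/renewal-hooks/deploy/ directory or be passed via --deploy-hook):

```shell
# Create a small deploy-hook script that reloads the webserver so it
# picks up the refreshed certificate files under /etc/letsencrypt/live/.
HOOK_DIR=/tmp/renewal-hooks-demo   # stand-in for /etc/letsencrypt/renewal-hooks/deploy
mkdir -p "$HOOK_DIR"

cat > "$HOOK_DIR/reload-webserver.sh" <<'EOF'
#!/bin/sh
# Run by Certbot after each successful renewal.
systemctl reload nginx
EOF

# Hooks must be executable for Certbot to run them.
chmod +x "$HOOK_DIR/reload-webserver.sh"
```

Equivalently, you could pass the reload command directly, e.g. `certbot reconfigure --cert-name <name> --deploy-hook "systemctl reload nginx"`.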

We hope you enjoy using IP address certificates with Let's Encrypt and Certbot, and as always if you get stuck you can ask for help in the Let's Encrypt Community Forum.

Government Spying 🤝 Targeted Advertising | EFFector 38.5

Wed, 03/11/2026 - 10:50am

Have you ever seen a really creepy targeted ad online? One that revealed just how much these companies know about your life? It's unsettling enough to see how much companies know about you—but now we have confirmation that the government is also tapping the advertising surveillance machine to get your data. We're explaining the dangers of targeted advertising and location tracking, and the latest in the fight for privacy and free speech online, with our EFFector newsletter.

JOIN OUR NEWSLETTER

For over 35 years, EFFector has been your guide to understanding the intersection of technology, civil liberties, and the law. This issue covers a victory for protesters seeking to hold police accountable, a troubling conflict over the Department of Defense's use of AI, and how advertising surveillance enables government surveillance.

Prefer to listen in? Big news: EFFector is now available on all major podcast platforms! In this episode we chat with EFF Staff Attorney Lena Cohen about how targeted advertising can reveal your location to federal law enforcement. You can find the episode and subscribe in your podcast player of choice.

Want to stay in the fight for privacy and free speech online? Sign up for EFF's EFFector newsletter for updates, ways to take action, and new merch drops. You can also fuel the fight against online surveillance when you support EFF today!

Copyright Bullying vs. Religious Freedom

Tue, 03/10/2026 - 7:06pm

The government should not help a religious institution to punish or deter members from inquiring about their faith. Yet, once again, the Watch Tower Bible and Tract Society is trying to use flimsy copyright claims to exploit the special legal tools available to copyright owners in order to unmask anonymous online speakers. And, once again, EFF has stepped in to urge the courts not to give Watch Tower’s attempts the force of law, with the help of local counsel Jonathan Phillips of Phillips & Bathke, P.C.

EFF’s client, J. Doe, is a member of the Jehovah’s Witnesses who became interested in the history of the organization’s public statements, and how they’ve changed over time. They created research tools to analyze those documents and ultimately created a website, JWS Library, allowing others to use those tools and verify their findings through an archive that included documents suppressed by the church. Doe and others discovered prophecies that failed to come true, erasure of a leader’s disgrace, increased calls for obedience and donations, and other insights about the Jehovah’s Witnesses’ practices. Doe also used machine translation on a foreign-language document to help the community understand what the church was saying to different audiences and also to help understand potential changes in the organization’s attitudes towards dissent.

Within the church, dissent or even asking questions has often been punished by labeling members as apostates and ostracizing—or “disfellowshipping”—them. As a result, Doe and others choose to speak anonymously to avoid retaliation that could cost them family, friend, and professional relationships.

There is no law against questioning the Jehovah’s Witnesses. Instead, Watch Tower argues that Doe’s activities constitute copyright infringement and seeks to use the special process provided in the Digital Millennium Copyright Act (DMCA) to unmask them. It sent DMCA subpoenas to Google and Cloudflare, seeking information that would help them uncover Doe’s identity.

The problem for Watch Tower is that Doe’s research and commentary are clear fair uses allowed under copyright law. The First Amendment does not permit the unmasking of anonymous speakers based on such weak claims. Indeed, the First Amendment protects anonymous speakers precisely because some would be deterred from speaking if they faced retribution for doing so.

EFF stands with those who question the claims of those in power and who share the tools and knowledge needed to do so. We urge the judges in the Southern District of New York to quash these improper subpoenas and not allow copyright to be used to suppress important, legitimate speech.

Think Twice Before Buying or Using Meta’s Ray-Bans

Tue, 03/10/2026 - 5:02pm

Over the last decade or so, the tech industry has tried, and mostly failed, to make “smart glasses”—tech-infused glasses with cameras, AI, maps, displays, and more—a thing. But over the past year, products like Meta’s Ray-Ban Display Glasses and Oakley’s Meta Glasses have gone from a curious niche to the mainstream.

Before you strap a dashcam to your face and sprint out into the world filming everything and everyone in your life, there are some civil liberties and privacy concerns to consider.

Meta is the biggest company that makes these sorts of glasses, and its partnerships with Ray-Ban and Oakley are the most popular options, so we’ll be mostly focusing on them here. Others, like models from Snapchat, are similar in form but far less ubiquitous. But Meta won’t hold this space for long: Google has already announced a partnership with Warby Parker for “AI-powered smart glasses,” and there are rumors of a competing product from Apple.

With that, let’s dive into some of the considerations you should make before purchasing a pair.

If You’re Thinking About Buying Smart Glasses

You’re likely not the only one who can see (and hear) your footage

The photos and videos you record with most smartglasses will likely be stored online at some point in the process. On Meta’s offerings, unless you are livestreaming, media you capture when you press the camera button is kept on the glasses until you import it onto your phone, but by default, media is imported automatically into the Meta AI mobile app, which is required to set up the glasses.

You can't use any AI features locally on the glasses, so anytime you use AI features, like when you say, “Hey Meta, start recording,” the footage is fed to Meta. You can use the glasses entirely without the Meta AI app, but since you can’t easily download footage from the glasses to your phone without it, most people will likely use the app.

Some videos are fed to Meta for AI training, and we know at least in some cases that those videos go through human review. An investigation by Swedish newspapers found that workers were reviewing and annotating camera footage, including all sorts of sensitive videos involving nudity, sex, and going to the bathroom. Meta claimed to the BBC that this is all in the name of AI training and in accordance with its terms of use, which state:

In some cases, Meta will review your interactions with AIs, including the content of your conversations with or messages to AIs, and this review may be automated or manual (human).

This all means that Meta and their third-party contractors will have access to at least some of what you record, and it’s very hard as a user to know where footage goes, who will have access to it, and what they will do with it. When you save footage to your phone’s camera roll, which is where the Meta AI app stores content, that might also be sent to Apple or Google’s servers, depending on your settings. Employees at these companies can then possibly access that media, and it could be shared with law enforcement.

The recorded audio from conversations with Meta AI is also saved by default, and if you don’t like that, tough luck, unless you go in and manually delete the recordings every time you say something.

Filming all the time is even more privacy invasive than you think

A common argument in favor of using the cameras in smartglasses is that phones and cameras can do this too, and it’s never been a problem. 

But smartglasses are designed to resemble regular glasses, to the point where most reviews point out how friends didn’t notice they had cameras embedded in them. They’re designed to be invisible to those being recorded, aside from a small indicator light when they’re recording video footage (which cheap hacks can disable), whereas it is usually obvious that a person is recording when they pull their phone out of their pocket and point it at someone.

Moreover, constant recording of everything in public spaces can create all sorts of potential privacy problems, some more obvious than others. This is another way that cameras on glasses are different from cameras on phones: it is far easier to constantly record one’s whereabouts with the former than the latter. If you continuously record, maybe you just happen to catch someone entering their passcode or password on their phone or computer at a coffee shop, or broadcast someone’s bank details when you’re standing in line at an ATM. That doesn’t even begin to get into when smartglasses are intentionally used for less socially responsible ends. And some people may forget to turn off their smartglasses when they enter a private space like a bathroom.

And if you find yourself caught on someone’s camera, there’s not much recourse. If you do notice a stranger recording you, it’s up to you to intervene and ask not to be included in that footage, which can easily turn awkward or confrontational.

Our expectations of privacy shift when we’re in public, but bystanders in many cases will still have privacy interests. Public spaces are a place where you will be seen, but that shouldn’t mean it’s suddenly okay to catalog and identify everyone.

Consider the company’s track record and public statements

Meta, Google, Apple—perhaps one benefit of all the major tech companies entering this market is that we already have a good idea of how much they tend to respect the privacy of their users or the openness of their platforms. Spoiler, it’s often not much.

Meta has a long history of privacy invasive technologies and practices. We’ve heard rumblings that Meta hopes to add face recognition to its smartglasses, preferably, “during a dynamic political environment where many civil society groups that we would expect to attack us would have their resources focused on other concerns.” Yikes. This is a monumentally bad idea that should be abandoned by Meta and any of its competitors considering a similar feature. But regardless of whether they launch this feature, it’s a pretty clear indication of where Meta wants these sorts of devices to go. 

If You Have Smartglasses Already

Opt out of sharing with Meta where you can

You can disable a couple of the features where unnecessary data is sent to Meta. In the Meta AI app, under the device settings, there’s a privacy page where you can disable sharing additional data, and more importantly, turn off “Cloud media,” where your photos and videos are sent to Meta’s cloud for processing and temporary storage. 

Decide your use-case and stick to it

These glasses can be useful for filming a variety of activities. We’ve seen fascinating scenes of tattoo artists doing their work (with client’s permission), and it doesn’t take a stretch of the imagination to see how people might use it to film extreme sports. Even on an everyday level, you might find them useful for capturing holidays, birthdays, and all sorts of other private occasions. 

But if you buy these glasses for a specific, mostly private purpose, it is probably best to stick to that, instead of wearing them everywhere and recording everything you do.

Follow the rules of businesses and social expectations

You often have a right to record in public spaces, but that doesn’t mean other people will like it. Businesses, including restaurants and stores, may want nothing to do with continuous filming and may either post a sign asking you not to use smartglasses, or ask you to stop. This may reflect the preferences not just of the business owner, but the people around you. And don’t use glasses to record when you enter other people’s private spaces like bathrooms or changing rooms.

It’s also a good idea to check in with friends and family before tapping that record button at a social gathering. Some people may not be as comfortable with these glasses as they are with other recording equipment.

Consider blurring strangers if you’re going to upload video

Blurring video footage isn’t an easy task, but if you’re considering uploading footage from something like a protest, it may be worth the effort to do so (apps like Meta’s Edits simplify this process, as do some other video sites, like YouTube). Some people don’t want the government to see their faces at protests, and might be afraid to attend if other people are uploading their faces.

It would be better if Meta leveraged its AI capabilities to offer this sort of blurring automatically, especially with livestreaming. It’s not that outlandish a request, as the company already appears to try to blur faces automatically in footage it captures for annotation, though it’s not always reliable. After all, Google began redacting faces in Street View years ago, following privacy concerns from groups like EFF.

Resist face recognition

Adding facial recognition technology to smartglasses would obliterate the privacy of everyone. We cannot let companies push face recognition into these glasses, and as a user, you should make clear that this is not something you want.

Smartglasses don’t have to be used to decimate the privacy of anyone you encounter during the day. There are legitimate uses out there, but it’s up to those who use them to respect the social norms of the spaces they enter and the people they encounter.

The Government Must Not Force Companies to Participate in AI-powered Surveillance

Tue, 03/10/2026 - 4:39pm

The rapidly escalating conflict between Anthropic and the Pentagon, which started when the company refused to let the government use its technology to spy on Americans, has now gone to court. The Department of Defense retaliated by designating the company a “supply chain risk” (SCR). Now, Anthropic is asking courts to block the designation, arguing that the First Amendment does not permit the government to coerce a private actor to rewrite its code to serve government ends.

We agree.

As EFF, the Foundation for Individual Rights and Expression, and multiple other public interest organizations explained in a brief filed in support of Anthropic’s motion, the development and operation of large language models involve multiple expressive choices protected by the First Amendment. Requiring a company to rewrite its code to remove guardrails means compelling different expression, a clear constitutional violation. Further, the public record shows that the SCR designation is intended to punish the company both for pushing back and for its CEO’s public statements explaining that AI may supercharge surveillance practices that current law has proven ill-equipped to address.

As we also explain, the company’s concerns about how the government will use its technology are well-founded. The U.S. government has a long history of illegally surveilling its citizens without adequate judicial oversight based on questionable interpretations of its Constitutional and statutory obligations. The Department of Defense acquires vast troves of personal information from commercial entities, including individuals’ physical location, social media, and web browsing data. Other government agencies continue to collect and query vast quantities of Americans’ information, including by acquiring information from third party data brokers.

A growing body of social science research illustrates the chilling effects of these pervasive activities. Fearing retribution for unpopular views, dissenters stay silent. And AI only exacerbates the problem. AI can quickly analyze the government’s massive datasets, or combine that information with data scraped from the internet, purchased on the commercial data broker market, or gathered by local police surveillance devices, to construct a comprehensive picture of a person’s life and infer sensitive details like their religious beliefs, medical conditions, political opinions, or even sex partners. For example, an agency could use AI to infer an individual’s association with a particular mosque based on data showing that they visited its website, followed its social media accounts, and were located near the mosque during religious services. AI can also deanonymize online speech by using public information to unmask anonymous users.

It is easy to conceive how an agency, a government employee with improper intent, or a malicious hacker could exploit these capabilities to monitor public discourse, preemptively squelch dissent, or persecute people from marginalized communities. Against this background and absent meaningful changes to the governing national security laws and judicial oversight structure, it is entirely reasonable for Anthropic—or any other company—to insist on its own guardrails.

Without action from Congress, the task of protecting your privacy has fallen in large part to Big Tech—something no one wants, including Big Tech. But if Congress won’t do it, companies like Anthropic must be allowed to step in, without facing retribution.

The SAFE Act is an Imperfect Vehicle for Real Section 702 Reform

Mon, 03/09/2026 - 4:27pm

The SAFE Act, introduced by Senators Mike Lee (R-UT) and Dick Durbin (D-IL), is the first of many likely proposals to reauthorize Section 702 of the Foreign Intelligence Surveillance Act (FISA), enacted as part of the FISA Amendments Act of 2008. While imperfect, it proposes a litany of real and much-needed reforms to Big Brother’s favorite surveillance authority.

The irresponsible 2024 reauthorization of the secretive mass surveillance authority Section 702 not only gave the government two more years of unconstitutional surveillance powers, it also made the policy much worse. But, now people who value privacy and the rule of law get another bite at the apple. With expiration for Section 702 looming in April 2026, we are starting to see the emergence of proposals for how to reauthorize the surveillance authority—including calls from inside the White House for a clean reauthorization that would keep the policy unchanged. EFF has always had a consistent policy: Section 702 should not be reauthorized absent major reforms that will keep this tactic of foreign surveillance from being used as a tool of mass domestic espionage. 

What is Section 702?

Section 702 was intended to modernize foreign surveillance of the internet for national security purposes. It allows collection of foreign intelligence from non-Americans located outside the United States by requiring U.S.-based companies that handle online communications to hand over data to the government. As the law is written, the intelligence community (IC) cannot use Section 702 programs to target Americans, who are protected by the Fourth Amendment’s prohibition on unreasonable searches and seizures. But the law gives the intelligence community space to target foreign intelligence in ways that inherently and intentionally sweep in Americans’ communications.

We live in an increasingly globalized world where people are constantly in communication with people overseas. That means that, while targeting foreigners outside the U.S. for “foreign intelligence information,” the IC routinely acquires the American side of those communications without a probable cause warrant. The collection of all that data from U.S. telecommunications and internet providers results in the “incidental” capture of conversations involving a huge number of people in the United States.

But this backdoor access to U.S. persons’ data isn’t “incidental.” Section 702 has become a routine part of the FBI’s law enforcement mission. In fact, the IC’s latest Annual Statistical Transparency Report documents the many ways the Federal Bureau of Investigation (FBI) uses Section 702 to spy on Americans without a warrant. The IC lobbied for Section 702 as a tool for national security outside the borders of the U.S., but it is apparent that the FBI uses it to conduct domestic, warrantless surveillance of Americans. In 2021 alone, the FBI conducted 3.4 million warrantless searches of U.S. persons’ 702 data.

The Good

Let’s start with the good things this bill does. These are reforms EFF has been seeking for a long time, and their implementation would mean a big improvement over the status quo of national security law.

First, the bill would partially close the loophole that allows the FBI and domestic law enforcement to dig through the “incidentally” collected U.S. side of communications in 702 databases. The FBI currently operates with a “finders keepers” mentality: because the data was collected by another agency, the FBI believes it can use it for other purposes with almost no constraints. The SAFE Act would require a warrant before the FBI looks at the content of these collected communications. As we will get to later, this reform does not go nearly far enough, because agents can still query to see what data on a person exists before getting a warrant, but it is certainly an improvement on the current system.

Second, the bill addresses the age-old problem of parallel construction. If you’re unfamiliar with this term, parallel construction is a method by which intelligence agencies or domestic law enforcement find out a piece of information about a subject through secret, even illegal or unconstitutional methods. Uninterested in revealing these methods, officers hide what actually happened by publicly offering an alternative route they could have used to find that information. So, for instance, if police want to hide the fact that they knew about a specific email because it was intercepted under the authority of Section 702, they might use another method, like a warranted request to a service provider, to create a more publicly acceptable path to that information. To deal with this problem, the SAFE Act mandates that when the government seeks to use Section 702 evidence in court, it must disclose the source of this evidence “without regard to any claim that the information or evidence…would inevitably have been discovered, or was subsequently reobtained through other means.”

Next, the bill proposes a policy that EFF and other groups have been trying to get through Congress for over five years: ending the data broker loophole. As the system currently stands, data brokers who buy and sell your personal data collected from smartphone applications, among other sources, are able to sell that sensitive information, including a phone’s geolocation, to law enforcement and intelligence agencies. That means that with a bit of money, police can buy the data (or buy access to services that purchase and map the data) that they would otherwise need a warrant to get. A bill that would close this loophole, the Fourth Amendment Is Not For Sale Act, passed the House in 2024 but has yet to be voted on by the Senate. In the meantime, states have taken it upon themselves to close this loophole, with Montana becoming the first state to pass similar legislation in May 2025. The SAFE Act proposes to partially fix the loophole, at least as far as intelligence agencies are concerned. This fix could not come soon enough, especially since the Office of the Director of National Intelligence has signaled its willingness to create one big, streamlined, digital marketplace where the government can buy data from data brokers.

Another positive thing about the SAFE Act is that it creates an official statutory end to a surveillance power that the government allowed to expire in 2020. In its heyday, the intelligence community used Section 215 of the Patriot Act to justify the mass collection of communication records like metadata from phone calls. Although this legal authority has lapsed, it has always been our fear that it will not sit dormant forever and could be reauthorized at any time. This new bill says that its dormant powers shall “cease to be in effect” within 180 days of the SAFE Act being enacted.

What Needs to Change 

The SAFE Act also attempts to clarify very important language that gauges the scope of the surveillance authority: who is obligated to turn over digital information to the U.S. government. Under Section 702, “electronic communication service providers” (ECSPs) are on the hook for providing information, but the definition of that term has been in dispute and has changed over time—most recently when a FISA court opinion expanded the definition to include a category of “secret” ECSPs that have not been publicly disclosed. Unfortunately, the bill leaves that definition ambiguous and creates an audit system without a clear directive for enforcing limits on who qualifies as an ECSP or for guaranteeing transparency.

As mentioned earlier, the SAFE Act introduces a warrant requirement for the FBI to read the contents of Americans’ communications that have been warrantlessly collected under Section 702. However, the law does not in its current form require the FBI to get a warrant before running searches identifying whether Americans have communications present in the database in the first place. Knowing this information is itself very revealing and the government should not be able to profit from circumventing the Fourth Amendment. 

When Congress reauthorized Section 702 in 2024, it did so through a piece of policy called the Reforming Intelligence and Securing America Act (RISAA). This bill made 702 worse in several ways, one of the most severe being that it expanded the legal uses of the surveillance authority to include vetting immigrants. In an era when the United States government is rounding up immigrants, including people awaiting asylum hearings, and U.S. officials continually threaten to withhold admission to the United States from people whose politics do not align with the current administration, RISAA sets a dangerous precedent. Although RISAA officially expires in April, it would be helpful for any Section 702 reauthorization bill to explicitly prohibit the use of this authority for that purpose.

Finally, in the same way that the SAFE Act statutorily ends the expired Section 215 of the Patriot Act, it should also impose an explicit end to “abouts” collection, a practice of collecting digital communications not because they are to or from targeted people, but because they are “about” specific topics. This practice has been discontinued, but it still sits on the books, just waiting to be revived.

Privacy's Defender: Launch Party in Berkeley

Mon, 03/09/2026 - 3:29pm

We're celebrating the launch of Privacy's Defender, a new book by EFF Executive Director Cindy Cohn on Thursday, March 12—and we want you to join us! Cindy has tangled with the feds, fought for your data security, and argued before judges to protect our access to science and knowledge on the internet. In Privacy's Defender she asks: can we still have private conversations if we live our lives online?

Join the festivities for a live conversation between Cindy Cohn and Annalee Newitz followed by a book signing with Cindy.

REGISTER TODAY! 

$20 General Admission for 1
$30 Discounted tickets for 2
$12.50 Student Ticket
All proceeds benefit EFF's mission.

Want your own copy of Privacy's Defender?
Save $10 when you preorder the book with your ticket purchase

WHEN:
Thursday, March 12th, 2026
6:30 pm to 9:30 pm

WHERE:
Ciel Creative Space
Entrance located at:
940 Parker St, Berkeley, CA 94710

6:30 PM Doors Open
7:15 PM Program Begins


About the book

Throughout her career, Cindy Cohn has been driven by a fundamental question: Can we still have private conversations if we live our lives online? Privacy’s Defender chronicles her thirty-year battle to protect our right to digital privacy and shows just how central this right is to all our other rights, including our ability to organize and make change in the world.

Shattering the hypermasculine myth that our digital reality was solely the work of a handful of charismatic tech founders, the author weaves her own personal story with the history of Crypto Wars, FBI gag orders, and the post-9/11 surveillance state. She describes how she became a seasoned leader in the early digital rights movement, as well as how this work serendipitously helped her discover her birth parents and find her life partner. Along the way, she also details the development of the Electronic Frontier Foundation, which she grew from a ragtag group of lawyers and hackers into one of the most powerful digital rights organizations in the world.

Part memoir and part legal history for the general reader, the book is a compelling testament to just how hard-won the privacy rights we now enjoy as tech users are, but also how crucial these rights are in our efforts to combat authoritarianism, grow democracy, and strengthen other human rights. Learn about the Privacy's Defender book tour.

Parking

Street parking is available around the building.

Accessibility

The main event space is wheelchair accessible, on concrete. Lively music will be playing, and the speakers will be using a microphone, so louder volumes are expected. EFF is committed to improving accessibility for our events. If you will be attending in-person and need accommodation, or have accessibility questions prior to the event, please contact events@eff.org.

Food and Drink

Wine & Beer will be available for purchase. Cellarmaker Brewing Co., located next door to Ciel Space, will be serving food until 8:00 pm. 

Questions?

Email us at events@eff.org.

About the Speakers

Cindy Cohn
Cindy Cohn is the Executive Director of the Electronic Frontier Foundation. From 2000-2015 she served as EFF’s Legal Director as well as its General Counsel.  Ms. Cohn first became involved with EFF in 1993, when EFF asked her to serve as the outside lead attorney in Bernstein v. Dept. of Justice, the successful First Amendment challenge to the U.S. export restrictions on cryptography. 

Ms. Cohn has been named to TheNonProfitTimes 2020 Power & Influence TOP 50 list, honoring 2020's movers and shakers.  In 2018, Forbes included Ms. Cohn as one of America's Top 50 Women in Tech. The National Law Journal named Ms. Cohn one of 100 most influential lawyers in America in 2013, noting: "[I]f Big Brother is watching, he better look out for Cindy Cohn." She was also named in 2006 for "rushing to the barricades wherever freedom and civil liberties are at stake online."  In 2007 the National Law Journal named her one of the 50 most influential women lawyers in America. In 2010 the Intellectual Property Section of the State Bar of California awarded her its Intellectual Property Vanguard Award and in 2012 the Northern California Chapter of the Society of Professional Journalists awarded her the James Madison Freedom of Information Award.  

Ms. Cohn is the author of the professional memoir Privacy's Defender, to be published by MIT Press in March 2026. She is also the co-host of EFF's award-winning podcast, How to Fix the Internet.

 

Annalee Newitz
Annalee Newitz writes science fiction and nonfiction. They are the author of four novels: Automatic Noodle, The Terraformers, The Future of Another Timeline, and Autonomous, which won the Lambda Literary Award. As a science journalist, they are the author of Stories Are Weapons: Psychological Warfare and the American Mind, Four Lost Cities: A Secret History of the Urban Age and Scatter, Adapt and Remember: How Humans Will Survive a Mass Extinction, which was a finalist for the LA Times Book Prize in science. They are a writer for the New York Times and elsewhere, and have a monthly column in New Scientist. They have published in The Washington Post, Slate, Scientific American, Ars Technica, The New Yorker, and Technology Review, among others. They were the co-host of the Hugo Award-winning podcast Our Opinions Are Correct, and have contributed to the public radio shows Science Friday, On the Media, KQED Forum, and Here and Now. Previously, they were the founder of io9, and served as the editor-in-chief of Gizmodo.

EFFecting Change: Privacy's Defender

Mon, 03/09/2026 - 1:39pm

Join EFF Executive Director Cindy Cohn in conversation with 404 Media Cofounder Jason Koebler to discuss Privacy's Defender: My Thirty-Year Fight Against Digital Surveillance, Cindy’s personal story of standing up to the Justice Department, taking on the NSA, and tangling with the FBI to protect our right to digital privacy. The highly anticipated book asks the fundamental question: Can we still have private conversations if we live our lives online? Join the livestream for a live discussion followed by Q&A.

EFFecting Change Livestream Series:
Privacy's Defender
Thursday, March 19th
11:00 AM - 12:00 PM Pacific
This event is LIVE and FREE!



Accessibility

This event will be live-captioned and recorded. EFF is committed to improving accessibility for our events. If you have any accessibility questions regarding the event, please contact events@eff.org.

Event Expectations

EFF is dedicated to a harassment-free experience for everyone, and all participants are encouraged to view our full Event Expectations.

Upcoming Events

Want to make sure you don’t miss our next livestream? Here’s a link to sign up for updates about this series: eff.org/ECUpdates. If you have a friend or colleague who might be interested, please share this link: eff.org/EFFectingChange. Thank you for helping EFF spread the word about privacy and free expression online.

Recording

We hope you and your friends can join us live! If you can't make it, we’ll post the recording afterward on YouTube and the Internet Archive!

About the Speakers

 

 Cindy Cohn 
Cindy Cohn is the Executive Director of the Electronic Frontier Foundation. From 2000-2015 she served as EFF’s Legal Director as well as its General Counsel.  Ms. Cohn first became involved with EFF in 1993, when EFF asked her to serve as the outside lead attorney in Bernstein v. Dept. of Justice, the successful First Amendment challenge to the U.S. export restrictions on cryptography. Ms. Cohn has been named to TheNonProfitTimes 2020 Power & Influence TOP 50 list, honoring 2020's movers and shakers.  In 2018, Forbes included Ms. Cohn as one of America's Top 50 Women in Tech. The National Law Journal named Ms. Cohn one of 100 most influential lawyers in America in 2013, noting: "[I]f Big Brother is watching, he better look out for Cindy Cohn." She was also named in 2006 for "rushing to the barricades wherever freedom and civil liberties are at stake online."  In 2007 the National Law Journal named her one of the 50 most influential women lawyers in America. In 2010 the Intellectual Property Section of the State Bar of California awarded her its Intellectual Property Vanguard Award and in 2012 the Northern California Chapter of the Society of Professional Journalists awarded her the James Madison Freedom of Information Award.  

 Jason Koebler 
Jason Koebler is a cofounder of 404 Media, a journalist-owned investigative tech publication. He reports on surveillance and privacy, the ways that artificial intelligence is changing the internet, labor, and society, and consumer rights. Before 404 Media, he was the editor-in-chief of Motherboard, VICE's technology publication and an executive producer on Encounters, a Netflix documentary about the search for alien life.





Admiring Our Heroes for International Women’s Day: Celebrating Women Who Have Received EFF Awards 

Fri, 03/06/2026 - 7:57pm

For the last hundred years, women have had pivotal and far too often unsung roles in building and shaping the technology we now use every day. Many have heard of Ada Lovelace’s contributions to computer programming, but far fewer know Mary Allen Wilkes, a pioneering programmer who wrote much of the software for the LINC, one of the world’s first interactive personal computers (it could fit in a single office and cost $40,000, but it was the 1960s). Decades earlier, when the first all-electronic digital computer, ENIAC, was built in the 1940s, the “software” for it was written by women: Kathleen McNulty, Jean Jennings, Betty Snyder, Marlyn Wescoff, Frances Bilas, and Ruth Lichterman.

It’s thankfully become more common knowledge that actor and inventor Hedy Lamarr co-created the concept of "frequency-hopping" that became a basis for radio systems from cell phones to wireless networking systems. But too few know Laila Ohlgren, who in the 1970’s solved a major problem with the development of mobile networks and phones by recognizing that dialed numbers could be stored and sent all at once with a “call button,” rather than sent one number at a time, which created connection issues before a call was even made. 

Women in tech deserve more and brighter spotlights. At EFF, we’ve had the honor of celebrating some of our heroes at our annual EFF Awards, including many women who are leading the digital rights community. For International Women’s Day, we’re highlighting the contributions of just a few of these recipients from the last decade, whose work to protect privacy, speech, and creativity online has had a global impact.

Carolina Botero (EFF Award Winner, 2024) 

Carolina Botero is a leader in the fight for digital rights in Latin America. For over a decade, she led the Colombia-based Karisma Foundation and cultivated its regional and international impact. Botero and Karisma helped connect indigenous peoples to the internet and made it possible to contribute content to Wikipedia in their native language, expanding access to both history and modern information. They built alliances to combat disinformation, pushed for legal tools to protect cultural and heritage institutions from digital blackholes, and were, and remain, a necessary voice speaking for human rights in the online world. EFF worked closely with Karisma and Botero to help free Colombian graduate student Diego Gomez, who shared another student’s Master’s thesis with colleagues over the internet. Diego’s story demonstrates what can go wrong when nations enact severe penalties for copyright infringement, and thanks to work from Karisma, many partners, and many EFF supporters, he was cleared of the criminal charges that he faced for this harmless act of sharing scholarly research.

Carolina Botero receiving her EFF Award

Botero stepped down from the role in 2024, opening the door for a new generation. While her work continues—she’s currently on the advisory board of CELE, the Centro de Estudios en Libertad de Expresión—her EFF Award was well-deserved based on her strong and inspiring legacy for those in Latin America and beyond who advocate for a digital world that enhances rights and empowers the powerless. Learn more about Botero on her EFF Awards page and in the recap of the 2024 event.

Chelsea Manning (EFF Award Winner, 2017)

Chelsea Manning became famous as a whistleblower: In 2010, she disclosed classified Iraq War documents, including a video of the killings of Iraqi civilians and two Reuters reporters by U.S. troops. These documents exposed aspects of U.S. operations in Iraq and Afghanistan that infuriated the public and embarrassed the government. But she is also a transparency and transgender rights advocate, network security expert, author, and former U.S. Army intelligence analyst. 

Manning joined the military in 2007. Her role as an intelligence analyst to an Army unit in Iraq in 2009 gave her access to classified databases, but more importantly, it gave her a uniquely comprehensive view of the war in Iraq, and she became increasingly disillusioned and frustrated by what she saw, versus what was being shared. In 2010, she approached major news outlets hoping to give information to them that would reveal a new side of the war to the public. Ultimately, she shared the documents with Wikileaks. 

Manning’s bravery did not end there. When she was arrested a few months later, she endured "cruel, inhuman and degrading" treatment, according to the UN Special Rapporteur on torture. She was locked up alone for 23 hours a day over an 11-month period, before her trial. The mistreatment resulted in public outcry and advocacy by organizations like Amnesty International. Even a State Department spokesperson, Philip Crowley, criticized the treatment as "ridiculous, counterproductive, and stupid," and resigned. She was moved to a medium-security facility in April 2011. 

The government’s charges against Manning were outrageous, but in 2013 she was convicted of 19 of 22 counts as a result of her whistleblowing activities. She became one of fewer than a dozen people prosecuted for espionage in the entire history of the United States, and she was sentenced to the longest punishment ever imposed on a whistleblower. Then, the day after her conviction, isolated from her community and in all likelihood expecting to remain in prison for years if not decades, she courageously issued a statement identifying herself as a trans woman, which she’d wanted to reveal for years.

Over the next several years, while imprisoned, she became an advocate both for government transparency and for transgender rights. Her conviction and sentence pointed to the need for legal reform of both the Computer Fraud and Abuse Act (CFAA) and the Espionage Act.  EFF filed an amicus brief to the U.S. Army Court of Criminal Appeals arguing that the CFAA was never meant to criminalize violations of private policies like those of government systems, and EFF also pushed, and continues to fight for, narrower interpretations of the Espionage Act and stronger protections for whistleblowers, particularly to take into account both the motivation of individuals who pass on documents and the disclosure’s ramifications. 

Even after President Obama commuted her sentence in 2017, and EFF celebrated her work and her release with an EFF award in September, 2017, her fight wasn’t over. She was imprisoned again twice in 2019 and ultimately fined $256,000 for refusing to testify before grand juries investigating WikiLeaks founder Julian Assange. The U.N. Special Rapporteur on torture again criticized Manning’s treatment, writing that "the practice of coercive detention appears to be incompatible with the international human rights obligations of the United States." 

Manning was released in 2020 after having spent almost a decade in total imprisoned for her courage. She wrote a memoir, README.txt, in 2022, to take back control over her story.

EFF Award Winners Mike Masnick, Annie Game, and Chelsea Manning

Annie Game (EFF Award Winner, 2017)

Annie Game spent over 16 years as the Executive Director of IFEX, a global network of journalism and civil liberties organizations working together to defend freedom of expression.  IFEX (formerly International Freedom of Expression Exchange) began in the 1990s, when a group of organizations and the Canadian Committee to Protect Journalists came together to consider how to respond as a single voice to free-expression violations around the world. IFEX now is a global hub for the protection of free speech and journalism. 

Game recognized early on that digital rights and freedom of expression groups needed one another. Under her leadership, IFEX paired more traditional free-expression organizations with their more digital counterparts, with a focus on building organizational security capacities. IFEX’s initiatives under Game have been expansive. For example, the International Day to End Impunity for Crimes against Journalists, November 2, has been an annual wake-up call and reminder for UN member states to live up to their commitments to protecting journalists. UNESCO observed that more than 1,700 journalists were killed globally between 2006 and 2024, and nearly 90% of these cases went unsolved in the courts.

Game and IFEX have also focused on high-profile cases of journalists threatened by governments for their work, such as Bahey eldin Hassan in Egypt. Bahey is the director of the Cairo Institute for Human Rights Studies (CIHRS) and has advocated for freedom of expression and the basic human rights of Egyptians, but has lived in exile since 2014. The charges against him, of “disseminating false information” and “insulting the judiciary,” are common tactics of intimidation and harassment. Bahey’s supposed crimes were sharing social media posts criticizing the Egyptian judiciary’s lack of independence, and speaking about the killing in Egypt of Italian researcher Giulio Regeni. Bahey—an IFEX member—is just one of many reporters and human rights workers in danger when they speak. But when journalists and those defending their rights online speak out as one voice, as IFEX helps them do, it makes a difference.

Another initiative has been the Faces of Free Expression project, a partnership between IFEX and the International Free Expression Project. If you’re looking for more heroes, this project details the stories of “risk-takers and change-makers – individuals who put their careers, their freedom, their safety, and sometimes even their lives on the line,” while reporting, or defending free expression and the right to information. 

Wherever authoritarianism and repression of speech have been on the rise, Game has unapologetically called out injustices and made it safer for journalists to do their work, while ensuring accountability when crimes are committed. The work is more critical now than ever, and since leaving IFEX in 2022, she’s remained an activist while focusing increasingly on environmental protection. 

Twelve More Heroes 

EFF has honored many more women with awards over the years—from Anita Borg and Hedy Lamarr to Amy Goodman and Beth Givens. This blog from 2012 looks back and acknowledges the important contributions from twelve more EFF Award winners. 

We’ve also asked five women at EFF about women in digital rights, freedom of expression, technology, and tech activism who have inspired us. You can read that here.

Donate to Support EFF's Work

Your donations empower EFF to do even more.

Admiring Our Heroes for International Women’s Day: Celebrating Women Who Have Received EFF Awards 

Fri, 03/06/2026 - 7:57pm

For the last hundred years, women have had pivotal and far too often unsung roles in building and shaping the technology that we now use every day. Many have heard of Ada Lovelace’s contributions to computer programming, but far fewer know Mary Allen Wilkes, a pioneering programmer who wrote much of the software for the LINC, one of the world’s first interactive personal computers (it could fit in a single office and cost $40,000, but it was the 1960s). Decades earlier, when ENIAC, the first all-electronic digital computer, was built in the 1940s, the “software” for it was written by women: Kathleen McNulty, Jean Jennings, Betty Snyder, Marlyn Wescoff, Frances Bilas, and Ruth Lichterman. 

It’s thankfully become more common knowledge that actor and inventor Hedy Lamarr co-created the concept of "frequency-hopping" that became a basis for radio systems from cell phones to wireless networking systems. But too few know Laila Ohlgren, who in the 1970s solved a major problem in the development of mobile networks and phones by recognizing that dialed numbers could be stored and sent all at once with a “call button,” rather than sent one digit at a time, which had created connection issues before a call was even made. 

Women in tech deserve more and brighter spotlights. At EFF, we’ve had the honor of celebrating some of our heroes at our annual EFF Awards, including many women who are leading the digital rights community. For International Women’s Day, we’re highlighting the contributions of just a few of these recipients from the last decade, whose work to protect privacy, speech, and creativity online has had a global impact.

Carolina Botero (EFF Award Winner, 2024) 

Carolina Botero is a leader in the fight for digital rights in Latin America. For over a decade, she led the Colombia-based Karisma Foundation and cultivated its regional and international impact. Botero and Karisma helped connect indigenous peoples to the internet and made it possible to contribute content to Wikipedia in their native language, expanding access to both history and modern information. They built alliances to combat disinformation, pushed for legal tools to protect cultural and heritage institutions from digital black holes, and were, and remain, a necessary voice speaking for human rights in the online world. EFF worked closely with Karisma and Botero to help free Colombian graduate student Diego Gomez, who shared another student’s Master’s thesis with colleagues over the internet. Diego’s story demonstrates what can go wrong when nations enact severe penalties for copyright infringement, and thanks to work from Karisma, many partners, and many EFF supporters, he was cleared of the criminal charges that he faced for this harmless act of sharing scholarly research.

Carolina Botero receiving her EFF Award

Botero stepped down from the role in 2024, opening the door for a new generation. While her work continues—she’s currently on the advisory board of CELE, the Centro de Estudios en Libertad de Expresión—her EFF Award was well-deserved based on her strong and inspiring legacy for those in Latin America and beyond who advocate for a digital world that enhances rights and empowers the powerless. Learn more about Botero on her EFF Awards page and the recap of the 2024 event.

Chelsea Manning (EFF Award Winner, 2017)

Chelsea Manning became famous as a whistleblower: In 2010, she disclosed classified Iraq War documents, including a video of the killings of Iraqi civilians and two Reuters reporters by U.S. troops. These documents exposed aspects of U.S. operations in Iraq and Afghanistan that infuriated the public and embarrassed the government. But she is also a transparency and transgender rights advocate, network security expert, author, and former U.S. Army intelligence analyst. 

Manning joined the military in 2007. Her role as an intelligence analyst to an Army unit in Iraq in 2009 gave her access to classified databases, but more importantly, it gave her a uniquely comprehensive view of the war in Iraq, and she became increasingly disillusioned and frustrated by what she saw, versus what was being shared. In 2010, she approached major news outlets hoping to give information to them that would reveal a new side of the war to the public. Ultimately, she shared the documents with Wikileaks. 

Manning’s bravery did not end there. When she was arrested a few months later, she endured "cruel, inhuman and degrading" treatment, according to the UN Special Rapporteur on torture. She was locked up alone for 23 hours a day over an 11-month period, before her trial. The mistreatment resulted in public outcry and advocacy by organizations like Amnesty International. Even a State Department spokesperson, Philip Crowley, criticized the treatment as "ridiculous, counterproductive, and stupid," and resigned. She was moved to a medium-security facility in April 2011. 

The government’s charges against Manning were outrageous, but in 2013 she was convicted of 19 of 22 counts as a result of her whistleblowing activities. She became one of fewer than a dozen people prosecuted for espionage in the entire history of the United States, and she was sentenced to the longest punishment ever imposed on a whistleblower. Then, the day after her conviction, isolated from her community and in all likelihood expecting to remain in prison for years if not decades, she courageously issued a statement identifying herself as a trans woman, which she’d wanted to reveal for years. 

Over the next several years, while imprisoned, she became an advocate both for government transparency and for transgender rights. Her conviction and sentence pointed to the need for legal reform of both the Computer Fraud and Abuse Act (CFAA) and the Espionage Act.  EFF filed an amicus brief to the U.S. Army Court of Criminal Appeals arguing that the CFAA was never meant to criminalize violations of private policies like those of government systems, and EFF also pushed, and continues to fight for, narrower interpretations of the Espionage Act and stronger protections for whistleblowers, particularly to take into account both the motivation of individuals who pass on documents and the disclosure’s ramifications. 

Even after President Obama commuted her sentence in 2017, and EFF celebrated her work and her release with an EFF Award in September 2017, her fight wasn’t over. She was imprisoned twice more in 2019 and ultimately fined $256,000 for refusing to testify before grand juries investigating WikiLeaks founder Julian Assange. The U.N. Special Rapporteur on torture again criticized Manning’s treatment, writing that "the practice of coercive detention appears to be incompatible with the international human rights obligations of the United States." 

Manning was released in 2020 after having spent almost a decade in total imprisoned for her courage. She wrote a memoir, README.txt, in 2022, to take back control over her story.

EFF Award Winners Mike Masnick, Annie Game, and Chelsea Manning

Annie Game (EFF Award Winner, 2017)

Annie Game spent over 16 years as the Executive Director of IFEX, a global network of journalism and civil liberties organizations working together to defend freedom of expression. IFEX (formerly the International Freedom of Expression Exchange) began in the 1990s, when a group of organizations and the Canadian Committee to Protect Journalists came together to consider how to respond as a single voice to free-expression violations around the world. IFEX is now a global hub for the protection of free speech and journalism. 

Game recognized early on that digital rights and freedom of expression groups needed one another. Under her leadership, IFEX paired more traditional free-expression organizations with their more digital counterparts, with a focus on building organizational security capacities. IFEX Initiatives under Game’s leadership have been expansive. For example, the International Day to End Impunity for Crimes against Journalists, November 2, has been an annual wake-up call and reminder for UN member states to live up to their commitments to protecting journalists. UNESCO observed more than 1,700 journalists were killed globally between 2006 and 2024, and nearly 90% of these cases went unsolved in the courts. 

Game and IFEX have also focused on high-profile cases of journalists threatened by governments for their work, such as Bahey eldin Hassan in Egypt. Bahey is the director of the Cairo Institute for Human Rights Studies (CIHRS) and has advocated for freedom of expression and the basic human rights of Egyptians, but has lived in exile since 2014. The charges against him, of “disseminating false information” and “insulting the judiciary,” are common tactics of intimidation and harassment. Bahey’s supposed crimes were sharing social media posts criticising the Egyptian judiciary’s lack of independence, and speaking about the killing in Egypt of Italian researcher Giulio Regeni. Bahey—an IFEX member—is just one of many reporters and human rights workers in danger when they speak. But when journalists and those defending their rights online speak out as one voice, as IFEX helps them do, it makes a difference. 

Another initiative has been the Faces of Free Expression project, a partnership between IFEX and the International Free Expression Project. If you’re looking for more heroes, this project details the stories of “risk-takers and change-makers – individuals who put their careers, their freedom, their safety, and sometimes even their lives on the line” while reporting on or defending free expression and the right to information. 

Wherever authoritarianism and repression of speech have been on the rise, Game has unapologetically called out injustices and made it safer for journalists to do their work, while ensuring accountability when crimes are committed. The work is more critical now than ever, and since leaving IFEX in 2022, she’s remained an activist while focusing increasingly on environmental protection. 

Twelve More Heroes 

EFF has honored many more women with awards over the years—from Anita Borg and Hedy Lamarr to Amy Goodman and Beth Givens. This blog post from 2012 looks back at and acknowledges the important contributions of twelve more EFF Award winners. 

We’ve also asked five women at EFF about women in digital rights, freedom of expression, technology, and tech activism who have inspired us. You can read that here.

Donate to Support EFF's Work

Your donations empower EFF to do even more.

Admiring Our Heroes for International Women’s Day: Five Women In Tech That EFF Admires

Fri, 03/06/2026 - 5:48pm

In honor of International Women’s Day, we asked five women at EFF about women in digital rights, freedom of expression, technology, and tech activism who have inspired us.  

Anna Politkovskaya 

Jillian York, Activist 
This International Women’s Day, I want to honor the memory of Anna Politkovskaya, the Russian investigative journalist who relentlessly exposed political and social abuses, endured harassment and violence for her work, and was ultimately killed for telling the truth. I had just started my career when I learned of her death, and it forced me to confront that freedom of expression isn’t an abstract principle but rather something people risk—and sometimes lose—their lives for. 

Her story reminds me that journalism at its best is an act of moral courage, not just a profession. In the face of threats, poison, and relentless pressure to stay silent, she chose to continue writing about what she saw, insisting that ordinary people’s lives were worth the world’s attention. She refused to compromise with power, even when she knew it could cost her life. To me, defending freedom of expression means defending those like Anna who bear witness to injustice, prioritize truth, and hold power to account for those whose voices are silenced.  

Cindy Cohn 

Corynne McSherry, Legal Director 
There are so many women who have shaped tech history–most of whom are still unsung heroes—that it’s hard to single out just one. But it’s easier this year because it’s a chance to celebrate my boss, Cindy Cohn, before she leaves EFF for her next adventure.  

Cindy has been fighting for our digital rights for 30 years, leading EFF’s legal work and eventually the whole organization. She helped courts understand that code is speech deserving of constitutional protections at a time when many judges weren’t entirely sure what code even was. She led the fight against NSA spying, and even though outdated and ill-fitting doctrines like the state secrets privilege prevented courts from ruling on the obvious unconstitutionality of the NSA’s mass surveillance program, the fight itself led to real reforms that have expanded over time. 

I’ve worked closely with her for much of her EFF career, starting in 2005 when we sued Sony for installing spyware in millions of computers, and I’ve seen firsthand her work as a visionary lawyer, outstanding writer, and tireless champion for user privacy, free expression, and innovation. She’s also warm and funny, with the biggest heart in the world, and I’m proud to call her a friend as well as a mentor.  

Jane

Sarah Hamid, Activist 
When talking about women in tech, we usually mean founders, engineers, and executives. But just as important are the women who quietly built the practices that underpin today’s movement security culture. 

For as long as social movements have organized in the shadow of state surveillance, women have been designing the protocols, mutual aid networks, and information flows that keep people alive. Those threats feel ever-escalating: fusion‑center monitoring of protests, federal agencies infiltrating and subpoenaing encrypted Signal and social media chats, prosecutors mining search histories.  

In the late 1960s and early 1970s, the underground Jane abortion counseling service—formally the Abortion Counseling Service of Women’s Liberation—built what we would now recognize as a feminist infosec project for abortion access. Jane connected an estimated 11,000 people with safer abortions before Roe v. Wade, using a single public phone number—Call Jane—paired with code names, compartmentalized roles, and minimal records so no one person held the full story of who needed care, who was providing it, and where. When Chicago police raided the collective in 1972, members destroyed their index‑card files rather than let them become a ready‑made map of patients and helpers—an analog secure‑deletion choice that should feel familiar to anyone who has ever wiped a phone or locked down a shared drive. 

The lesson we should take from Jane is a set of principles that still hold in our encrypted‑but‑insecure present: Collect less, separate what you do collect, and be ready to burn the file box. When a search query, a location ping, or a solidarity post can become evidence, treating information as both lifeline and liability is not paranoia—it is care work.  

Ebele Okobi

Babette Ngene, Director of Public Interest Technology 
In the winter of 2013, I had just landed my first job at the intersection of tech and human rights, working for a prominent nonprofit, and I was encouraged to attend regular tech and policy events around town. One such event on internet governance was happening at George Washington University, focusing on multistakeholder engagement on internet policy and governance issues, with companies, nonprofits, and government representatives in attendance. I was inexperienced with these topics, and I’ll admit I was a bit intimidated. 

Then I saw her. She was the only woman on the opening panel, an African woman, an accomplished woman. Not only was she a respected lawyer at Yahoo at the time, but her impressive background, presence, and confident speaking style immediately inspired me. She made me feel like I, too, belonged in that room and could become a powerful voice. 

Ebele Okobi would go on to become one of the most powerful and respected voices in the tech and human rights space, known for her advocacy for digital rights and responsible innovation across Africa and the broader global majority during her tenure at Facebook. Beyond her corporate advocacy, Ebele has consistently championed ethical technology and social justice. She embodies the leadership qualities I value most: empathy, speaking truth to power, integrity, and authenticity. 

I remain in the tech and human rights space because I saw her, because seeing her made me feel seen. Representation truly does matter.  

Ada Lovelace 

Allison Morris, Chief Development Director 
I’m not a lawyer, activist, or technologist; I’m a fundraiser and a lover of stories. And what storyteller at EFF couldn’t help but love Ada Lovelace? The daughter of Lord Byron – the human embodiment of Romanticism – Ada was an innovator in math and science and, ultimately, the writer of the first computer program.  

Lovelace saw the potential in Charles Babbage’s theoretical Analytical Engine (which was never actually built) and created the foundations of modern computing long before the digital age. In writing the first computer program, Lovelace took Babbage’s concept of a machine that could perform mathematical calculations and realized that it could manipulate symbols as well as numbers. 

Given the expectations of women in her time and the controversy of what work should be attributed to Lovelace as opposed to the man she often worked with, I can’t help but be inspired by her story.  

Donate to Support EFF's Work

Your donations empower EFF to do even more.

Women in tech deserve more and brighter spotlights. At EFF, we’ve had the honor of celebrating some of our heroes at our annual EFF Awards, including many women who are leading the digital rights community. For International Women’s Day, we also highlighted the contributions of just a few of these recipients from the last decade, whose work to protect privacy, speech, and creativity online has had a global impact.

