Feed aggregator
Neglecting land–atmosphere feedbacks overestimates climate-driven increases in evapotranspiration
Nature Climate Change, Published online: 11 September 2025; doi:10.1038/s41558-025-02428-5
How evapotranspiration changes with warming is not well understood. Here the authors show that when often-neglected land–atmosphere feedbacks are considered, evapotranspiration increases less than currently projected by offline models.

EFF to Court: The Supreme Court Must Rein in Expansive Secondary Copyright Liability
If the Supreme Court doesn’t reverse a lower court’s ruling, internet service providers (ISPs) could be forced to terminate people’s internet access based on nothing more than mere accusations of copyright infringement. This would threaten innocent users who rely on broadband for essential aspects of daily life. EFF—along with the American Library Association, the Association of Research Libraries, and Re:Create—filed an amicus brief urging the Court to reverse the decision.
The Stakes: Turning ISPs into Copyright Police
Among other things, if the Supreme Court upholds the appeals court’s ruling, it will radically change the amount of risk an ISP takes on when a customer infringes copyright, forcing the ISP to terminate internet access for users accused of infringement—and for everyone else who shares that connection.
This issue turns on what courts call “secondary liability,” which is the legal idea that someone can be held responsible not for what they did directly, but for what someone else did using their product or service.
The case began when music companies sued Cox Communications, arguing that the ISP should be held liable for copyright infringement committed by some of its subscribers. The Court of Appeals for the Fourth Circuit agreed, adopting a “material contribution” standard for contributory copyright liability (a rule for when service providers can be held liable for the actions of users). The lower court said that providing a service that could be used for infringement is enough to create liability when a customer infringes.
In the Patent Act, where Congress has explicitly defined secondary liability, there’s a different test: contributory infringement exists only where a product is incapable of substantial non-infringing use. Internet access, of course, is overwhelmingly used for lawful purposes, making it the very definition of a “staple article of commerce” that can’t give rise to liability under the patent framework. Yet under the Fourth Circuit’s rule, ISPs could face billion-dollar damages if they fail to terminate users on the basis of even flimsy or automated infringement claims.
Our Argument: Apply Clear Rules from the Patent Act, Not Confusing Judge-Made Tests
Our brief urges the Court to do what it has done in the past: look to patent law to define the limits of secondary liability in copyright. That means contributory infringement must require more than a “material contribution” by the service provider—it should apply only when a product or service is especially designed for infringement and lacks substantial non-infringing uses.
The Human Cost: Losing Internet Access Hurts Everyone
The Fourth Circuit’s rule threatens devastating consequences for the public. Terminating an ISP account doesn’t just affect a person accused of unauthorized file sharing—it cuts off entire households, schools, libraries, or businesses that share an internet connection.
- Public libraries, which provide internet access to millions of Americans who lack it at home, could lose essential service.
- Universities, hospitals, and local governments could see internet access for whole communities disrupted.
- Households—especially in low-income communities and communities of color, which disproportionately share broadband connections—would face collective punishment for the alleged actions of a single user.
With more than a third of Americans having only one or no broadband provider, many users would have no way to reconnect once cut off. And given how essential internet access is for education, employment, healthcare, and civic participation, the consequences of termination are severe and disproportionate.
What’s Next
The Supreme Court has an opportunity to correct course. We’re asking the Court to reject the Fourth Circuit’s unfounded “material contribution” test, reaffirm that patent law provides the right framework for secondary liability, and make clear that the Constitution requires copyright to serve the public good. The Court should ensure that copyright enforcement doesn’t jeopardize the internet access on which participation in modern life depends.
We’ll be watching closely as the Court considers this case. In the meantime, you can read our amicus brief here.
MIT software tool turns everyday objects into animated, eye-catching displays
Whether you’re an artist, advertising specialist, or just looking to spruce up your home, turning everyday objects into dynamic displays is a great way to make them more visually engaging. For example, you could turn a kids’ book into a handheld cartoon of sorts, making the reading experience more immersive and memorable for a child.
But now, thanks to MIT researchers, it’s possible to make dynamic displays without any electronics, using printed barrier-grid animations (also called scanimations). This visual trick involves sliding a patterned sheet across an image to create the illusion of motion. The secret of barrier-grid animations lies in the name: An overlay called a barrier (or grid), often resembling a picket fence, moves across, rotates around, or tilts toward an image to reveal frames in an animated sequence. The underlying picture is a combination of each still, sliced and interwoven to present a different snapshot depending on the overlay’s position.
While tools exist to help artists create barrier-grid animations, they typically produce barrier patterns with straight lines. Building on previous work in creating images that appear to move, researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a tool that allows users to explore more unconventional designs. From zigzags to circular patterns, the team’s “FabObscura” software turns unique concepts into printable scanimations, helping users add dynamic animations to things like pictures, toys, and decor.
MIT Department of Electrical Engineering and Computer Science (EECS) PhD student and CSAIL researcher Ticha Sethapakdi SM ’19, a lead author on a paper presenting FabObscura, says that the system is a one-size-fits-all tool for customizing barrier-grid animations. This versatility extends to unconventional, elaborate overlay designs, like pointed, angled lines to animate a picture you might put on your desk, or the swirling, hypnotic appearance of a radial pattern you could spin over an image placed on a coin or a Frisbee.
“Our system can turn a seemingly static, abstract image into an attention-catching animation,” says Sethapakdi. “The tool lowers the barrier to entry to creating these barrier-grid animations, while helping users express a variety of designs that would’ve been very time-consuming to explore by hand.”
Behind these novel scanimations is a key finding: Barrier patterns can be expressed as any continuous mathematical function — not just straight lines. Users can type these equations into a text box within the FabObscura program and then see how the program graphs out the shape and movement of a barrier pattern. If you wanted a traditional horizontal pattern, you’d enter a constant function, where the output is the same no matter the input, much like drawing a straight line across a graph. For a wavy design, you’d use a sine function, which is smooth and resembles a mountain range when plotted out. The system’s interface includes helpful examples of these equations to guide users toward their preferred pattern.
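To make the idea concrete, here is a minimal sketch of the two ingredients described above — our own illustration, not FabObscura’s actual code. It interlaces animation frames column by column and builds a barrier mask whose slit positions follow an arbitrary continuous function f: a constant f yields the classic straight-line grid, while a sine yields a wavy one.

```python
# Illustrative sketch (not the FabObscura implementation).
# Images are modeled as nested lists: image[row][column].

def interlace(frames):
    """Combine N frames into one image by taking column x from frame x % N,
    so each slit position of the barrier reveals a different frame."""
    n = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    return [[frames[x % n][y][x] for x in range(width)] for y in range(height)]

def barrier(width, height, n, f, phase=0):
    """Boolean mask with one transparent slit every n columns. At row y, the
    slit is shifted horizontally by f(y), so f shapes the barrier pattern:
    f(y) = 0 gives straight vertical slits; a sine gives a wavy barrier.
    Incrementing `phase` slides the barrier, advancing the animation."""
    return [[(x + int(f(y)) + phase) % n == 0 for x in range(width)]
            for y in range(height)]

# Two tiny 2x2 frames: an all-zeros frame and an all-ones frame.
frames = [[[0, 0], [0, 0]], [[1, 1], [1, 1]]]
combined = interlace(frames)          # columns alternate between the frames
straight = barrier(4, 2, 2, lambda y: 0)   # constant f -> straight slits
```

Sliding the barrier by one column (`phase=1`) flips which interleaved columns are visible, which is exactly how moving the printed overlay steps through the animation.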
A simple interface for elaborate ideas
FabObscura works for all known types of barrier-grid animations, supporting a variety of user interactions. The system enables the creation of a display with an appearance that changes depending on your viewpoint. FabObscura also allows you to create displays that you can animate by sliding or rotating a barrier over an image.
To produce these designs, users upload a folder of animation frames (perhaps a few stills of a horse running) or choose from a few preset sequences (like an eye blinking), then specify the angle at which the barrier will move. After previewing the design, you can fabricate the barrier and picture onto separate transparent sheets (or print the image on paper) using a standard 2D printer, such as an inkjet. The image can then be placed and secured on flat, handheld items such as picture frames, phones, and books.
You can enter separate equations if you want two sequences on one surface, which the researchers call “nested animations.” Depending on how you move the barrier, you’ll see a different story being told. For example, CSAIL researchers created a car that rotates when you move its sheet vertically, but transforms into a spinning motorcycle when you slide the grid horizontally.
These customizations lead to unique household items, too. The researchers designed an interactive coaster that you can switch from displaying a “coffee” icon to symbols of a martini and a glass of water by pressing your fingers down on the edges of its surface. The team also spruced up a jar of sunflower seeds, producing a flower animation on the lid that blooms when twisted off.
Artists, including graphic designers and printmakers, could also use this tool to make dynamic pieces without needing to connect any wires. The tool saves them crucial time to explore creative, low-power designs, such as a clock with a mouse that runs along as it ticks. FabObscura could produce animated food packaging, or even reconfigurable signage for places like construction sites or stores that notify people when a particular area is closed or a machine isn’t working.
Keep it crisp
FabObscura’s barrier-grid creations do come with certain trade-offs. While nested animations are novel and more dynamic than a single-layer scanimation, their visual quality isn’t as strong. The researchers wrote design guidelines to address these challenges, recommending users upload fewer frames for nested animations to keep the interlaced image simple and stick to high-contrast images for a crisper presentation.
In the future, the researchers intend to expand what users can upload to FabObscura, like being able to drop in a video file that the program can then select the best frames from. This would lead to even more expressive barrier-grid animations.
FabObscura might also step into a new dimension: 3D. While the system is currently optimized for flat, handheld surfaces, CSAIL researchers are considering implementing their work into larger, more complex objects, possibly using 3D printers to fabricate even more elaborate illusions.
Sethapakdi wrote the paper with several CSAIL affiliates: Zhejiang University PhD student and visiting researcher Mingming Li; MIT EECS PhD student Maxine Perroni-Scharf; MIT postdoc Jiaji Li; MIT associate professors Arvind Satyanarayan and Justin Solomon; and senior author and MIT Associate Professor Stefanie Mueller, leader of the Human-Computer Interaction (HCI) Engineering Group at CSAIL. Their work will be presented at the ACM Symposium on User Interface Software and Technology (UIST) this month.
Demo Day features hormone-tracking sensors, desalination systems, and other innovations
Kresge Auditorium came alive Friday as MIT entrepreneurs took center stage to share their progress in the delta v startup accelerator program.
Now in its 14th year, delta v Demo Day represents the culmination of a summer in which students work full-time on new ventures under the guidance of the Martin Trust Center for MIT Entrepreneurship.
It also doubles as a celebration, with Trust Center Managing Director (and consummate hype man) Bill Aulet setting the tone early with his patented high-five run through the audience and leap on stage for opening remarks.
“All these students have performed a miracle,” Aulet told the crowd. “One year ago, they were sitting in the audience like all of you. One year ago, they probably didn’t even have an idea or a technology. Maybe they did, but they didn’t have a team, a clear vision, customer models, or a clear path to impact. But today they’re going to blow your mind. They have products — real products — a founding team, a clear mission, customer commitments or letters of intent, legitimate business models, and a path to greatness and impact. In short, they will have achieved escape velocity.”
The two-hour event filled Kresge Auditorium, with a line out the door for good measure, and was followed by a party under a tent on the Kresge lawn. Each presentation began with a short video introducing the company before a student took the stage to expand on the problem they were solving and what their team has learned from talks with potential customers.
In total, 22 startups showcased their ventures and early business milestones in rapid-fire presentations.
Rick Locke, the new dean of the MIT Sloan School of Management, said events like Demo Day are why he came back to the Institute after serving in various roles between 1988 and 2013.
“What’s great about this event is how it crystallizes the spirit of MIT: smart people doing important work, doing it by rolling up their sleeves, doing it with a certain humility but also a vision, and really making a difference in the world,” Locke told the audience. “You can feel the positivity, the energy, and the buzz here tonight. That’s what the world needs more of.”
A program with a purpose
This year’s Demo Day featured 70 students from across MIT, with 16 startups working out of the Trust Center on campus and six working from New York City. Through the delta v program, the students were guided by mentors, received funding, and worked through an action-oriented curriculum full-time between June and September. Aulet also noted that the students presenting benefitted from entrepreneurial support resources from across the Institute.
The odds are in the startups’ favor: A 2022 study found that 69 percent of businesses from the program were still operating five years later. Alumni companies had raised roughly $1 billion in funding.
Demo Day marks the end of delta v and serves to inspire next year’s cohort of entrepreneurs.
“Turn on a screen or look anywhere around you, and you'll see issues with climate, sustainability, health care, the future of work, economic disparities, and more,” Aulet said. “It can all be overwhelming. These entrepreneurs bring light to dark times. Entrepreneurs don’t see problems. As the great Biggie Smalls from Brooklyn said, ‘Turn a negative into a positive.’ That’s what entrepreneurs do.”
Startups in action
Startups in this year’s cohort presented solutions in biotech and health care, sustainability, financial services, energy, and more.
One company, Gees, is helping women with hormonal conditions like polycystic ovary syndrome (PCOS) through a saliva-based sensor that tracks key hormones, providing personalized insights and helping users manage symptoms.
“Over 200 million women live with PCOS worldwide,” said MIT postdoc and co-founder Walaa Khushaim. “If it goes unmanaged, it can lead to even more serious diseases. The good news is that 80 percent of cases can be managed with lifestyle changes. The problem is women trying to change their lifestyle are left in the dark, unsure if what they are doing is truly helping.”
Gees’ sensor is noninvasive and easier to use than current sensors that track hormones. It provides feedback in minutes from the comfort of users’ homes. The sensor connects to an app that shows results and trends to help women stay on track. The company already has more than 500 sign-ups for its wait list.
Another company, Kira, has created an electrochemical system to increase the efficiency and accessibility of water desalination. The company aims to help businesses manage brine wastewater, which is often dumped, pumped underground, or trucked off to be treated.
“At Kira, we’re working toward a system that produces zero liquid waste and only solid salts,” says PhD student Jonathan Bessette SM ’22.
Kira says its system increases the amount of clean water created by industrial processes, reduces the amount of brine wastewater, and optimizes the energy flows of factories. The company says next year it will deploy a system at the largest groundwater desalination plant in the U.S.
A variety of other startups presented at the event:
AutoAce builds AI agents for car dealerships, automating repetitive tasks with a 24/7 voice agent that answers inbound service calls and books appointments.
Carbion uses a thermochemical process to convert biomass into battery-grade graphite at half the temperature of traditional synthetic methods.
Clima Technologies has developed an AI building engineer that enables facilities managers to “talk” to their buildings in real time, allowing teams to conduct 24/7 commissioning, act on fault diagnostics, minimize equipment downtime, and optimize controls.
Cognify uses AI to predict customer interactions with digital platforms, simulating customer behavior to deliver insights into which designs resonate with customers, where friction exists in user journeys, and how to build a user experience that converts.
Durability uses computer vision and AI to analyze movement, predict injury risks, and guide recovery for athletes.
EggPlan uses a simple blood test and proprietary model to assess eligibility for egg freezing with fertility clinics. If users do not have a baby, their fees are returned, making the process risk-free.
Forma Systems developed an optimization software for manufacturers to make smarter, faster decisions about things like materials use while reducing their climate impact.
Ground3d is a social impact organization building a digital tool for crowdsourcing hyperlocal environmental data, beginning with street-level documentation of flooding events in New York City. The platform could help residents with climate resilience and advocacy.
GrowthFactor helps retailers scale their footprint with a fractional real estate analyst while using an AI-powered platform to maximize their chance of commercial success.
Kyma uses AI-powered patient engagement to integrate data from wearables, smart scales, sensors, and continuous glucose monitors to track behaviors and draft physician-approved, timely reminders.
LNK Energies is solving the heavy-duty transport industry’s emissions problem with liquid organic hydrogen carriers (LOHCs): safe, room-temperature liquids compatible with existing diesel infrastructure.
Mendhai Health offers a suite of digital tools to help women improve pelvic health and rehabilitate before and after childbirth.
Nami has developed an automatic, reusable drinkware cleaning station that delivers a hot, soapy, pressurized wash in under 30 seconds.
Pancho helps restaurants improve margins with an AI-powered food procurement platform that uses real-time price comparison, dispute tracking, and smart ordering.
Qadence offers older adults a co-pilot that assesses mobility and fall risk, then delivers tailored guidance to improve balance, track progress, and extend recovery beyond the clinic.
Sensopore offers an at-home diagnostic device to help families test for everyday illnesses at home, get connected with a telehealth doctor, and have prescriptions shipped to their door, reducing clinical visits.
Spheric Bio has developed a personal occlusion device to improve a common surgical procedure used to treat strokes.
Tapestry uses conversational AI to chat with attendees before events and connect them with the right people for more meaningful conversations.
Torque automates financial analysis across private equity portfolios to help investment professionals make better strategic decisions.
Trazo helps interior designers and architects collaborate and iterate on technical drawings and 3D designs of new construction or remodeling projects.
San Francisco Gets An Invasive Billionaire-Bought Surveillance HQ
San Francisco billionaire Chris Larsen once again has wielded his wallet to keep city residents under the eye of all-seeing police surveillance.
The San Francisco Police Commission, the Board of Supervisors, and Mayor Daniel Lurie have signed off on Larsen’s $9.4 million gift of a new Real-Time Investigations Center. The plan involves moving the city’s existing police tech hub from the public Hall of Justice not to the city’s brand-new police headquarters but instead to a sublet in the Financial District building of Ripple Labs, Larsen’s crypto-transfer company. Although the city won’t be paying for the space, the lease reportedly cost Ripple $2.3 million and will run until December 2026.
The deal will also include a $7.25 million gift from the San Francisco Police Community Foundation that Larsen created. Police foundations are semi-public fundraising arms of police departments that allow them to buy technology and gear that the city will not give them money for.
In Los Angeles, the city’s police foundation got $178,000 from the company Target to pay for the services of the data analytics company Palantir to use for predictive policing. In Atlanta, the city’s police foundation funds a massive surveillance apparatus as well as the much-maligned Cop City training complex. (Despite police foundations’ insistence that they are not public entities and therefore do not need to be transparent or answer public records requests, a judge recently ordered the Atlanta Police Foundation to release documentation related to Cop City.)
A police foundation in San Francisco brings the same concerns: that an unaccountable and opaque fundraising arm, schmoozing with corporations and billionaires, would fund unpopular surveillance measures without having to reveal much to the public.
Larsen was one of the deep pockets behind last year’s Proposition E, a ballot measure to supercharge surveillance in the city. The measure usurped the city’s 2019 surveillance transparency and accountability ordinance, which had required the SFPD to get the elected Board of Supervisors’ approval before buying and using new surveillance technology. This common-sense democratic hurdle was, apparently, a bridge too far for the SFPD and for Larsen.
We’re no fans of real-time crime centers (RTCCs), as they’re often called elsewhere, to start with. They’re basically control rooms that pull together all feeds from a vast warrantless digital dragnet, often including automated license plate readers, fixed cameras, officers’ body-worn cameras, drones, and other sources. It’s a means of consolidating constant surveillance of the entire population, tracking everyone wherever they go and whatever they do – worrisome at any time, but especially in a time of rising authoritarianism.
Think of what this data could do if it got into federal hands; imagine how vulnerable city residents could be subjected to harassment if their every move was centralized and recorded downtown. But you don’t have to imagine: SFPD has already been caught sharing automated license plate reader data with out-of-state law enforcement agencies assisting in federal immigration investigations.
We’re especially opposed to RTCCs using live feeds from non-city surveillance cameras to push that panopticon’s boundaries even wider, as San Francisco’s does. Those semi-private networks of some 15,000 cameras, already abused by SFPD to surveil lawful protests against police violence, were funded in part by – you guessed it – Chris Larsen.
These technologies could endanger San Franciscans by directing armed police at them on the strength of a faulty algorithm, and by putting already-marginalized communities at further risk of overpolicing and surveillance. Worse, studies find that these technologies just don’t work. If the goal is to stop crime before it happens—to spare someone the hardship and trauma of getting robbed or hurt—cameras clearly do not accomplish this. There is plenty of footage of crimes occurring that belies the idea that surveillance is an effective deterrent, and although police often look to technology as a silver bullet to fight crime, evidence suggests it does little to alter the historic ebbs and flows of criminal activity.
Yet now this unelected billionaire – who already helped gut police accountability and transparency rules and helped fund sketchy surveillance of people exercising their First Amendment rights – wants to bankroll, expand, and host the police’s tech nerve center.
Policing must be a public function so that residents can control, and demand accountability and transparency from, those who serve and protect but also surveil and track us all. Being financially beholden to private interests erodes the community’s trust and control, and can leave the public high and dry if a billionaire’s whims change or conflict with the will of the people. Chris Larsen could have tried to address the root causes of crime that affect our community; instead, he flexes his bank account’s muscle to decide that surveillance is best for San Franciscans with less in their wallets.
Elected officials should have said “thanks but no thanks” to Larsen and ensured that the San Francisco Police Department remained under the complete control and financial auspices of nobody except the people of San Francisco. Rich people should not be allowed to fund the further degradation of our privacy as we go about our lives in our city’s public places. Residents should carefully watch what comes next to decide for themselves whether a false sense of security is worth living under constant, all-seeing, billionaire-bankrolled surveillance.
Rayhunter: What We Have Found So Far
A little over a year ago we released Rayhunter, our open source tool designed to detect cell-site simulators. We’ve been blown away by the level of community engagement on this project. It has been installed on thousands of devices (or so we estimate; we don’t actually know, since Rayhunter doesn’t have any telemetry!). We have received dozens of packet captures; hundreds of improvements, both minor and major; documentation fixes; and bug reports from our open source community. This project is a testament to the power and impact of open source, community-driven counter-surveillance.
If this is your first time hearing about Rayhunter, you can read our announcement blog post here. Or if you prefer, you can watch our DEF CON talk. In short, Rayhunter is an open source Linux program that runs on a variety of mobile hotspots (dedicated devices that use a cellular connection to give you Wi-Fi). Rayhunter’s job is to look for cell-site simulators (CSS), a tool police use to locate or identify people's cell phones, also known as IMSI catchers or Stingrays. Rayhunter analyzes the “handshakes” between your Rayhunter device and the cell towers it is connected to for behaviors consistent with that of a CSS. When it finds potential evidence of a CSS it alerts the user with an indicator on the screen and potentially a push notification to their phone.
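To give a flavor of what a handshake-analysis rule can look like, here is a purely hypothetical sketch of a heuristic in the spirit of an “IMSI sent without authentication” signature. The message names and session model are our own simplification, not Rayhunter’s actual parser, rules, or code.

```python
# Hypothetical illustration of one CSS-detection heuristic (not Rayhunter's
# real implementation). A session is modeled as an ordered list of simplified
# message names exchanged between the device and the tower.

def flags_imsi_before_auth(session):
    """Return True if the network asked for the phone's permanent identity
    (IMSI) before any authentication exchange -- behavior consistent with a
    cell-site simulator that just wants to harvest identities."""
    authenticated = False
    for msg in session:
        if msg == "authentication_request":
            authenticated = True
        elif msg == "identity_request_imsi" and not authenticated:
            return True
    return False

# A legitimate tower authenticates first; a suspicious one asks for the IMSI cold.
legit = ["rrc_connection_setup", "authentication_request", "identity_request_imsi"]
suspect = ["rrc_connection_setup", "identity_request_imsi"]
```

Real detection engines work over parsed cellular protocol messages and combine many such signatures, but the basic shape — scan the handshake for behavior a genuine tower shouldn’t exhibit — is the same.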
Understanding if CSS are being used to spy on protests is one of the main goals of the Rayhunter project. Thanks to members of our community bringing Rayhunter to dozens of protests, we are starting to get a picture of how CSS are currently being used in the US. So far Rayhunter has not turned up any evidence of cell-site simulators being used to spy on protests in the US — though we have found them in use elsewhere.
There are a couple of caveats here. First, it’s often impossible to prove a negative. Maybe Rayhunter just hasn’t been at protests where CSS have been present. Maybe our detection signatures aren’t picking up the techniques used by US law enforcement. But we’ve received reports from a lot of protests, including pro-Palestine protests, protests in Washington DC and Los Angeles, as well as the ‘No Kings’ and ‘50501’ protests all over the country. So far, we haven’t seen evidence of CSS use at any of them.
A big part of the reason for the lack of CSS at protests could be that some courts have required a warrant for their use, and even law enforcement agencies not bound by those rulings often have policies requiring police to get a warrant. CSS are also costly to buy and operate, requiring trained personnel and nearly a million dollars’ worth of equipment.
The fact is, police also have potentially easier-to-use tools available. If the goal of using a CSS at a protest is to find out who was there, police could use tools such as:
- License plate readers to track the vehicles arriving at and leaving the protest.
- Location data brokers, such as Locate X and Fog Data Science, to track the phones of protestors by their mobile advertising IDs (MAID).
- Cellebrite and other forensic extraction tools to download all the data from phones of arrested protestors if they are able to unlock those phones.
- Geofence warrants, which require internet companies like Google to disclose the identifiers of devices within a given location at a given time.
- Facial recognition, such as Clearview AI, to identify everyone present via public or private databases of people’s faces.
- Tower dumps from phone companies, which, similar to geofence warrants, require phone companies to turn over a list of all the phones connected to a certain tower at a certain time.
Given the lack of evidence of CSS being used, we think protestors can worry less about CSS and more about these other techniques. Luckily, the actions you should take to protect yourself are largely the same:
- To protect yourself against Locate X and Fog you can turn off location services on your phone (iPhone and Android).
- To protect yourself from Cellebrite you can use a strong password, turn off biometric unlocks, and keep your phone up to date.
- To protect against facial recognition, you can wear a mask.
- To protect against tower dumps put your phone into airplane mode (though especially high risk individuals may want to use a Faraday bag instead).
We feel pretty good about Rayhunter’s detection engine, though there could still be things we are missing. Some of our confidence in Rayhunter’s detection engine comes from the research we have done into how CSS work. But the majority of our confidence comes from testing Rayhunter against a commercial cell-site simulator thanks to our friends at Cape. Rayhunter detected every attack run by the commercial CSS.
Where Rayhunter Has Detected Likely Surveillance
Rayhunter users have found potential evidence of CSS being used in the wild, though not at protests. One of the most interesting examples that triggered multiple detections and even inspired us to write some new detection rules was at a cruise port in the Turks and Caicos Islands. The person who captured this data put the packet captures online for other researchers to review.
Rayhunter users have detected likely CSS use in the US as well. We have received reports from Chicago and New York where our “IMSI Sent without authentication” signature was triggered multiple times over the course of a couple hours and then stopped. Neither report was in the vicinity of a protest. We feel fairly confident that these reports are indicative of a CSS being present, though we don’t have any secondary evidence to back them up.
We have received other reports that have triggered our CSS detection signatures, but the above examples are the ones we feel most confident about.
We encourage people to keep using Rayhunter and continue bringing it to protests. Law enforcement trends can change over time, and it is possible that some cities use CSS more often than others (for example, Fontana, California reportedly used its CSS over 300 times in two years). We also know that ICE still uses CSS and has recently renewed its contracts. Interestingly, in January, the FBI requested a warrant from the Foreign Intelligence Surveillance Court to use what was likely a CSS and was rejected. This was the first time the FBI had sought a warrant under the Foreign Intelligence Surveillance Act to use a CSS since 2015, when the Justice Department began requiring a warrant for their use. If police start using CSS to spy on protests, we want to know.
There is still a lot we want to accomplish with Rayhunter, and we have plans for the project that we are very excited to share in the near future. But the biggest thing we need right now is more testing outside of the United States.
Taking Rayhunter International

We are interested in getting Rayhunter data from every country to help us understand the global use of CSS and to refine our signatures. Just because CSS don't appear to be used to spy on protests in the US right now doesn't mean that is true everywhere. We have also seen that some signatures that work in the US are prone to false positives elsewhere (such as our 2G signature in countries that still have active 2G networks). The first device supported by Rayhunter, the Orbic hotspot, was US-only, so we have very little international data. But we now have support for multiple devices! If you are interested in Rayhunter, but can’t find a device that works in your country, let us know. We recommend you consult with an attorney in your country to determine whether running Rayhunter is likely to be legally risky or outlawed in your jurisdiction.
Related Cases: Carpenter v. United States

DOE selects MIT to establish a Center for the Exascale Simulation of Coupled High-Enthalpy Fluid–Solid Interactions
The U.S. Department of Energy’s National Nuclear Security Administration (DOE/NNSA) recently announced that it has selected MIT to establish a new research center dedicated to advancing the predictive simulation of extreme environments, such as those encountered in hypersonic flight and atmospheric re-entry. The center will be part of the fourth phase of NNSA's Predictive Science Academic Alliance Program (PSAAP-IV), which supports frontier research advancing the predictive capabilities of high-performance computing for open science and engineering applications relevant to national security mission spaces.
The Center for the Exascale Simulation of Coupled High-Enthalpy Fluid–Solid Interactions (CHEFSI) — a joint effort of the MIT Center for Computational Science and Engineering, the MIT Schwarzman College of Computing, and the MIT Institute for Soldier Nanotechnologies (ISN) — plans to harness cutting-edge exascale supercomputers and next-generation algorithms to simulate with unprecedented detail how extremely hot, fast-moving gaseous and solid materials interact. The understanding of these extreme environments — characterized by temperatures of more than 1,500 degrees Celsius and speeds as high as Mach 25 — and their effect on vehicles is central to national security, space exploration, and the development of advanced thermal protection systems.
“CHEFSI will capitalize on MIT’s deep strengths in predictive modeling, high-performance computing, and STEM education to help ensure the United States remains at the forefront of scientific and technological innovation,” says Ian A. Waitz, MIT’s vice president for research. “The center’s particular relevance to national security and advanced technologies exemplifies MIT’s commitment to advancing research with broad societal benefit.”
CHEFSI is one of five new Predictive Simulation Centers announced by the NNSA as part of a program expected to provide up to $17.5 million to each center over five years.
CHEFSI’s research aims to couple detailed simulations of high-enthalpy gas flows with models of the chemical, thermal, and mechanical behavior of solid materials, capturing phenomena such as oxidation, nitridation, ablation, and fracture. Advanced computational models — validated by carefully designed experiments — can address the limitations of flight testing by providing critical insights into material performance and failure.
“By integrating high-fidelity physics models with artificial intelligence-based surrogate models, experimental validation, and state-of-the-art exascale computational tools, CHEFSI will help us understand and predict how thermal protection systems perform under some of the harshest conditions encountered in engineering systems,” says Raúl Radovitzky, the Jerome C. Hunsaker Professor of Aeronautics and Astronautics, associate director of the ISN, and director of CHEFSI. “This knowledge will help in the design of resilient systems for applications ranging from reusable spacecraft to hypersonic vehicles.”
Radovitzky will be joined on the center’s leadership team by Youssef Marzouk, the Breene M. Kerr (1951) Professor of Aeronautics and Astronautics, co-director of the MIT Center for Computational Science and Engineering (CCSE), and recently named the associate dean of the MIT Schwarzman College of Computing; and Nicolas Hadjiconstantinou, the Quentin Berg (1937) Professor of Mechanical Engineering and co-director of CCSE, who will serve as associate directors. The center co-principal investigators include MIT faculty members across the departments of Aeronautics and Astronautics, Electrical Engineering and Computer Science, Materials Science and Engineering, Mathematics, and Mechanical Engineering. Franklin Hadley will lead center operations, with administration and finance under the purview of Joshua Freedman. Hadley and Freedman are both members of the ISN headquarters team.
CHEFSI expects to collaborate extensively with the DOE/NNSA national laboratories — Lawrence Livermore National Laboratory, Los Alamos National Laboratory, and Sandia National Laboratories — and, in doing so, offer graduate students and postdocs immersive research experiences and internships at these facilities.
Ten years later, LIGO is a black-hole hunting machine
The following article is adapted from a press release issued by the Laser Interferometer Gravitational-wave Observatory (LIGO) Laboratory. LIGO is funded by the National Science Foundation and operated by Caltech and MIT, which conceived and built the project.
On Sept. 14, 2015, a signal arrived on Earth, carrying information about a pair of remote black holes that had spiraled together and merged. The signal had traveled about 1.3 billion years to reach us at the speed of light — but it was not made of light. It was a different kind of signal: a quivering of space-time called gravitational waves first predicted by Albert Einstein 100 years prior. On that day 10 years ago, the twin detectors of the U.S. National Science Foundation Laser Interferometer Gravitational-wave Observatory (NSF LIGO) made the first-ever direct detection of gravitational waves, whispers in the cosmos that had gone unheard until that moment.
The historic discovery meant that researchers could now sense the universe through three different means. Light waves, such as X-rays, optical, radio, and other wavelengths of light, as well as high-energy particles called cosmic rays and neutrinos, had been captured before, but this was the first time anyone had witnessed a cosmic event through the gravitational warping of space-time. For this achievement, first dreamed up more than 40 years prior, three of the team’s founders won the 2017 Nobel Prize in Physics: MIT’s Rainer Weiss, professor emeritus of physics (who recently passed away at age 92); Caltech’s Barry Barish, the Ronald and Maxine Linde Professor of Physics, Emeritus; and Caltech’s Kip Thorne, the Richard P. Feynman Professor of Theoretical Physics, Emeritus.
Today, LIGO, which consists of detectors in both Hanford, Washington, and Livingston, Louisiana, routinely observes roughly one black hole merger every three days. LIGO now operates in coordination with two international partners, the Virgo gravitational-wave detector in Italy and KAGRA in Japan. Together, the gravitational-wave-hunting network, known as the LVK (LIGO, Virgo, KAGRA), has captured a total of about 300 black hole mergers, some of which are confirmed while others await further analysis. During the network’s current science run, the fourth since the first run in 2015, the LVK has discovered more than 200 candidate black hole mergers, more than double the number caught in the first three runs.
The dramatic rise in the number of LVK discoveries over the past decade is owed to several improvements to their detectors — some of which involve cutting-edge quantum precision engineering. The LVK detectors remain by far the most precise rulers ever created by humans. The space-time distortions induced by gravitational waves are incredibly minuscule. For instance, LIGO detects changes in space-time smaller than 1/10,000 the width of a proton. That’s 1/700 trillionth the width of a human hair.
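As a rough sanity check, that scale comparison can be reproduced in a few lines. The proton diameter and hair width below are approximate values assumed for illustration, not figures from LIGO, so the result only needs to land in the right ballpark:

```python
# Approximate, assumed sizes (not from LIGO):
proton = 1.7e-15   # m, rough proton diameter
hair = 1.0e-4      # m, rough human hair width

# "Changes in space-time smaller than 1/10,000 the width of a proton"
strain_displacement = proton / 10_000
fraction_of_hair = strain_displacement / hair

print(f"displacement ~ {strain_displacement:.1e} m")
print(f"~ 1 part in {1 / fraction_of_hair:.1e} of a hair width")
```

With these assumed inputs the displacement comes out around 10⁻¹⁹ meters, a few hundred trillionths of a hair width, consistent with the figures quoted above.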
“Rai Weiss proposed the concept of LIGO in 1972, and I thought, ‘This doesn’t have much chance at all of working,’” recalls Thorne, an expert on the theory of black holes. “It took me three years of thinking about it on and off and discussing ideas with Rai and Vladimir Braginsky [a Russian physicist], to be convinced this had a significant possibility of success. The technical difficulty of reducing the unwanted noise that interferes with the desired signal was enormous. We had to invent a whole new technology. NSF was just superb at shepherding this project through technical reviews and hurdles.”
Nergis Mavalvala, the Curtis and Kathleen Marble Professor of Astrophysics at MIT and dean of the MIT School of Science, says that the challenges the team overcame to make the first discovery are still very much at play. “From the exquisite precision of the LIGO detectors to the astrophysical theories of gravitational-wave sources, to the complex data analyses, all these hurdles had to be overcome, and we continue to improve in all of these areas,” Mavalvala says. “As the detectors get better, we hunger for farther, fainter sources. LIGO continues to be a technological marvel.”
The clearest signal yet
LIGO’s improved sensitivity is exemplified in a recent discovery of a black hole merger referred to as GW250114. (The numbers denote the date the gravitational-wave signal arrived at Earth: January 14, 2025.) The event was not that different from LIGO’s first-ever detection (called GW150914) — both involve colliding black holes about 1.3 billion light-years away with masses between 30 and 40 times that of our sun. But thanks to 10 years of technological advances reducing instrumental noise, the GW250114 signal is dramatically clearer.
“We can hear it loud and clear, and that lets us test the fundamental laws of physics,” says LIGO team member Katerina Chatziioannou, Caltech assistant professor of physics and William H. Hurt Scholar, and one of the authors of a new study on GW250114 published in Physical Review Letters.
By analyzing the frequencies of gravitational waves emitted by the merger, the LVK team provided the best observational evidence captured to date for what is known as the black hole area theorem, an idea put forth by Stephen Hawking in 1971 that says the total surface area of black holes cannot decrease. When black holes merge, their masses combine, increasing the surface area. But they also lose energy in the form of gravitational waves. Additionally, the merger can increase the combined black hole's spin, which leads to a smaller area. The black hole area theorem states that despite these competing factors, the total surface area must grow.
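The two competing effects can be made concrete with the standard textbook expression for the horizon area of a spinning (Kerr) black hole — this formula comes from general relativity texts, not from the article itself:

```latex
% Horizon area of a Kerr black hole of mass M and
% dimensionless spin \chi \in [0, 1):
A(M, \chi) = \frac{8 \pi G^{2} M^{2}}{c^{4}}
             \left( 1 + \sqrt{1 - \chi^{2}} \right)

% Larger M grows the area; larger \chi shrinks it.
% Hawking's area theorem for a merger of black holes 1 and 2:
A_{\mathrm{final}} \geq A_{1} + A_{2}
```

The theorem asserts that however mass loss and spin-up trade off in a given merger, the final area always exceeds the sum of the initial two.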
Later, Hawking and physicist Jacob Bekenstein concluded that a black hole’s area is proportional to its entropy, or degree of disorder. The findings paved the way for later groundbreaking work in the field of quantum gravity, which attempts to unite two pillars of modern physics: general relativity and quantum physics.
In essence, the LIGO detection allowed the team to “hear” two black holes growing as they merged into one, verifying Hawking’s theorem. (Virgo and KAGRA were offline during this particular observation.) The initial black holes had a total surface area of 240,000 square kilometers (roughly the size of Oregon), while the final area was about 400,000 square kilometers (roughly the size of California) — a clear increase. This is the second test of the black hole area theorem; an initial test was performed in 2021 using data from the first GW150914 signal, but because those data were not as clean, the results had a confidence level of 95 percent, compared to 99.999 percent for the new data.
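Those quoted areas are consistent with a back-of-the-envelope check using the non-spinning (Schwarzschild) horizon area formula. This is a simplification — spin, which the full analysis accounts for, reduces the area — and the 35- and 65-solar-mass figures below are illustrative values within the ranges quoted above, not the measured parameters:

```python
import math

# Schwarzschild (non-spinning) horizon area: A = 4*pi*r_s^2,
# with Schwarzschild radius r_s = 2*G*M/c^2. Spin is ignored,
# so these are upper bounds on the true (Kerr) areas.
G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
M_SUN = 1.989e30   # kg

def horizon_area_km2(mass_solar: float) -> float:
    r_s = 2 * G * mass_solar * M_SUN / c**2   # Schwarzschild radius, m
    return 4 * math.pi * r_s**2 / 1e6          # horizon area, km^2

# Two ~35-solar-mass black holes (illustrative, within the 30-40
# solar-mass range quoted above) merging into a ~65-solar-mass
# remnant (some mass-energy is radiated away as gravitational waves).
initial = 2 * horizon_area_km2(35)
final = horizon_area_km2(65)

print(f"initial total area ~ {initial:,.0f} km^2")
print(f"final area         ~ {final:,.0f} km^2")
assert final > initial   # the area theorem: total horizon area grows
```

This sketch lands in the few-hundred-thousand-square-kilometer range on both sides, with the final area clearly larger, matching the Oregon-to-California comparison in spirit even though the measured values reflect spinning black holes.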
Thorne recalls Hawking phoning him to ask whether LIGO might be able to test his theorem immediately after he learned of the 2015 gravitational-wave detection. Hawking died in 2018 and sadly did not live to see his theory observationally verified. “If Hawking were alive, he would have reveled in seeing the area of the merged black holes increase,” Thorne says.
The trickiest part of this type of analysis had to do with determining the final surface area of the merged black hole. The surface areas of pre-merger black holes can be more readily gleaned as the pair spiral together, roiling space-time and producing gravitational waves. But after the black holes coalesce, the signal is not as clear-cut. During this so-called ringdown phase, the final black hole vibrates like a struck bell.
In the new study, the researchers precisely measured the details of the ringdown phase, which allowed them to calculate the mass and spin of the black hole and, subsequently, determine its surface area. More specifically, they were able, for the first time, to confidently pick out two distinct gravitational-wave modes in the ringdown phase. The modes are like characteristic sounds a bell would make when struck; they have somewhat similar frequencies but die out at different rates, which makes them hard to identify. The improved data for GW250114 meant that the team could extract the modes, demonstrating that the black hole’s ringdown occurred exactly as predicted by math models based on the Teukolsky formalism — devised in 1972 by Saul Teukolsky, now a professor at Caltech and Cornell University.
Another study from the LVK, submitted to Physical Review Letters today, places limits on a predicted third, higher-pitched tone in the GW250114 signal, and performs some of the most stringent tests yet of general relativity’s accuracy in describing merging black holes.
“A decade of improvements allowed us to make this exquisite measurement,” Chatziioannou says. “It took both of our detectors, in Washington and Louisiana, to do this. I don’t know what will happen in 10 more years, but in the first 10 years, we have made tremendous improvements to LIGO’s sensitivity. This not only means we are accelerating the rate at which we discover new black holes, but we are also capturing detailed data that expand the scope of what we know about the fundamental properties of black holes.”
Jenne Driggers, detection lead senior scientist at LIGO Hanford, adds, “It takes a global village to achieve our scientific goals. From our exquisite instruments, to calibrating the data very precisely, vetting and providing assurances about the fidelity of the data quality, searching the data for astrophysical signals, and packaging all that into something that telescopes can read and act upon quickly, there are a lot of specialized tasks that come together to make LIGO the great success that it is.”
Pushing the limits
LIGO and Virgo have also unveiled neutron stars over the past decade. Like black holes, neutron stars form from the explosive deaths of massive stars, but they weigh less and glow with light. Of note, in August 2017, LIGO and Virgo witnessed an epic collision between a pair of neutron stars that produced a kilonova, an explosion that sent gold and other heavy elements flying into space and drew the gaze of dozens of telescopes around the world, which captured light ranging from high-energy gamma rays to low-energy radio waves. The “multi-messenger” astronomy event marked the first time that both light and gravitational waves had been captured from a single cosmic event. Today, the LVK continues to alert the astronomical community to potential neutron star collisions; astronomers then use telescopes to search the skies for signs of kilonovae.
“The LVK has made big strides in recent years to make sure we’re getting high-quality data and alerts out to the public in under a minute, so that astronomers can look for multi-messenger signatures from our gravitational-wave candidates,” Driggers says.
“The global LVK network is essential to gravitational-wave astronomy,” says Gianluca Gemme, Virgo spokesperson and director of research at the National Institute of Nuclear Physics in Italy. “With three or more detectors operating in unison, we can pinpoint cosmic events with greater accuracy, extract richer astrophysical information, and enable rapid alerts for multi-messenger follow-up. Virgo is proud to contribute to this worldwide scientific endeavor.”
Other LVK scientific discoveries include the first detection of collisions between one neutron star and one black hole; asymmetrical mergers, in which one black hole is significantly more massive than its partner black hole; the discovery of the lightest black holes known, challenging the idea that there is a “mass gap” between neutron stars and black holes; and the most massive black hole merger seen yet with a merged mass of 225 solar masses. For reference, the previous record holder for the most massive merger had a combined mass of 140 solar masses.
Even in the decades before LIGO began taking data, scientists were building foundations that made the field of gravitational-wave science possible. Breakthroughs in computer simulations of black hole mergers, for example, allow the team to extract and analyze the feeble gravitational-wave signals generated across the universe.
LIGO’s technological achievements, beginning as far back as the 1980s, include several far-reaching innovations, such as a new way to stabilize lasers using the so-called Pound–Drever–Hall technique. Invented in 1983 and named for contributing physicists Robert Vivian Pound, the late Ronald Drever of Caltech (a founder of LIGO), and John Lewis Hall, this technique is widely used today in other fields, such as the development of atomic clocks and quantum computers. Other innovations include cutting-edge mirror coatings that almost perfectly reflect laser light; “quantum squeezing” tools that enable LIGO to surpass sensitivity limits imposed by quantum physics; and new artificial intelligence methods that could further hush certain types of unwanted noise.
“What we are ultimately doing inside LIGO is protecting quantum information and making sure it doesn’t get destroyed by external factors,” Mavalvala says. “The techniques we are developing are pillars of quantum engineering and have applications across a broad range of devices, such as quantum computers and quantum sensors.”
In the coming years, the scientists and engineers of LVK hope to further fine-tune their machines, expanding their reach deeper and deeper into space. They also plan to use the knowledge they have gained to build another gravitational-wave detector, LIGO India. Having a third LIGO observatory would greatly improve the precision with which the LVK network can localize gravitational-wave sources.
Looking farther into the future, the team is working on a concept for an even larger detector, called Cosmic Explorer, which would have arms 40 kilometers long. (The twin LIGO observatories have 4-kilometer arms.) A European project, called Einstein Telescope, also has plans to build one or two huge underground interferometers with arms more than 10 kilometers long. Observatories on this scale would allow scientists to hear the earliest black hole mergers in the universe.
“Just 10 short years ago, LIGO opened our eyes for the first time to gravitational waves and changed the way humanity sees the cosmos,” says Aamir Ali, a program director in the NSF Division of Physics, which has supported LIGO since its inception. “There’s a whole universe to explore through this completely new lens and these latest discoveries show LIGO is just getting started.”
The LIGO-Virgo-KAGRA Collaboration
LIGO is funded by the U.S. National Science Foundation and operated by Caltech and MIT, which together conceived and built the project. Financial support for the Advanced LIGO project was led by NSF with Germany (Max Planck Society), the United Kingdom (Science and Technology Facilities Council), and Australia (Australian Research Council) making significant commitments and contributions to the project. More than 1,600 scientists from around the world participate in the effort through the LIGO Scientific Collaboration, which includes the GEO Collaboration. Additional partners are listed at my.ligo.org/census.php.
The Virgo Collaboration is currently composed of approximately 1,000 members from 175 institutions in 20 different (mainly European) countries. The European Gravitational Observatory (EGO) hosts the Virgo detector near Pisa, Italy, and is funded by the French National Center for Scientific Research, the National Institute of Nuclear Physics in Italy, the National Institute of Subatomic Physics in the Netherlands, The Research Foundation – Flanders, and the Belgian Fund for Scientific Research. A list of the Virgo Collaboration groups can be found on the project website.
KAGRA is a laser interferometer with 3-kilometer-long arms in Kamioka, Gifu, Japan. The host institute is the Institute for Cosmic Ray Research of the University of Tokyo, and the project is co-hosted by the National Astronomical Observatory of Japan and the High Energy Accelerator Research Organization. The KAGRA collaboration is composed of more than 400 members from 128 institutes in 17 countries/regions. KAGRA’s information for general audiences is at the website gwcenter.icrr.u-tokyo.ac.jp/en/. Resources for researchers are accessible at gwwiki.icrr.u-tokyo.ac.jp/JGWwiki/KAGRA.
Study explains how a rare gene variant contributes to Alzheimer’s disease
A new study from MIT neuroscientists reveals how rare variants of a gene called ABCA7 may contribute to the development of Alzheimer’s in some of the people who carry it.
Dysfunctional versions of the ABCA7 gene, which are found in a very small proportion of the population, contribute strongly to Alzheimer’s risk. In the new study, the researchers discovered that these mutations can disrupt the metabolism of lipids that play an important role in cell membranes.
This disruption makes neurons hyperexcitable and leads them into a stressed state that can damage DNA and other cellular components. These effects, the researchers found, could be reversed by treating neurons with choline, an important precursor needed to make cell membranes.
“We found pretty strikingly that when we treated these cells with choline, a lot of the transcriptional defects were reversed. We also found that the hyperexcitability phenotype and elevated amyloid beta peptides that we observed in neurons that lost ABCA7 was reduced after treatment,” says Djuna von Maydell, an MIT graduate student and the lead author of the study.
Li-Huei Tsai, director of MIT’s Picower Institute for Learning and Memory and the Picower Professor in the MIT Department of Brain and Cognitive Sciences, is the senior author of the paper, which appears today in Nature.
Membrane dysfunction
Genomic studies of Alzheimer’s patients have found that people who carry variants of ABCA7 that generate reduced levels of functional ABCA7 protein have about double the odds of developing Alzheimer’s as people who don’t have those variants.
ABCA7 encodes a protein that transports lipids across cell membranes. Lipid metabolism is also the primary target of a more common Alzheimer’s risk factor known as APOE4. In previous work, Tsai’s lab has shown that APOE4, which is found in about half of all Alzheimer’s patients, disrupts brain cells’ ability to metabolize lipids and respond to stress.
To explore how ABCA7 variants might contribute to Alzheimer’s risk, the researchers obtained tissue samples from the Religious Orders Study/Memory and Aging Project (ROSMAP), a longitudinal study that has tracked memory, motor, and other age-related changes in older people since 1994. Of about 1,200 samples in the dataset that had genetic information available, the researchers obtained 12 from people who carried a rare variant of ABCA7.
The researchers performed single-cell RNA sequencing of neurons from these ABCA7 carriers, allowing them to determine which other genes are affected when ABCA7 is missing. They found that the most significantly affected genes fell into three clusters related to lipid metabolism, DNA damage, and oxidative phosphorylation (the metabolic process that cells use to capture energy as ATP).
To investigate how those alterations could affect neuron function, the researchers introduced ABCA7 variants into neurons derived from induced pluripotent stem cells.
These cells showed many of the same gene expression changes as the cells from the patient samples, especially among genes linked to oxidative phosphorylation. Further experiments showed that the “safety valve” that normally lets mitochondria limit excess build-up of electrical charge was less active. This can lead to oxidative stress, a state that occurs when too many cell-damaging free radicals build up in tissues.
Using these engineered cells, the researchers also analyzed the effects of ABCA7 variants on lipid metabolism. Cells with the variants showed altered metabolism of a molecule called phosphatidylcholine, which could lead to membrane stiffness and may explain why the mitochondrial membranes of the cells were unable to function normally.
A boost in choline
Those findings raised the possibility that intervening in phosphatidylcholine metabolism might reverse some of the cellular effects of ABCA7 loss. To test that idea, the researchers treated neurons with ABCA7 mutations with a molecule called CDP-choline, a precursor of phosphatidylcholine.
As these cells began producing new phosphatidylcholine (both saturated and unsaturated forms), their mitochondrial membrane potentials also returned to normal, and their oxidative stress levels went down.
The researchers then used induced pluripotent stem cells to generate 3D tissue organoids made of neurons with the ABCA7 variant. These organoids developed higher levels of amyloid beta proteins, which form the plaques seen in the brains of Alzheimer’s patients. However, those levels returned to normal when the organoids were treated with CDP-choline. The treatment also reduced neurons’ hyperexcitability.
In a 2021 paper, Tsai’s lab found that CDP-choline treatment could also reverse many of the effects of another Alzheimer’s-linked gene variant, APOE4, in mice. She is now working with researchers at the University of Texas and MD Anderson Cancer Center on a clinical trial exploring how choline supplements affect people who carry the APOE4 gene.
Choline is naturally found in foods such as eggs, meat, fish, and some beans and nuts. Boosting choline intake with supplements may offer a way for many people to reduce their risk of Alzheimer’s disease, Tsai says.
“From APOE4 to ABCA7 loss of function, my lab demonstrates that disruption of lipid homeostasis leads to the development of Alzheimer’s-related pathology, and that restoring lipid homeostasis, such as through choline supplementation, can ameliorate these pathological phenotypes,” she says.
In addition to the rare variants of ABCA7 that the researchers studied in this paper, there is also a more common variant that is found at a frequency of about 18 percent in the population. This variant was thought to be harmless, but the MIT team showed that cells with this variant exhibited many of the same gene alterations in lipid metabolism that they found in cells with the rare ABCA7 variants.
“There’s more work to be done in this direction, but this suggests that ABCA7 dysfunction might play an important role in a much larger part of the population than just people who carry the rare variants,” von Maydell says.
The research was funded, in part, by the Cure Alzheimer’s Fund, the Freedom Together Foundation, the Carol and Gene Ludwig Family Foundation, James D. Cook, and the National Institutes of Health.
Inside EPA’s analysis for killing the endangerment finding
Trump allies at odds over attacks on offshore wind
Johnson backs Virginia wind project in break with Trump
EPA to soon propose rolling back greenhouse gas reporting
Tesla veers away from EV dominance
Massachusetts moves to relax environmental rules for housing projects
Calif. lawmakers propose exempting climate disclosure laws from environmental review
Greta Thunberg’s Gaza flotilla was hit by a drone, crew says
African leaders push for climate investment at Ethiopia summit
As temperatures rise, Americans are consuming more sugar
Podcast Episode: Building and Preserving the Library of Everything
All this season, “How to Fix the Internet” has been focusing on the tools and technology of freedom – and one of the most important tools of freedom is a library. Access to knowledge not only creates an informed populace that democracy requires, but also gives people the tools they need to thrive. And the internet has radically expanded access to knowledge in ways that earlier generations could only have dreamed of – so long as that knowledge is allowed to flow freely.
(You can also find this episode on the Internet Archive and on YouTube.)
A passionate advocate for public internet access and a successful entrepreneur, Brewster Kahle has spent his life intent on a singular focus: providing universal access to all knowledge. The Internet Archive, which he founded in 1996, now preserves 99+ petabytes of data - the books, Web pages, music, television, government information, and software of our cultural heritage – and works with more than 400 library and university partners to create a digital library that’s accessible to all. The Archive is known for the Wayback Machine, which lets users search the history of almost one trillion web pages. But it also archives images, software, video and audio recordings, and documents, and it contains dozens of resources and projects that fill a variety of gaps in cultural, political, and historical knowledge. Kahle joins EFF’s Cindy Cohn and Jason Kelley to discuss how the free flow of knowledge makes all of us more free.
In this episode you’ll learn about:
- The role AI plays in digitizing, preserving, and easing access to all kinds of information
- How EFF helped the Internet Archive fight off the government’s demand for information about library patrons
- The importance of building a decentralized, distributed web for finding and preserving information for all
- Why building revolutionary, world-class libraries like the Internet Archive requires not only money and technology, but also people willing to dedicate their lives to the work
- How nonprofits are crucial to filling societal gaps left by businesses, governments, and academia
Brewster Kahle is the founder and digital librarian of the Internet Archive, which is among the world’s largest libraries and serves millions of people each day. After studying AI at the Massachusetts Institute of Technology, graduating in 1982, Kahle helped launch the company Thinking Machines, a parallel supercomputer maker. In 1989, he helped create the internet's first publishing system, the Wide Area Information Server (WAIS); WAIS Inc. was later sold to AOL. In 1996, Kahle co-founded Alexa Internet, which helps catalog the Web, selling it to Amazon.com in 1999. He is a former member of EFF’s Board of Directors.
Resources:
- EFF Legal Cases: Internet Archive et al v Mukasey et al (NSA letter/gag order)
- EFF Legal Cases: Hachette v. Internet Archive (publishers’ lawsuit)
- National Public Radio: “As the Trump administration purges web pages, this group is rushing to save them” (March 23, 2025)
- BBC: “An inside look at how the Internet Archive saves the web” (May 26, 2025)
- American Libraries: “Newsmaker: Brewster Kahle” (June 4, 2025)
What do you think of “How to Fix the Internet?” Share your feedback here.
Transcript
BREWSTER KAHLE: I think we should start making some better decisions, a little bit more informed, a little better communication with not only people that are around the world and finding the right people we should be talking to, but also, well, standing on the shoulders of giants. I mean, we can then go and learn from all the things that people have learned in the past. It's pretty straightforward what we're trying to do here. It's just build a library.
CINDY COHN: That's Internet Archive founder Brewster Kahle on what life could look like if we all got to experience his dream of universal access to all human knowledge.
I'm Cindy Cohn, the executive director of the Electronic Frontier Foundation.
JASON KELLEY: And I'm Jason Kelley - EFF's activism director. And this is our podcast How to Fix the Internet.
CINDY COHN: This show is about what the world could look like if we get things right online - we hear from activists, computer engineers, thinkers, artists and today, a librarian, about their visions for a better digital future that we can all work towards.
JASON KELLEY: And our guest today is someone who has been actively making the internet a better place for several decades now.
CINDY COHN: Brewster Kahle is an early internet pioneer, and a longtime advocate for digitization. He’s a computer engineer but also a digital librarian, and he is of course best known as the founder of the Internet Archive and the Wayback Machine. EFF and the Archive are close allies and friends, and Brewster himself was a member of EFF’s Board of Directors for many years. I’m proud to say that the Archive is also a client of EFF, including most recently when we served as part of the legal team trying to protect true library lending of digital materials like ebooks and audiobooks.
JASON KELLEY: All season we’ve been focusing on the tools and technologies of freedom – and one of the most important tools of freedom is a library.
We started off our conversation by getting his take on the role that AI should play in his vision of a universally accessible library.
BREWSTER KAHLE: AI is absolutely critical and actually has been used for, well, a long period of time. You just think of, how does the magic of Google search happen, where you can just type a few words and get 10 links and several of them are actually really quite relevant. How do you do that? Those of us old enough to remember just keyword searching, that didn't work very well.
So it's going and using all this other information, metadata from other websites, but also learning from people, and machine learning at scale, that we've been able to make such progress.
Now there's the large language models, the generative AI, which is also absolutely fantastic. So we are digitizing obscure newsletters from theological missions in distant parts of the world. We are digitizing agricultural records from across decades of the 20th century.
And these materials are absolutely relevant now with climate change in our new environments because, well, things are moving. So the pests that used to be only in Mexico are now in Louisiana and Texas. It's completely relevant to go and learn from these, but it's not gonna be based on people going and doing keyword search and finding that newsletter and, and learning from it. It's gonna be based on these augmentations, but take all of these materials and try to make it useful and accessible to a generation that's used to talking to machines.
CINDY COHN: Yeah, I think that that's a really important thing. One of my favorite insights about AI is that it's a very different user interface. It's a way to have a conversational access to information. And I think AI represents one of those other shifts about how people think about accessing information. There's a lot of side effects of AI and we definitely have to be serious about those. But this shift can really help people learn better and find what they're looking for, but also find things that maybe they didn't think they were looking for.
BREWSTER KAHLE: If we do it well, if we do it with public AI that is respectful, there's the opportunity for engaging people in a deeper way, to be able to have them get to literature that has been packed away. And we've spent billions of dollars in the library system over centuries going and building these collections that are now going to be accessible, not just to the reference librarian, not just to researchers, but to kind of anybody.
JASON KELLEY: Can I dig into this backstory of yours a little bit? Because you know, a lot of people may know how you ended up building the Internet Archive, but I don't think they know enough. I'd like to get more people to sort of have a model in tech for what they can do if they're successful. And you were, if I understand it right, you were one of the early successful internet stories.
You sold a company or two in the nineties and you could have probably quit then and instead you ended up building the Internet Archive. Did you have this moment of deciding to do this and how did you end up in library school in the first place?
BREWSTER KAHLE: So I'm a little unusual in that I, I've only had one idea in my life, and so back in college in 1980 a friend posed, okay, you're an idealist. Yes. And a technologist. Yes. Paint a portrait that's better with your technology. It turned out that was an extremely difficult question to answer.
We were very good about complaining about things. You know, that was Cold War times and Nicaragua and El Salvador, and there's lots of things to complain about, but it was like, what would be better? So I only came up with two ideas. One was protect people's privacy, even though they were going to throw it away if they were given the chance.
And the other was build the library of everything. The digital library of Alexandria seemed too obvious. So I tried to work on the privacy one, but I couldn't make chips to encrypt voice conversations cheap enough to help the people I wanted to, but I learned how to make chips.
But then that got me engaged with the artificial intelligence lab at MIT and Danny Hillis and Marvin Minsky, they had this idea of building a thinking machine and to go and build a computer that was large enough to go and search everything. And that seemed absolutely critical.
So I helped work on that. Founded a company, Thinking Machines. That worked pretty well. So we got the massively parallel computers. We got the first search engine on the internet, then spun off a company to go and try to get publishers online called WAIS Incorporated. It came before the web, it was the first publishing system.
And so these were all steps in the path of trying to get to the library. So once we had publishers online, we also needed open source software. The free and open source software movement is absolutely critical to the whole story of how this whole thing came about, and open protocols, which was not the way people thought of things. They would go and make them proprietary and sue people and license things, but the internet world had this concept of how to share that ran very, very well. I wasn't central in the ARPANET-to-internet conversation. But I did have quite a bit to do with some of the free and open source software, the protocol development, the origins of the web.
And once we had publishers onboard, then I could turn my attention to building the library in 1996, so that's 28 years ago, something like that. And so we then said, okay, now we can build the library. What is that made up of? And we said, well, let's start with the web, right? The most fragile of media.
I mean, Tim's system, Tim Berners-Lee's system, was very easy to implement, which was kind of great and one of the keys for his success, but it had some really, basically broken parts of it. You think of publishers and they would go and make copies and sell them to individuals or libraries, and they would stay alive much longer than the publishers.
But the web, there's only one copy and it's only on one machine. And so if they change that, then it's gone. So you're asking publishers to be librarians, which is a really bad idea. And so we thought, okay, why don't we go and make a copy of everything that was on the web. Every page from every website every two months.
And turns out you could do that. That was my Altavista moment when I actually went to see Altavista. It was the big search engine before Google and it was the size of two Coke machines, and it was kind of wild to go and look - that's the whole web! So the idea that you could go and gather it all back up again, uh, was demonstrated by Altavista and the Internet Archive continued on with other media type after media type, after media type.
JASON KELLEY: I heard you talk about the importance of privacy to you, and I know Cindy's gonna wanna dig into that a little bit with some of the work that EFF and the Archive have done together.
CINDY COHN: Yeah, for sure. One of the things I think, you know, your commitment to privacy is something that I think is very, very important to you and often kind of gets hidden because the, you know, the archive is really important. But, you know, we were able to stand up together against national security letters, you know, long before some of the bigger cases that came later and I wanted to, you know, when you reached out to us and said, look, we've gotten this national security letter, we wanna fight back. Like, it was obvious to you that we needed to push back. And I wanna hear you talk about that a little bit.
BREWSTER KAHLE: Oh, this is a hero day. This is a hero moment for EFF and its own, you know, I, okay.
CINDY COHN: Well, and the Archive, we did it together.
BREWSTER KAHLE: Well, no, we just got the damn letter. You saved our butts. Okay. So how this thing worked was in 2001, they passed this terrible law, the Patriot Act, and they basically made it so almost any government official could ask any organization for anything they wanted, and there was a gag order. So not only could they just get any information, say on patrons' reading habits in a library, they could make it so that you can't tell anybody about it.
So I got sat down one day and Kurt Opsahl from EFF said, this isn't your best day. You just got a letter demanding information about a patron of the Internet Archive. I said, they can't do that. He said, yeah, they can. And I said, okay, well this doesn't make any sense. I mean, the librarians have a long history of dealing with people being surveilled on what it is they read and then rounded up and bad things happen to them, right? This is, this is something we know how that movie plays out.
So I said, Kurt, what, what can we do? And he said, you have to supply the data. I said, what if we don't? And he said, jail. That wasn't my favorite sentence. So is there anything else we can do? And he said, well, you can sue the United States government. (laughter)
OH! Well I didn't even know whether I could bring this up with my board. I mean, remember there's a gag order. So there was just a need to know to be able to find out from the engineers what it is we had, what we didn't have. And fortunately we never had very much information. 'cause we don't keep it, we don't keep IP addresses if we possibly can. We didn't have that much, but we wanted to push back. And then how do you do that? And if it weren't for the EFF, and then EFF got the ACLU involved on a pro bono basis, I would never have been able to pull it off! I would have to have answered questions to the finance division of how, why are we spending all this money on lawyers?
The gag order made it so absolutely critical for EFF to exist, and to be ready and willing and funded enough to take on a court case against the United States government without, you know, having to go into a fundraising round.
But because of you, all of you listeners out there donating to EFF, having that piggy bank made it so that they could spring to the defense of the Internet Archive. The great thing about this was that after this lawsuit was launched, the government wanted out of this lawsuit as fast as possible.
They didn't want to go and have a library going and getting a court case to take their little precious toy of this Patriot Act, National Security Letters, away from them. So they wanted out, but we wouldn't let them. We wanted to be able to talk about it. They had to go and release the gag order. And I think we're one of only two or three organizations that have ever talked publicly about the hundreds of thousands, if not millions, of national security letters, because we had EFF support.
CINDY COHN: Oh, thank you Brewster. That's very sweet. But it was a great honor to get to do this. And in hearing you talk about this future, I just wanna pull out a few of the threads. One is privacy and how important that is for access for information. Some people think of that as a different category, right? And it's not. It's part and parcel of giving people access to information.
I also heard the open source community and open protocols and making sure that people can, you know, crawl the web and do things with websites that might be different than the original creator wanted, but are still useful to society.
The other thing that you mentioned that I think it's important to lift up as well is, you know, when we're talking about AI systems, you're talking about public AI, largely. You're talking about things that similarly are not controlled by just one company, but are available so that the public really has access not only to the information, but to the tools that let them build the next thing.
BREWSTER KAHLE: Yes, the big thing I think I may have gotten wrong starting this whole project in 1980 was the relaxation of the antitrust laws in the United States, that we now have these monster organizations that are not only just dominating a country's telecom or publishing systems or academic access, but it's worldwide now.
So we have these behemoth companies. That doesn't work very well. We want a game with many winners. We want that level playing field. We wanna make it so that new innovators can come along and, you know, try it out, make it go. In the early web, we had this, we watched sort of the popularity and the movement of popularity. And so you could start out with a small idea and it could become quite popular without having to go through the gatekeepers. And that was different from when I was growing up. I mean, if you had a new idea for a kid's toy, trying to get that on the shelves in a bunch of toy stores was almost impossible.
So the idea of the web and the internet made it so that good ideas could surface and grow, and that can work as long as you don't allow people to be gatekeepers.
We really need a mechanism for people to be able to grow, have some respect, some trust. If we really decrease the amount of trust, which is kind of, there's a bonfire of trust right now, then a lot of these systems are gonna be highly friction-full.
And how do we go and make it so that, you know, we have people that are doing worthwhile projects, not exploiting every piece of surveillance that they have access to. And how do we build that actually into the architecture of the web?
CINDY COHN: That leads, I think, directly into the kind of work that the archive has done about championing the distributed web, the D-web work. And you've done a real lot of work to kind of create a space for a distributed web, a better web. And I want you to tell me a little bit about, you know, how does that fit into your picture of the future?
BREWSTER KAHLE: The wonderful thing about the internet still is that it can be changed. It's still built by people. They may be in corporations, but you can still make a big dent, and there were a couple "aha" moments for me in trying to, like, why do we build a better web? Right? What's the foundational parts that we need to be able to do that?
And we ended up with this centralization, not only of all the servers being in these colos that are operated by other companies and a cloud-based thing, other people own everything, that you can't go and just take your computer on your desk and be a first class internet thing. That used to be possible with Gopher and WAIS and the early web. So we lost some of those things, but we can get them back.
Jason Scott at the Internet Archive, working with volunteers all over, made emulators of the early computers like IBM PCs and Macintosh and these old computers, Commodore 64, Atari machines, and they would run in JavaScript in your browser, so you could click and go and download an IBM PC and it boots in your browser and it uses the Internet Archive as a giant floppy drive to run your favorite game from 20 years ago. The cool thing about that for me, yes, I could get to play all my old games, it was kind of great, but we also had this ability to run a full on computer in your browser, so you didn't even have to download and install something.
So you could go and be a computer on the internet, not just a consumer, a reader. You could actually be a writer, you could be a publisher, you could, you could do activities, you could, so that was fantastic. And then another big change was the protocols of the browsers change to allow peer-to-peer interactions. That's how you get, you know, Google Meet or you get these video things that are going peer to peer where there's no central authority going in, interrupting your video streams or whatever.
So, okay, with these tools in hand now, then we could try to realize part of the dream that a lot of us had originally, and even Tim Berners-Lee, of building a decentralized web. Could you make a web such that your website is not owned and controlled on some computer someplace, but actually exists everywhere and nowhere, kind of a peer-to-peer backend for the web.
Could you make it so that if you run a club, that you could do a WordPress-like website that would then not live anywhere, but as readers were reading it, they would also serve it. And there would be libraries that would be able to go and archive it as a living object, not as just snapshots of pages. That became possible. It turns out it's still very hard, and the Internet Archive started pulling together people, doing these summits and these different conferences to get discussions around this and people are running with it.
CINDY COHN: Yeah, and so I love this because I know so many people who go to the archive to play Oregon Trail, right? And I love it when I get a chance to say, you know, this isn't just a game, right? This is a way of thinking that is reflected in this. I kind of love that, you know, 'you have died of dysentery' becomes an entryway into a whole other way of thinking about the web.
JASON KELLEY: Let's take a quick moment to thank our sponsor. How to Fix The Internet is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology, enriching people's lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.
We also wanna thank EFF donors. You're the reason we exist, and EFF has been fighting for digital rights for 35 years, and that fight is bigger than ever. So please, if you like what we do, go to eff.org/pod to donate. And also, if you can’t make it in person to this year’s EFF awards where we celebrate the people working towards the better future we all care so much about, you can watch the whole event at eff.org/awards.
We also wanted to share that our friend Cory Doctorow has a new podcast. Have a listen to this:
WHO BROKE THE INTERNET TRAILER: How did the internet go from this? You could actually find what you were looking for right away, to this, I feel I can inhale. Spoiler alert, it was not an accident. I'm Cory Doctorow, host of Who Broke the Internet from CBC's Understood. In this four-part series, I'm gonna tell you why the internet sucks now, whose fault it is, and my plan to fix it. Find Who Broke the Internet on whatever terrible app you get your podcasts.
JASON KELLEY: And now back to our conversation with Brewster Kahle.
The fact that you do things like archive these old games is something that I think a lot of people don't know. There are just so many projects that the internet archive does and it is interesting to hear how they're sort of all building towards this better future that is sort of built, like, sort of makes up the bones of the work that you do. Can you talk about any of the other projects that you are particularly sort of proud of that maybe other people haven't heard about?
BREWSTER KAHLE: Yeah, and I really wanna apologize. If you go to archive.org, it is daunting. Most people find things to read in the Internet Archive or see in the Internet Archive mostly by going to search engines, or Wikipedia. For instance, we really dedicated ourselves to try to help reinforce Wikipedia. We started archiving all of the outbound links. And we figured out how to work with the communities to allow us to fix those broken links. So we've now fixed 22 million broken links in Wikipedia, and 10,000 more are added each day, pointing back to the Wayback Machine.
Also, there are about two million books that are linked straight in: if you click on a citation, it goes right to the right page so you can go and see it. Not only is this important for people cramming for their homework after hours, but it's also important for Wikipedians, because a link in Wikipedia that goes to someplace you can actually cite is a link that works, and it gets more weight.
And if we're going to have all the literature, the scholarly literature and the book literature, available in Wikipedia, it needs to be clickable. And you can't click your way into an Overdrive borrowed book from your library. You have to be able to do this from something like the Internet Archive. So Wikipedia, reinforcing Wikipedia.
Another is television. We've been archiving television 24 hours a day since the year 2000. Russian, Chinese, Japanese, Iraqi, Al Jazeera, BBC, CNN, ABC, Fox, 24 hours a day, DVD quality. And not all of it is available, but the US television news you can search and find things in. And we're also doing summarizations now, so you can start to understand – in English – what is Russian State television telling the Russians? So we can start to get perspectives. Or look inside other people's bubbles to be able to get an idea of what's going on. Or a macroscope ability to step back and get the bigger picture. That's what libraries are for: to go and use these materials in new and different ways that weren't the way that the publishers originally intended.
Other things: we're digitizing about 3,000 books a day. So that's going along well. Then we are doing Democracy's Library. Democracy's Library, I think, is a cool one. So democracies need an educated populace, so they tend to publish openly. Authoritarian governments and corporations don't care about having an educated populace. That's not their goal. They have other goals. But democracies want things to be openly available.
But it turns out that even though the United States, for instance, and all democracies publish openly, most of those materials are not available publicly. They may be available in some high priced database system of somebody or other. But mostly they're just not available at all.
So we launched the Democracy's Library project to go and take all of the published works at the federal level, the provincial and state level, and the municipal level, and make that all available in bulk and in services, so that other people could also go and build new services on this. We launched it with Canada and the United States. The Canadians are kicking the United States' butt. I mean, they're doing so great. So Internet Archive Canada, working with the University of Toronto and universities all over, has already digitized all of the federal print materials, and by working with the national library there has archived the government websites in Canada.
In the United States we've been archiving, with the help of many others, including, historically, the Library of Congress and the National Archives, to go and collect all of the web pages and services and data sets from all of the United States federal websites from before and after every presidential election. It's called the End of Term Crawl, and this has been going on since 2008, and we've gotten into a lot of news recently because this administration has decided to take a lot of materials off the web. And again, asking a publisher, whether it's a government or commercial publisher or a social media publisher, to go and be their own archive or their own library is a bad idea. "Don't trust a corporation to do a library's job," was what one headline said.
So we've been archiving all of these materials and making them available. Now, can we weave them back into the web with the right URLs? No, not yet. That's up to the browser companies and also some of the standards organizations. But it's, at least it's there and you can go to the Wayback Machine to find it.
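The "go to the Wayback Machine to find it" step described here can also be done programmatically: the Internet Archive exposes a simple public availability API for the Wayback Machine that returns the closest archived snapshot of a URL as JSON. A minimal sketch follows; the endpoint and response shape match the Archive's published documentation, while the helper function names are our own illustrative choices:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

WAYBACK_API = "https://archive.org/wayback/available"

def availability_url(url, timestamp=None):
    """Build a query URL for the Wayback Machine availability API.

    `timestamp` (YYYYMMDD[hhmmss]) asks for the snapshot closest to that time.
    """
    params = {"url": url}
    if timestamp:
        params["timestamp"] = timestamp
    return WAYBACK_API + "?" + urlencode(params)

def closest_snapshot(api_response):
    """Parse the API's JSON reply; return (snapshot_url, timestamp) or None."""
    data = json.loads(api_response)
    snap = data.get("archived_snapshots", {}).get("closest")
    if snap and snap.get("available"):
        return snap["url"], snap["timestamp"]
    return None

# Live lookup (requires network access):
# with urlopen(availability_url("example.com", "20080101")) as resp:
#     print(closest_snapshot(resp.read().decode("utf-8")))
```

The URL building and JSON parsing are kept separate from the network call, so a script can check whether a dead government page has an archived copy, and the parsing logic can be exercised offline against a canned response.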
So the Internet Archive is about the 200th most popular website.
We get millions of people a day coming to the website, and about 6 million more using the Internet Archive's resources without ever coming to the website. So it's just woven into the fabric of the web. So people say, oh, I've never heard of that, never used it. It's like, you probably have. It's just part of how the internet works; it's plumbing.
So those are the aspects of the Internet Archive that are currently going on. We have people coming in all the time saying, now, are you doing this? And I say, no, but you can, and we can be infrastructure for you. I think of the Internet Archive as infrastructure for obsessives. So to the people that say, I really need this to persist to the next generation, we say, great, what do you need? How do we make that come true?
CINDY COHN: Yeah, I think that's both the superpower and in some ways the thing that the Internet Archive struggles with, which is that when you're infrastructure, people don't think about you, and they don't wanna think about you, so that when you come under attack, it's hard to get people to see what they might be losing.
And I think one of the things that, you know, one of the reasons I wanted you to come on here and talk about the archive is I think we need to start making some of that invisible stuff visible because it's not magic. It's not automatic. It takes, you know, I mean, your personal courage in standing up is wonderful, but there need to be hundreds and thousands and hundreds of thousands saying, you know, this is our library, this is our future.
This is, you know, this is important, and we need to stand up, and hopefully if we stand up enough, you know, we don't have to do it every four years or so. But you know, the number of people who I sent to the Wayback Machine when they were very, very worried about US government information going down, and pointed out, look, you know, the archive's been quietly doing this for, you know, nearly 20 years now, is a lot. And that's because again, you're kind of quietly doing the important work.
And so, you know, my hope is that, with this podcast and otherwise, we get a little more attention so that we can really build this better future, and maybe in the better future we don't have to think about it again. But right now there's a lot of different kinds of attacks.
BREWSTER KAHLE: It's a challenging time, especially in the United States, for libraries. There's the book bannings, defunding. Probably structurally the worst thing is the licensing model, the idea that there's no digital ownership. I mean, just really bad behavior on the part of the corporations. But Internet Archive Canada is doing well. Internet Archive Europe is coming back up and serving interesting roles with public AI, to go and do publicly oriented, values-driven AI technology, which is kind of great. We'd like to see internet archives planted in lots of places. The idea that we can just depend on the United States jurisdiction for being the information resource for the world? I think that train is gone.
So let's go and build a robust infrastructure. It's kinda like what we saw with the internet. Can we build internet archives all over the world? And that takes not only money, but actually the money part is probably not the hardest part. It's people interested in dedicating their lives to open – to open source software, free and open source software, open access materials, the infrastructure – to step out and work in non-profits, as opposed to some of the very tempting stock option deals that come from these VC-funded whatevers, and to do the good work that they can point to and be proud of for the rest of their lives.
CINDY COHN: Yeah. And there is something so important about that, about getting to wake up every day and feel like you're making the world better. And I think your particular story about this, because you know, you made money early on, you did some companies and you decided to dig back into the public side of the work rather than, you know, stepping back and becoming a VC or, you know, buying your third island, or those kinds of things.
And I think that one of the things that's important is that I feel like there's a lot of people who don't think that you can be a technologist and a successful person without being an asshole. And, you know, I think you're a good counterexample: somebody who is deeply technical, who thinks about things in a, you know, how-do-we-build-better-infrastructure way, who understands how all of these systems work, and who uses that information to build good, rather than, you know, necessarily deciding that the best thing to do is to take over a local government and build a small fiefdom for yourself.
BREWSTER KAHLE: Well, thank you for that. And yes, for-profit entities are gasoline. They're explosive and they don't tend to last long. But I think one of the best ideas the United States has come up with is the 501(c)(3) public charity, which is not the complete antidote to the C corporations that were also put across by the United States since World War II in ways that shouldn't have been, but the 501(c)(3) public charities are interesting. They tend to last longer. They take away the incentive to sell out, yet leave an ability to be an operational entity. You just have to do public good. You have to actually live and walk the walk and go and do that. But I think it's a fabulous structure. I mean, you, Cindy, how old is the EFF now?
CINDY COHN: 35. This is our 35th anniversary.
BREWSTER KAHLE: That's excellent. And the Internet Archive is like 28, 29 years old, and that's a long time for commercial entities or tech! Things in the tech world tend to turn over. So if you wanna build something long term, and you're willing to do, as Lessig would put it, only some rights reserved, or some profit motive reserved, then the 501(c)(3) public charity, a model other countries are adopting, is a mechanism for building infrastructure that can last a long time, where you get your alignment with the public interest.
CINDY COHN: Yeah, I think that's right. And it's been interesting to me, you know, being in this space for a really long time: nonprofit salaries may not be as high, but the jobs are more stable. Like, we don't have the waves of layoffs in our sector. I mean, occasionally, for sure, that is a thing that happens in the nonprofit digital rights sector. But I would say compared to the for-profit world, it's a much more stable structure, um, because you don't have this gasoline idea, these kinds of highs and lows and ups and downs. And that could be, you know, there's nothing wrong with riding that wave and making some money. But the question becomes, well, what do you do after that? Do you take that path to begin with? Or do you take that path later, when you've got some assets? You know, some people come outta school with loans and things like that.
BREWSTER KAHLE: So we need this intermediary between the academic, the dot edu, and the dot com, and I think the dot org is such a thing. And also there was a time when we did a lot in dot gov of bringing civic tech. And civic tech in Canada is up and running and wonderful. So there's things that we can do in that.
We can also spread these ideas into other sectors like banking. How about some nonprofit banks, please? Why don't we have some nonprofit housing that actually supports nonprofit workers? We're doing an experiment with that to try to help support people that want to work in San Francisco for nonprofits and not feel that they have to commute from hours away.
So can we go and take some of these ideas pioneered by Richard Stallman, Larry Lessig, Vint Cerf, the Cindy Cohns, and go and try it in new sectors? You're doing a law firm, one of the best of the Silicon Valley law firms, and you give away your product. Internet Archive gives away its product. Wikipedia gives away its product. This is, like, not supposed to happen, but it works really well. And it requires support and interest of people to work there and also to support it from the outside. But it functions so much better. There's less friction. It's easier for us to work with other nonprofits than it is to work with for-profits.
JASON KELLEY: Well, I'm glad that you brought up the nonprofit points and really dug into it, because earlier, Brewster, you mentioned that the reason you were able to fight back against the national security letters is that EFF has supporters that keep it going, and those same supporters, the people listening to this, are hopefully, and probably, the ones that help keep the Archive going. And I just wanted to make sure people know that the Archive is also supported by donors. And, uh, there's nothing wrong with supporting both EFF and the Archive, and I hope everyone does both.
CINDY COHN: Yeah. There's a whole community. And one of the things that Brewster has really been a leader in is seeing and making space for us to think of ourselves as a community. Because we're stronger together. And I think that's another piece of the somewhat quiet work that Brewster and the Archive do: knitting together the open world into thinking of itself as an open world, and able to move together and leverage each other.
BREWSTER KAHLE: Well, thank you for all the infrastructure EFF provides. And if anybody's in San Francisco, come over on a Friday afternoon! We give a tour, and if I'm here, I give the tour and try to help answer questions. We even have ice cream. And so the idea is to go and invite people into this other alternative form of success that maybe they weren't taught about in business school, or, uh, you know, they want to go off and do something else.
That's fine, but at least understand a little bit of how the underlying structures of the internet work, whether it's some of the original plumbing, um, some of these visions of Wikipedia, the Internet Archive. How do we make all of this work? It's by working together, trusting each other to try to do things right, even when the technology allows you to do things that are abusive. Stepping back from that, building, uh, the safeguards into the technology eventually, and celebrating what we can get done to support a better civic infrastructure.
CINDY COHN: That is the perfect place to end it. Thank you so much, Brewster, for coming on and bringing your inspiration to us.
JASON KELLEY: I loved that we wrapped up the season with Brewster because really there isn't anything more important, in a lot of ways, to freedom than a library. And the tool of freedom that Brewster built, the Internet Archive and all of the different pieces of it, is something that I think is so critical to how people think about the internet and what it can do, and honestly, it's taken for granted. I think once you start hearing Brewster talk about it, you realize just how important it is. I just love hearing from the person who thought of it and built it.
CINDY COHN: Yeah, he's so modest. The "I only had one idea," right? Or two ideas, you know: one is privacy and the other is universal access to all the world's information. You know, just some little things.
JASON KELLEY: Just a few things that he built into practice.
CINDY COHN: Well, and you know, he and a lot of other people. I think he's the first to point out that there are a lot of people working in this sector, and it's important that we think about it that way.
It does take the long view to build things that will last. And then I think he also really talked about the nonprofit sector and how, you know, that space is really important. And I liked his framing of it being kind of in between the dot edu, the academics and the dot com, that the dot orgs play this important role in bringing the public into the conversation about tech, and that's certainly what he's done.
JASON KELLEY: I loved how much of a positive pitch this was for nonprofits. I think when a lot of people think of charities, they don't necessarily think about EFF or the Internet Archive, but this tech sector of nonprofits is, you know, that community you talked about, all working together to sort of build this structure that protects people's rights online and also gives them access to these incredible tools and projects and resources. And, you know, everyone listening to this is probably a part of that community in one way or another. It's much bigger than I think people realize.
CINDY COHN: Yeah. And whether you're contributing code or doing lawyering or doing activism, you know, there's spaces throughout, and those are just three of the things that we do.
But the other piece, and, you know, I was of course very honored that he told the story about national security letters, is that we can support each other. Right? When somebody in this community comes under attack, that's where EFF often shows up. And when, you know, he said people have ideas and they wanna be able to develop them, you know, the Archive provides the infrastructure. All of this stuff is really important, and important to lean into in this time when we're really seeing a lot of public institutions and nonprofit institutions coming under attack.
What I really love about this season, Jason, is the way we've been able to shine our little spotlight on a bunch of different pieces of the sector. And there's so many more. You know, as somebody who started in this digital world in the nineties, when, you know, I could present all of the case law about the internet on one piece of paper in a 20-minute presentation.
You know, watching this grow out and seeing that it's just the beginning has been really fun, to be able to talk to all of these pieces. And you know, to me the good news is that people, you know, sometimes their stories get presented as if they're alone, as if it's kind of a superhero narrative: there's this lone Brewster Kahle who's out there doing things. And of course that's partly true. Again, Brewster's somebody who I readily point to when people need an example of somebody who did really well in tech but didn't completely become a money grubbing jerk as a result of it, but instead, you know, plowed it back into the community. It's important to have people like that, but it's also important to recognize that this is a community and that we're building it, and that it's got plenty of space for the next person to show up and throw in ideas.
At least I hope that's how, you know, we fix the internet.
JASON KELLEY: And that's it for this episode and for this season. Thank you to Brewster for the conversation today, and to all of our guests this season for taking the time to share their insight, experience, and wisdom with us these past few months. Everybody who listens gets to learn a little bit more about how to fix the internet.
That is our goal at EFF. And every time I finish one of these conversations, I think, wow, there's a lot to do. So thank you so much for listening. If you wanna help us do that work, go to eff.org/pod and you can donate, become a member, and um, we have 30,000 members, but we could always use a few more because there is a lot to fix.
Thank you so much. Our theme music is by Nat Keefe of BeatMower with Reed Mathis. And How to Fix the Internet is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. I'm Jason Kelley.
CINDY COHN: And I'm Cindy Cohn.
MUSIC CREDITS: This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by its creators: Drops of H2O (The Filtered Water Treatment) by J.Lang. Additional music, theme remixes and sound design by Gaetan Harris.