Feed aggregator

3 Questions: How a new mission to Uranus could be just around the corner

MIT Latest News - Tue, 09/30/2025 - 8:00am

The successful test of SpaceX’s Starship launch vehicle, following a series of engineering challenges and failed launches, has reignited excitement over the possibilities this massive rocket may unlock for humanity’s greatest ambitions in space. The largest rocket ever built, Starship and its 33-engine “Super Heavy” booster completed a full launch into Earth orbit on Aug. 26, deployed eight test prototype satellites, and survived reentry for a simulated landing before coming down, mostly intact, in the Indian Ocean. The 400-foot rocket is designed to carry up to 150 tons of cargo to low Earth orbit, dramatically increasing potential payload capacity compared with rockets currently in operation. In addition to the planned Artemis III mission to the lunar surface and proposed missions to Mars in the near future, Starship also presents an opportunity for large-scale scientific missions throughout the solar system.

The National Academy of Sciences Planetary Science Decadal Survey published a recommendation in 2022 outlining exploration of Uranus as its highest-priority flagship mission. This proposed mission was envisioned for the 2030s, assuming use of a Falcon Heavy expendable rocket and anticipating arrival at the planet before 2050. Earlier this summer, a paper from researchers in MIT’s Engineering Systems Lab found that Starship may enable this flagship mission to Uranus in half the flight time. 

In this 3Q, Chloe Gentgen, a PhD student in aeronautics and astronautics and co-author on the recent study, describes the significance of Uranus as a flagship mission and what the current trajectory of Starship means for scientific exploration.

Q: Why has Uranus been identified as the highest-priority flagship mission? 

A: Uranus is one of the most intriguing and least-explored planets in our solar system. The planet is tilted on its side, is extremely cold, presents a highly dynamic atmosphere with fast winds, and has an unusual and complex magnetic field. A few of Uranus’ many moons could be ocean worlds, making them potential candidates in the search for life in the solar system. The ice giants Uranus and Neptune also represent the closest match to most of the exoplanets discovered. A mission to Uranus would therefore radically transform our understanding of ice giants, the solar system, and exoplanets. 

What we know about Uranus largely dates back to Voyager 2’s brief flyby nearly 40 years ago. No spacecraft has visited Uranus or Neptune since, making them the only planets yet to have a dedicated orbital mission. One of the main obstacles has been the sheer distance. Uranus is 19 times farther from the sun than the Earth is, and nearly twice as far as Saturn. Reaching it requires a heavy-lift launch vehicle and trajectories involving gravity assists from other planets. 

Today, such heavy-lift launch vehicles are available, and trajectories have been identified for launch windows throughout the 2030s, which led to the selection of a Uranus mission as the highest-priority flagship in the 2022 decadal survey. The proposed concept, called Uranus Orbiter and Probe (UOP), would release a probe into the planet’s atmosphere and then embark on a multiyear tour of the system to study the planet’s interior, atmosphere, magnetosphere, rings, and moons. 

Q: How do you envision your work on the Starship launch vehicle being deployed for further development?

A: Our study assessed the feasibility and potential benefits of launching a mission to Uranus with a Starship refueled in Earth’s orbit, instead of a Falcon Heavy (another SpaceX launch vehicle, currently operational). The Uranus decadal study showed that launching on a Falcon Heavy Expendable results in a cruise time of at least 13 years. Long cruise times present challenges, such as loss of team expertise and a higher operational budget. With the mission not yet underway, we saw an opportunity to evaluate launch vehicles currently in development, particularly Starship. 

When refueled in orbit, Starship could launch a spacecraft directly to Uranus, without detours by other planets for gravity-assist maneuvers. The proposed spacecraft could then arrive at Uranus in just over six years, less than half the time currently envisioned. These high-energy trajectories require significant deceleration at Uranus to capture into orbit. If the spacecraft slows down propulsively, the burn would require 5 km/s of delta-v (which quantifies the velocity change, and hence the propellant, needed for the maneuver), a much larger burn than spacecraft typically perform, which might result in a very complex design. A more conservative approach, assuming a maximum burn of 2 km/s at Uranus, would result in a cruise time of 8.5 years. 
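To get an intuition for why a 5 km/s capture burn is so demanding, the Tsiolkovsky rocket equation relates delta-v to the fraction of a spacecraft’s mass that must be propellant. The short sketch below compares the 5 km/s and 2 km/s cases under an assumed specific impulse of about 320 seconds, a placeholder typical of bipropellant engines rather than a figure from the study.

```python
import math

def propellant_fraction(delta_v_m_s: float, isp_s: float, g0: float = 9.81) -> float:
    """Tsiolkovsky rocket equation: fraction of initial mass that must be propellant."""
    return 1.0 - math.exp(-delta_v_m_s / (isp_s * g0))

# Assumed specific impulse of ~320 s (illustrative placeholder, not a value from the study).
for dv in (5000.0, 2000.0):
    frac = propellant_fraction(dv, isp_s=320.0)
    print(f"delta-v = {dv / 1000:.0f} km/s -> ~{frac * 100:.0f}% of the arriving mass must be propellant")
```

Under these assumptions, roughly 80 percent of the arriving mass would need to be propellant for a 5 km/s burn, versus about 47 percent for 2 km/s, which is why such a large propulsive capture complicates the spacecraft design.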

An alternative to propulsive orbit insertion at Uranus is aerocapture, where the spacecraft, enclosed in a thermally protective aeroshell, dips into the planet’s atmosphere and uses aerodynamic drag to decelerate. We examined whether Starship itself could perform aerocapture, rather than being separated from the spacecraft shortly after launch. Starship is already designed to withstand atmospheric entry at Earth and Mars, and thus already has a thermal protection system that could, potentially, be modified for aerocapture at Uranus. While bringing a Starship vehicle all the way to Uranus presents significant challenges, our analysis showed that aerocapture with Starship would produce deceleration and heating loads similar to those of other Uranus aerocapture concepts and would enable a cruise time of six years.

In addition to launching the proposed spacecraft on a faster trajectory that would reach Uranus sooner, Starship’s capabilities could also be leveraged to deploy larger masses to Uranus, enabling an enhanced mission with additional instruments or probes.

Q: What does the recent successful test of Starship tell us about the viability and timeline for a potential mission to the outer solar system?

A: The latest Starship launch marked an important milestone for the company after three failed launches in recent months, renewing optimism about the rocket’s future capabilities. Looking ahead, the program will need to demonstrate on-orbit refueling, a capability central to both SpaceX’s long-term vision of deep-space exploration and this proposed mission.

Launch vehicle selection for flagship missions typically occurs approximately two years after the official mission formulation process begins, which has not yet commenced for the Uranus mission. As such, Starship still has a few more years to demonstrate its on-orbit refueling architecture before a decision has to be made.

Overall, Starship is still under development, and significant uncertainty remains about its performance, timelines, and costs. Even so, our initial findings paint a promising picture of the benefits that could be realized by using Starship for a flagship mission to Uranus.

3 Questions: Addressing the world’s most pressing challenges

MIT Latest News - Tue, 09/30/2025 - 8:00am

The Center for International Studies (CIS) empowers students, faculty, and scholars to bring MIT’s interdisciplinary style of research and scholarship to bear on complex global challenges. 

In this Q&A, Mihaela Papa, the center's director of research and a principal research scientist at MIT, describes her role as well as research within the BRICS Lab at MIT — a reference to the BRICS intergovernmental organization, which comprises the nations of Brazil, Russia, India, China, South Africa, Egypt, Ethiopia, Indonesia, Iran, and the United Arab Emirates. She also discusses the ongoing mission of CIS to tackle the world's most complex challenges in new and creative ways.

Q: What is your role at CIS, and some of your key accomplishments since joining the center just over a year ago?

A: I serve as director of research and principal research scientist at CIS, a role that bridges management and scholarship. I oversee grant and fellowship programs, spearhead new research initiatives, build research communities across our center's area programs and MIT schools, and mentor the next generation of scholars. My academic expertise is in international relations, and I publish on global governance and sustainable development, particularly through my new BRICS Lab. 

This past year, I focused on building collaborative platforms that highlight CIS’ role as an interdisciplinary hub and expand its research reach. With Evan Lieberman, the director of CIS, I launched the CIS Global Research and Policy Seminar series to address current challenges in global development and governance, foster cross-disciplinary dialogue, and connect theoretical insights to policy solutions. We also convened a Climate Adaptation Workshop, which examined promising strategies for financing adaptation and advancing policy innovation. We documented the outcomes in a workshop report that outlines a broader research agenda contributing to MIT’s larger climate mission.

In parallel, I have been reviewing CIS’ grant-making programs to improve how we serve our community, while also supporting regional initiatives such as research planning related to Ukraine. Together with the center's MIT-Brazil faculty director Brad Olsen, I secured a MITHIC [MIT Human Insight Collaboration] Connectivity grant to build an MIT Amazonia research community that connects MIT scholars with regional partners and strengthens collaboration across the Amazon. Finally, I launched the BRICS Lab to analyze transformations in global governance and have ongoing research on BRICS and food security and data centers in BRICS. 

Q: Tell us more about the BRICS Lab.

A: The BRICS countries are home to the majority of the world’s population and account for an expanding share of the global economy. [Originally comprising Brazil, Russia, India, and China, BRICS currently includes 11 nations.] As a group, they carry the collective weight to shape international rules, influence global markets, and redefine norms — yet the question remains: Will they use this power effectively? The BRICS Lab explores the implications of the bloc’s rise for international cooperation and its role in reshaping global politics. Our work focuses on three areas: the design and strategic use of informal groups like BRICS in world affairs; the coalition’s potential to address major challenges such as food security, climate change, and artificial intelligence; and the implications of U.S. policy toward BRICS for the future of multilateralism.

Q: What are the center’s biggest research priorities right now?

A: Our center was founded in response to rising geopolitical tensions and the urgent need for policy rooted in rigorous, evidence-based research. Since then, we have grown into a hub that combines interdisciplinary scholarship and actively engages with policymakers and the public. Today, as in our early years, the center brings together exceptional researchers with the ambition to address the world’s most pressing challenges in new and creative ways.

Our core focus spans security, development, and human dignity. Security studies have been a priority for the center, and our new nuclear security programming advances this work while training the next generation of scholars in this critical field. On the development front, our work has explored how societies manage diverse populations, navigate international migration, and engage with human rights and the changing patterns of regime dynamics.

We are pursuing new research in three areas. First, on climate change, we seek to understand how societies confront environmental risks and harms, from insurance to water and food security in the international context. Second, we examine shifting patterns of global governance as rising powers set new agendas and take on greater responsibilities in the international system. Finally, we are initiating research on the impact of AI — how it reshapes governance across international relations, what role AI corporations play, and how AI-related risks can be managed.

As we approach our 75th anniversary in 2026, we are excited to bring researchers together to spark bold ideas that open new possibilities for the future.

Saab 340 becomes permanent flight-test asset at Lincoln Laboratory

MIT Latest News - Tue, 09/30/2025 - 8:00am

A Saab 340 aircraft recently became a permanent fixture of the fleet at the MIT Lincoln Laboratory Flight Test Facility, which supports R&D programs across the lab. 

Over the past five years, the facility leased and operated the twin-engine turboprop, once used commercially for the regional transport of passengers and cargo. During this time, staff modified the aircraft with a suite of radar, sensing, and communications capabilities. Transitioning the aircraft from a leased to a government-owned asset retains its capabilities for present and future R&D in support of national security and reduces costs for Lincoln Laboratory sponsors. 

With the acquisition of the Saab, the Flight Test Facility currently maintains five government-owned aircraft — including three Gulfstream IVs and a Cessna 206 — as well as a leased Twin Otter, all housed on Hanscom Air Force Base, just over a mile from the laboratory's main campus.

"Of all our aircraft, the Saab is the most multi-mission-capable," says David Culbertson, manager of the Flight Test Facility. "It's highly versatile and adaptable, like a Swiss Army knife. Researchers from across the laboratory have conducted flight tests on the Saab to develop all kinds of technologies for national security."

For example, the Saab was modified to host the Airborne Radar Testbed (ARTB), a high-performance radar system based on a computer-controlled array of antennas that can be electronically steered (instead of physically moved) in different directions. With the ARTB, researchers have matured innovative radio-frequency technology; prototyped advanced system concepts; and demonstrated concepts of operation for intelligence, surveillance, and reconnaissance (ISR) missions. With its open-architecture design and compliance with open standards, the ARTB can easily be reconfigured to suit specific R&D needs.

"The Saab has enabled us to rapidly prototype and mature the complex system-of-systems solutions needed to realize critical warfighter capabilities," says Ramu Bhagavatula, an assistant leader of the laboratory's Embedded and Open Systems Group. "Recently, the Saab participated in a major national exercise as a surrogate multi-INT [intelligence] ISR platform. We demonstrated machine-to-machine cueing of our multi-INT payload to automatically recognize targets designated by an operational U.S. Air Force platform. The Saab's flexibility was key to integrating diverse technologies to develop this important capability."

In anticipation of the expiration of the Saab's lease, the Flight Test Facility and Financial Services Department conducted an extensive analysis of alternatives. Comparing the operational effectiveness, suitability, and life-cycle cost of various options, this analysis determined that the optimal solution for the laboratory and the government was to purchase the aircraft.

"Having the Saab in our permanent inventory allows research groups from across the laboratory to continuously leverage each other's test beds and expertise," says Linda McCabe, a project manager in the laboratory's Communication Networks and Analysis Group. "In addition, we can invest in long-term infrastructure updates that will benefit a wide range of users. For instance, my group helped obtain authorizations from various agencies to equip the Saab with Link 16, a secure communications network used by NATO and its allies to share tactical information."

The Saab acquisition is part of a larger recapitalization effort at the Flight Test Facility to support emerging technology development for years to come. This 10-year effort, slated for completion in 2026, is retiring aging, obsolete aircraft and replacing them with newer platforms that will be more cost-effective to maintain, easier to integrate rapidly prototyped systems into, and able to operate under expanded flight envelopes (the performance limits within which an aircraft can safely fly, defined by parameters such as speed, altitude, and maneuverability).

Details of a Scam

Schneier on Security - Tue, 09/30/2025 - 7:06am

Longtime Crypto-Gram readers know that I collect personal experiences of people being scammed. Here’s an almost:

Then he added, “Here at Chase, we’ll never ask for your personal information or passwords.” On the contrary, he gave me more information—two “cancellation codes” and a long case number with four letters and 10 digits.

That’s when he offered to transfer me to his supervisor. That simple phrase, familiar from countless customer-service calls, draped a cloak of corporate competence over this unfolding drama. His supervisor. I mean, would a scammer have a supervisor?...

Shutdown threatens to delay Zeldin’s climate rule rollback

ClimateWire News - Tue, 09/30/2025 - 6:19am
The EPA administrator's aggressive timeline would be undermined by the agency shuttering for a prolonged period.

Why Trump’s coal revival may be short-lived

ClimateWire News - Tue, 09/30/2025 - 6:18am
The administration's plan to boost coal may provide modest relief to existing plants but likely won't prompt new investment, analysts say.

Maine wins early victory in climate lawsuit against oil companies

ClimateWire News - Tue, 09/30/2025 - 6:17am
A federal judge rebuffed the fossil fuel industry's bid to move the case to a more favorable court.

Dems: White House skirted law to cut NASA’s budget

ClimateWire News - Tue, 09/30/2025 - 6:16am
The Trump administration has sought to purge climate programs from NASA and other federal agencies.

DOE: No ban on ‘climate change’ or ‘emissions’ in communications

ClimateWire News - Tue, 09/30/2025 - 6:15am
The department insists it's not blocking staff from using certain terms, even as the administration moves to gut climate regs and dispute the science.

Oregon legislators raise gas tax, add EV fee

ClimateWire News - Tue, 09/30/2025 - 6:15am
Democrats criticized the bill for not including subsidies for e-bikes or electric vehicles.

Red states probe tech companies’ renewable energy claims

ClimateWire News - Tue, 09/30/2025 - 6:14am
The states allege that tech giants like Meta and Amazon are misleading consumers about their energy use.

Florida cities and counties sue over sweeping hurricane emergency law

ClimateWire News - Tue, 09/30/2025 - 6:13am
The lawsuit filed in circuit court in Tallahassee asserts that the new law is “the largest incursion into local home rule authority” since 1968.

Nations rethink plans for Brazil climate summit as costs soar

ClimateWire News - Tue, 09/30/2025 - 6:11am
Accommodations in Belém are scarce. Sky-high prices have leaders from developing countries considering scaling back their presence.

Suriname pledges to shield 90% of its forests

ClimateWire News - Tue, 09/30/2025 - 6:10am
Scientists say Suriname is one of only three countries worldwide that absorb more carbon dioxide than they emit.

Deadly tropical storm from former typhoon rips through Vietnam

ClimateWire News - Tue, 09/30/2025 - 6:09am
Nine of the 12 reported deaths occurred in the scenic province of Ninh Binh, where strong winds collapsed houses.

MIT joins in constructing the Giant Magellan Telescope

MIT Latest News - Tue, 09/30/2025 - 6:00am

The following article is adapted from a joint press release issued today by MIT and the Giant Magellan Telescope.

MIT is lending its support to the Giant Magellan Telescope, joining the international consortium to advance the $2.6 billion observatory in Chile. The Institute’s participation, enabled by a transformational gift from philanthropists Phillip (Terry) Ragon ’72 and Susan Ragon, adds to the momentum to construct the Giant Magellan Telescope, whose 25.4-meter aperture will have five times the light-collecting area and up to 200 times the power of existing observatories.

“As philanthropists, Terry and Susan have an unerring instinct for finding the big levers: those interventions that truly transform the scientific landscape,” says MIT President Sally Kornbluth. “We saw this with their founding of the Ragon Institute, which pursues daring approaches to harnessing the immune system to prevent and cure human diseases. With today’s landmark gift, the Ragons enable an equally lofty mission to better understand the universe — and we could not be more grateful for their visionary support."

MIT will be the 16th member of the international consortium advancing the Giant Magellan Telescope and the 10th participant based in the United States. Together, the consortium has invested $1 billion in the observatory — the largest-ever private investment in ground-based astronomy. The Giant Magellan Telescope is already 40 percent under construction, with major components being designed and manufactured across 36 U.S. states.

“MIT is honored to join the consortium and participate in this exceptional scientific endeavor,” says Ian A. Waitz, MIT’s vice president for research. “The Giant Magellan Telescope will bring tremendous new capabilities to MIT astronomy and to U.S. leadership in fundamental science. The construction of this uniquely powerful telescope represents a vital private and public investment in scientific excellence for decades to come.”

MIT brings to the consortium powerful scientific capabilities and a legacy of astronomical excellence. MIT’s departments of Physics and of Earth, Atmospheric and Planetary Sciences, and the MIT Kavli Institute for Astrophysics and Space Research, are internationally recognized for research in exoplanets, cosmology, and environments of extreme gravity, such as black holes and compact binary stars. MIT’s involvement will strengthen the Giant Magellan Telescope’s unique capabilities in high-resolution spectroscopy, adaptive optics, and the search for life beyond Earth. It also deepens a long-standing scientific relationship: MIT is already a partner in the existing twin Magellan Telescopes at Las Campanas Observatory in Chile — one of the most scientifically valuable observing sites on Earth, and the same site where the Giant Magellan Telescope is now under construction.

“Since Galileo’s first spyglass, the world’s largest telescope has doubled in aperture every 40 to 50 years,” says Robert A. Simcoe, director of the MIT Kavli Institute and the Francis L. Friedman Professor of Physics. “Each generation’s leading instruments have resolved important scientific questions of the day and then surprised their builders with new discoveries not yet even imagined, helping humans understand our place in the universe. Together with the Giant Magellan Telescope, MIT is helping to realize our generation’s contribution to this lineage, consistent with our mission to advance the frontier of fundamental science by undertaking the most audacious and advanced engineering challenges.”

Contributing to the national strategy

MIT’s support comes at a pivotal time for the observatory. In June 2025, the National Science Foundation (NSF) advanced the Giant Magellan Telescope into its Final Design Phase, one of the final steps before it becomes eligible for federal construction funding. To demonstrate readiness and a strong commitment to U.S. leadership, the consortium offered to privately fund this phase, which is traditionally supported by the NSF.

MIT’s investment is an integral part of the national strategy to secure U.S. access to the next generation of research facilities known as “extremely large telescopes.” The Giant Magellan Telescope is a core partner in the U.S. Extremely Large Telescope Program, the nation’s top priority in astronomy. The National Academies’ Astro2020 Decadal Survey called the program “absolutely essential if the United States is to maintain a position as a leader in ground-based astronomy.” This long-term strategy also includes the recently commissioned Vera C. Rubin Observatory in Chile. Rubin is scanning the sky to detect rare, fast-changing cosmic events, while the Giant Magellan Telescope will provide the sensitivity, resolution, and spectroscopic instruments needed to study them in detail. Together, these Southern Hemisphere observatories will give U.S. scientists the tools they need to lead 21st-century astrophysics.

“Without direct access to the Giant Magellan Telescope, the U.S. risks falling behind in fundamental astronomy, as Rubin’s most transformational discoveries will be utilized by other nations with access to their own ‘extremely large telescopes’ under development,” says Walter Massey, board chair of the Giant Magellan Telescope.

MIT’s participation brings the United States a step closer to completing the promise of this powerful new observatory on a globally competitive timeline. With federal construction funding, it is expected that the observatory could reach 90 percent completion in less than two years and become operational by the 2030s.

“MIT brings critical expertise and momentum at a time when global leadership in astronomy hangs in the balance,” says Robert Shelton, president of the Giant Magellan Telescope. “With MIT, we are not just adding a partner; we are accelerating a shared vision for the future and reinforcing the United States’ position at the forefront of science.”

Other members of the Giant Magellan Telescope consortium include the University of Arizona, Carnegie Institution for Science, The University of Texas at Austin, Korea Astronomy and Space Science Institute, University of Chicago, São Paulo Research Foundation (FAPESP), Texas A&M University, Northwestern University, Harvard University, Astronomy Australia Ltd., Australian National University, Smithsonian Institution, Weizmann Institute of Science, Academia Sinica Institute of Astronomy and Astrophysics, and Arizona State University.

A boon for astrophysics research and education

Access to the world’s best optical telescopes is a critical resource for MIT researchers. More than 150 individual science programs at MIT have relied on major astronomical observatories in the past three years, engaging faculty, researchers, and students in investigations into the marvels of the universe. Recent research projects have included chemical studies of the universe’s oldest stars, led by Professor Anna Frebel; spectroscopy of stars shredded by dormant black holes, led by Professor Erin Kara; and measurements of a white dwarf teetering on the precipice of a black hole, led by Professor Kevin Burdge. 

“Over many decades, researchers at the MIT Kavli Institute have used unparalleled instruments to discover previously undetected cosmic phenomena from both ground-based observations and spaceflight missions,” says Nergis Mavalvala, dean of the MIT School of Science and the Curtis (1963) and Kathleen Marble Professor of Astrophysics. “I have no doubt our brilliant colleagues will carry on that tradition with the Giant Magellan Telescope, and I can’t wait to see what they will discover next.”

The Giant Magellan Telescope will also provide a platform for advanced R&D in remote sensing, creating opportunities to build custom infrared and optical spectrometers and high-speed imagers to further study our universe.

“One cannot have a leading physics program without a leading astrophysics program. Access to time on the Giant Magellan Telescope will ensure that future generations of MIT researchers will continue to work at the forefront of astrophysical discovery for decades to come,” says Deepto Chakrabarty, head of the MIT Department of Physics, the William A. M. Burden Professor in Astrophysics, and principal investigator at the MIT Kavli Institute. “Our institutional access will help attract and retain top researchers in astrophysics, planetary science, and advanced optics, and will give our PhD students and postdocs unrivaled educational opportunities.”

Protecting Access to the Law—and Beneficial Uses of AI

EFF: Updates - Tue, 09/30/2025 - 12:26am

As the first copyright cases concerning AI reach appeals courts, EFF wants to protect important, beneficial uses of this technology—including AI for legal research. That’s why we weighed in on the long-running case of Thomson Reuters v. ROSS Intelligence. This case raises at least two important issues: the use of (possibly) copyrighted material to train a machine learning AI system, and public access to legal texts.  

ROSS Intelligence was a legal research startup that built an AI-based tool for locating judges’ written opinions based on natural language queries—a competitor to ubiquitous legal research platforms like Lexis and Thomson Reuters’ Westlaw. To build its tool, ROSS hired another firm to read through thousands of the “West headnotes” that Thomson Reuters adds to the legal decisions it publishes, paraphrasing the individual legal conclusions (what lawyers call “holdings”) that the headnotes identified. ROSS used those paraphrases to train its tool. Importantly, the ROSS tool didn’t output any West headnotes, or even the paraphrases of those headnotes—it simply directed the user to the original judges’ decisions. Still, Thomson sued ROSS for copyright infringement, arguing that using the headnotes without permission was illegal.  

Early decisions in the suit were encouraging. EFF wrote about how the court allowed ROSS to bring an antitrust counterclaim against Thomson Reuters, letting them try to prove that Thomson was abusing monopoly power. And the trial judge initially ruled that ROSS’s use of the West headnotes was fair use under copyright law. 

The case then took turns for the worse. ROSS was unable to prove its antitrust claim. The trial judge issued a new opinion reversing his earlier decision and finding that ROSS’s use was not fair but rather infringed Thomson’s copyrights. And in the meantime, ROSS had gone out of business (though it continues to defend itself in court).  

The court’s new decision on copyright was particularly worrisome. It ruled that West headnotes—a few lines of text copying or summarizing a single legal conclusion from a judge’s written opinion—could be copyrighted, and that using them to train the ROSS tool was not fair use, in part because ROSS was a competitor to Thomson Reuters. And the court rejected ROSS’s attempt to avoid any illegal copying by using a “clean room” procedure often used in software development. The decision also threatens to limit the public’s access to legal texts. 

EFF weighed in with an amicus brief joined by the American Library Association, the Association of Research Libraries, the Internet Archive, Public Knowledge, and Public.Resource.Org. We argued that West headnotes are not copyrightable in the first place, since they simply restate individual points from judges’ opinions with no meaningful creative contributions. And even if copyright does attach to the headnotes, we argued, the source material is entirely factual statements about what the law is, and West’s contribution was minimal, so fair use should have tipped in ROSS’s favor. The trial judge had found that the factual nature of the headnotes favored ROSS, but dismissed this factor as unimportant, effectively writing it out of the law. 

This case is one of the first to touch on copyright and AI, and is likely to influence many of the other cases that are already pending (with more being filed all the time). That’s why we’re trying to help the appeals court get this one right. The law should encourage the creation of AI tools to digest and identify facts for use by researchers, including facts about the law. 

Synchronization of global peak river discharge since the 1980s

Nature Climate Change - Tue, 09/30/2025 - 12:00am

Nature Climate Change, Published online: 30 September 2025; doi:10.1038/s41558-025-02427-6

River floods that occur simultaneously in multiple locations can lead to higher damages than individual events. Here, the authors show that the likelihood of concurrent high river discharge has increased over recent decades.

Responding to the climate impact of generative AI

MIT Latest News - Tue, 09/30/2025 - 12:00am

In part 2 of our two-part series on generative artificial intelligence’s environmental impacts, MIT News explores some of the ways experts are working to reduce the technology’s carbon footprint.

The energy demands of generative AI are expected to continue increasing dramatically over the next decade.

For instance, an April 2025 report from the International Energy Agency predicts that the global electricity demand from data centers, which house the computing infrastructure to train and deploy AI models, will more than double by 2030, to around 945 terawatt-hours. While not all operations performed in a data center are AI-related, this total amount is slightly more than the energy consumption of Japan.

Moreover, an August 2025 analysis from Goldman Sachs Research forecasts that about 60 percent of the increasing electricity demands from data centers will be met by burning fossil fuels, increasing global carbon emissions by about 220 million tons. In comparison, driving a gas-powered car for 5,000 miles produces about 1 ton of carbon dioxide.
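For a rough sense of scale, multiplying the article’s two figures gives the equivalent driving distance; the snippet below is just that back-of-the-envelope arithmetic, using the numbers quoted above.

```python
# Back-of-the-envelope comparison using the figures quoted in the article (illustrative only).
added_emissions_tons = 220_000_000   # projected additional CO2 from data centers, in tons
miles_per_ton = 5_000                # roughly 1 ton of CO2 per 5,000 miles of gas-powered driving
equivalent_miles = added_emissions_tons * miles_per_ton
print(f"Equivalent to roughly {equivalent_miles:.1e} miles of gas-powered driving")
```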

These statistics are staggering, but at the same time, scientists and engineers at MIT and around the world are studying innovations and interventions to mitigate AI’s ballooning carbon footprint, from boosting the efficiency of algorithms to rethinking the design of data centers.

Considering carbon emissions

Talk of reducing generative AI’s carbon footprint is typically centered on “operational carbon” — the emissions produced by the powerful processors, known as GPUs, inside a data center. It often ignores “embodied carbon,” the emissions created by building the data center in the first place, says Vijay Gadepally, senior scientist at MIT Lincoln Laboratory, who leads research projects in the Lincoln Laboratory Supercomputing Center.

Constructing and retrofitting a data center, built from tons of steel and concrete and filled with air conditioning units, computing hardware, and miles of cable, consumes a huge amount of carbon. In fact, the environmental impact of building data centers is one reason companies like Meta and Google are exploring more sustainable building materials. (Cost is another factor.)

Plus, data centers are enormous buildings — the world’s largest, the China Telecom-Inner Mongolia Information Park, covers roughly 10 million square feet — with about 10 to 50 times the energy density of a normal office building, Gadepally adds. 

“The operational side is only part of the story. Some things we are working on to reduce operational emissions may lend themselves to reducing embodied carbon, too, but we need to do more on that front in the future,” he says.

Reducing operational carbon emissions

When it comes to reducing operational carbon emissions of AI data centers, there are many parallels with home energy-saving measures. For one, we can simply turn down the lights.

“Even if you have the worst lightbulbs in your house from an efficiency standpoint, turning them off or dimming them will always use less energy than leaving them running at full blast,” Gadepally says.

In the same fashion, research from the Supercomputing Center has shown that “turning down” the GPUs in a data center so they consume about three-tenths the energy has minimal impacts on the performance of AI models, while also making the hardware easier to cool.
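On NVIDIA hardware, one common way to “turn down” a GPU is to lower its power limit through the nvidia-smi utility. The sketch below outlines that approach; the device index and the 250-watt cap are placeholder values for illustration, not settings from the Lincoln Laboratory work.

```python
import subprocess

def set_gpu_power_limit(device_index: int, watts: int) -> None:
    """Cap a GPU's power draw using nvidia-smi (requires administrative privileges)."""
    subprocess.run(
        ["nvidia-smi", "-i", str(device_index), "--power-limit", str(watts)],
        check=True,
    )

# Placeholder values; valid power limits depend on the specific GPU model.
set_gpu_power_limit(device_index=0, watts=250)
```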

Another strategy is to use less energy-intensive computing hardware.

Demanding generative AI workloads, such as training new reasoning models like GPT-5, usually need many GPUs working simultaneously. The Goldman Sachs analysis estimates that a state-of-the-art system could soon have as many as 576 connected GPUs operating at once.

But engineers can sometimes achieve similar results by reducing the precision of computing hardware, perhaps by switching to less powerful processors that have been tuned to handle a specific AI workload.

There are also measures that boost the efficiency of training power-hungry deep-learning models before they are deployed.

Gadepally’s group found that about half the electricity used for training an AI model is spent to get the last 2 or 3 percentage points in accuracy. Stopping the training process early can save a lot of that energy.

“There might be cases where 70 percent accuracy is good enough for one particular application, like a recommender system for e-commerce,” he says.
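A simple way to act on that observation is an early-stopping rule that halts training once a “good enough” accuracy is reached or progress stalls. The framework-agnostic sketch below illustrates the idea; the 70 percent target and the patience setting are arbitrary examples, not the group’s actual thresholds.

```python
class EarlyStopper:
    """Stop training once validation accuracy reaches a target or stops improving."""

    def __init__(self, target_accuracy: float = 0.70, patience: int = 3, min_delta: float = 0.001):
        self.target_accuracy = target_accuracy
        self.patience = patience
        self.min_delta = min_delta
        self.best = 0.0
        self.stale_epochs = 0

    def should_stop(self, val_accuracy: float) -> bool:
        if val_accuracy >= self.target_accuracy:
            return True  # good enough for the application; skip the costly final accuracy points
        if val_accuracy > self.best + self.min_delta:
            self.best = val_accuracy
            self.stale_epochs = 0
        else:
            self.stale_epochs += 1
        return self.stale_epochs >= self.patience

# Example: feed in per-epoch validation accuracies from any training loop.
stopper = EarlyStopper()
for epoch, accuracy in enumerate([0.52, 0.61, 0.66, 0.68, 0.69, 0.695, 0.705]):
    if stopper.should_stop(accuracy):
        print(f"Stopping after epoch {epoch} at validation accuracy {accuracy:.3f}")
        break
```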

Researchers can also take advantage of efficiency-boosting measures.

For instance, a postdoc in the Supercomputing Center realized the group might run a thousand simulations during the training process to pick the two or three best AI models for their project.

By building a tool that allowed them to avoid about 80 percent of those wasted computing cycles, they dramatically reduced the energy demands of training with no reduction in model accuracy, Gadepally says.

Leveraging efficiency improvements

Constant innovation in computing hardware, such as denser arrays of transistors on semiconductor chips, is still enabling dramatic improvements in the energy efficiency of AI models.

Even though energy efficiency improvements have been slowing for most chips since about 2005, the amount of computation that GPUs can do per joule of energy has been improving by 50 to 60 percent each year, says Neil Thompson, director of the FutureTech Research Project at MIT’s Computer Science and Artificial Intelligence Laboratory and a principal investigator at MIT’s Initiative on the Digital Economy.

“The still-ongoing ‘Moore’s Law’ trend of getting more and more transistors on chip still matters for a lot of these AI systems, since running operations in parallel is still very valuable for improving efficiency,” says Thompson.

Even more significant, his group’s research indicates that efficiency gains from new model architectures that can solve complex problems faster, consuming less energy to achieve the same or better results, are doubling every eight or nine months.

Thompson coined the term “negaflop” to describe this effect. The same way a “negawatt” represents electricity saved due to energy-saving measures, a “negaflop” is a computing operation that doesn’t need to be performed due to algorithmic improvements.

These could be things like “pruning” away unnecessary components of a neural network or employing compression techniques that enable users to do more with less computation.
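As a schematic illustration of pruning, the sketch below zeroes out the smallest-magnitude weights in a toy layer. Production systems use more sophisticated criteria, structured sparsity, and retraining, so this is only meant to convey the basic idea.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the fraction of weights with the smallest absolute values."""
    threshold = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) < threshold, 0.0, weights)

rng = np.random.default_rng(0)
layer = rng.normal(size=(4, 4))   # stand-in for one layer of a neural network
pruned = magnitude_prune(layer, sparsity=0.5)
print(f"Nonzero weights before: {np.count_nonzero(layer)}, after: {np.count_nonzero(pruned)}")
```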

“If you need to use a really powerful model today to complete your task, in just a few years, you might be able to use a significantly smaller model to do the same thing, which would carry much less environmental burden. Making these models more efficient is the single-most important thing you can do to reduce the environmental costs of AI,” Thompson says.

Maximizing energy savings

While reducing the overall energy use of AI algorithms and computing hardware will cut greenhouse gas emissions, not all energy is the same, Gadepally adds.

“The amount of carbon emissions in 1 kilowatt hour varies quite significantly, even just during the day, as well as over the month and year,” he says.

Engineers can take advantage of these variations by leveraging the flexibility of AI workloads and data center operations to maximize emissions reductions. For instance, some generative AI workloads don’t need to be performed in their entirety at the same time.

Splitting computing operations so some are performed later, when more of the electricity fed into the grid is from renewable sources like solar and wind, can go a long way toward reducing a data center’s carbon footprint, says Deepjyoti Deka, a research scientist in the MIT Energy Initiative.
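A minimal sketch of such carbon-aware scheduling appears below: given an hourly forecast of grid carbon intensity, a deferrable job is simply queued for the cleanest hour within its allowed window. The forecast values are invented for illustration and are not drawn from the MIT work.

```python
def pick_greenest_hour(carbon_forecast: dict[int, float], deadline_hour: int) -> int:
    """Choose the hour, no later than the deadline, with the lowest forecast carbon intensity."""
    candidates = {hour: intensity for hour, intensity in carbon_forecast.items() if hour <= deadline_hour}
    return min(candidates, key=candidates.get)

# Hypothetical forecast: hour of day -> grams of CO2 per kilowatt-hour.
forecast = {0: 420, 4: 390, 8: 310, 12: 180, 16: 210, 20: 400}
start_hour = pick_greenest_hour(forecast, deadline_hour=16)
print(f"Schedule the deferrable training job to start at hour {start_hour}")
```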

Deka and his team are also studying “smarter” data centers where the AI workloads of multiple companies using the same computing equipment are flexibly adjusted to improve energy efficiency.

“By looking at the system as a whole, our hope is to minimize energy use as well as dependence on fossil fuels, while still maintaining reliability standards for AI companies and users,” Deka says.

He and others at MITEI are building a flexibility model of a data center that considers the differing energy demands of training a deep-learning model versus deploying that model. Their hope is to uncover the best strategies for scheduling and streamlining computing operations to improve energy efficiency.

The researchers are also exploring the use of long-duration energy storage units at data centers, which store excess energy for times when it is needed.

With these systems in place, a data center could use stored energy that was generated by renewable sources during a high-demand period, or avoid the use of diesel backup generators if there are fluctuations in the grid.

“Long-duration energy storage could be a game-changer here because we can design operations that really change the emission mix of the system to rely more on renewable energy,” Deka says.

In addition, researchers at MIT and Princeton University are developing a software tool for investment planning in the power sector, called GenX, which could be used to help companies determine the ideal place to locate a data center to minimize environmental impacts and costs.

Location can have a big impact on reducing a data center’s carbon footprint. For instance, Meta operates a data center in Lulea, a city on the coast of northern Sweden where cooler temperatures reduce the amount of electricity needed to cool computing hardware.

Thinking farther outside the box (way farther), some governments are even exploring the construction of data centers on the moon where they could potentially be operated with nearly all renewable energy.

AI-based solutions

Currently, the expansion of renewable energy generation here on Earth isn’t keeping pace with the rapid growth of AI, which is one major roadblock to reducing its carbon footprint, says Jennifer Turliuk MBA ’25, a short-term lecturer, former Sloan Fellow, and former practice leader of climate and energy AI at the Martin Trust Center for MIT Entrepreneurship.

The local, state, and federal review processes required for new renewable energy projects can take years.

Researchers at MIT and elsewhere are exploring the use of AI to speed up the process of connecting new renewable energy systems to the power grid.

For instance, a generative AI model could streamline interconnection studies that determine how a new project will impact the power grid, a step that often takes years to complete.

And when it comes to accelerating the development and implementation of clean energy technologies, AI could play a major role.

“Machine learning is great for tackling complex situations, and the electrical grid is said to be one of the largest and most complex machines in the world,” Turliuk adds.

For instance, AI could help optimize the prediction of solar and wind energy generation or identify ideal locations for new facilities.

It could also be used to perform predictive maintenance and fault detection for solar panels or other green energy infrastructure, or to monitor the capacity of transmission wires to maximize efficiency.

By helping researchers gather and analyze huge amounts of data, AI could also inform targeted policy interventions aimed at getting the biggest “bang for the buck” from areas such as renewable energy, Turliuk says.

To help policymakers, scientists, and enterprises consider the multifaceted costs and benefits of AI systems, she and her collaborators developed the Net Climate Impact Score.

The score is a framework that can be used to help determine the net climate impact of AI projects, considering emissions and other environmental costs along with potential environmental benefits in the future.

At the end of the day, the most effective solutions will likely result from collaborations among companies, regulators, and researchers, with academia leading the way, Turliuk adds.

“Every day counts. We are on a path where the effects of climate change won’t be fully known until it is too late to do anything about it. This is a once-in-a-lifetime opportunity to innovate and make AI systems less carbon-intense,” she says.

A beacon of light

MIT Latest News - Mon, 09/29/2025 - 4:00pm

Placing a lit candle in a window to welcome friends and strangers is an old Irish tradition that took on greater significance when Mary Robinson was elected president of Ireland in 1990. At the time, Robinson placed a lamp in Áras an Uachtaráin — the official residence of Ireland’s presidents — noting that the Irish diaspora and all others are always welcome in Ireland. Decades later, a lit lamp remains in a window in Áras an Uachtaráin.

The symbolism of Robinson’s lamp was shared by Hashim Sarkis, dean of the MIT School of Architecture and Planning (SA+P), at the school’s graduation ceremony in May, where Robinson addressed the class of 2025. To replicate the generous intentions of Robinson’s lamp and commemorate her visit to MIT, Sarkis commissioned a unique lantern as a gift for Robinson. He commissioned an identical one for his office, which is in the front portico of MIT at 77 Massachusetts Ave.

“The lamp will welcome all citizens of the world to MIT,” says Sarkis.

No ordinary lantern

The bespoke lantern was created by Marcelo Coelho SM ’08, PhD ’12, director of the Design Intelligence Lab and associate professor of the practice in the Department of Architecture.

One of several projects in the Geolectric research effort at the Design Intelligence Lab, the lantern showcases the use of geopolymers as a sustainable material alternative for embedded computers and consumer electronics.

“The materials that we use to make computers have a negative impact on climate, so we’re rethinking how we make products with embedded electronics — such as a lamp or lantern — from a climate perspective,” says Coelho.

Consumer electronics rely on materials that are high in carbon emissions and difficult to recycle. As the demand for embedded computing increases, so too does the need for alternative materials that have a reduced environmental impact while supporting electronic functionality.

The Geolectric lantern advances the formulation and application of geopolymers — a class of inorganic materials that form covalently bonded, non-crystalline networks. Unlike traditional ceramics, geopolymers do not require high-temperature firing, allowing electronic components to be embedded seamlessly during production.

Geopolymers are similar to ceramics, but have a lower carbon footprint and present a sustainable alternative for consumer electronics, product design, and architecture. The minerals Coelho uses to make the geopolymers — aluminum silicate and sodium silicate — are those regularly used to make ceramics.

“Geopolymers aren’t particularly new, but are becoming more popular,” says Coelho. “They have high strength in both tension and compression, superior durability, fire resistance, and thermal insulation. Compared to concrete, geopolymers don’t release carbon dioxide. Compared to ceramics, you don’t have to worry about firing them. What’s even more interesting is that they can be made from industrial byproducts and waste materials, contributing to a circular economy and reducing waste.”

The lantern is embedded with custom electronics that serve as a proximity and touch sensor. When a hand is placed over the top, light shines down the glass tubes.

The timeless design of the Geolectric lantern — minimalist, composed of natural materials — belies its future-forward function. Coelho’s academic background is in fine arts and computer science. Much of his work, he says, “bridges these two worlds.”

Working at the Design Intelligence Lab with Coelho on the lanterns are Jacob Payne, a graduate architecture student, and Jean-Baptiste Labrune, a research affiliate.

A light for MIT

A few weeks before commencement, Sarkis saw the Geolectric lantern at Palazzo Diedo Berggruen Arts and Culture in Venice, Italy. The exhibition, a collateral event of the Venice Biennale’s 19th International Architecture Exhibition, featured the work of 40 MIT architecture faculty.

The sustainability feature of Geolectric is the key reason Sarkis regarded the lantern as the perfect gift for Robinson. After her career in politics, Robinson founded the Mary Robinson Foundation — Climate Justice, an international center addressing the impacts of climate change on marginalized communities.

The third iteration of Geolectric for Sarkis’ office is currently underway. While the lantern was a technical prototype and an opportunity to showcase his lab’s research, Coelho — an immigrant from Brazil — was profoundly touched by how Sarkis created the perfect symbolism to both embody the welcoming spirit of the school and honor President Robinson.

“When the world feels most fragile, we need to urgently find sustainable and resilient solutions for our built environment. It’s in the darkest times when we need light the most,” says Coelho. 
