MIT Latest News

MIT News is dedicated to communicating to the media and the public the news and achievements of the students, faculty, staff and the greater MIT community.

The unique, mathematical shortcuts language models use to predict dynamic scenarios

Mon, 07/21/2025 - 8:00am

Let’s say you’re reading a story, or playing a game of chess. You may not have noticed, but each step of the way, your mind kept track of how the situation (or “state of the world”) was changing. You can imagine this as a sort of running list of events that we use to update our prediction of what will happen next.

Language models like ChatGPT also track changes inside their own “mind” when finishing off a block of code or anticipating what you’ll write next. They typically make educated guesses using transformers — internal architectures that help the models understand sequential data — but the systems are sometimes incorrect because of flawed thinking patterns. Identifying and tweaking these underlying mechanisms could help make language models more reliable prognosticators, especially for more dynamic tasks like forecasting weather and financial markets.

But do these AI systems process developing situations like we do? A new paper from researchers in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and Department of Electrical Engineering and Computer Science shows that the models instead use clever mathematical shortcuts between each progressive step in a sequence, eventually making reasonable predictions. The team made this observation by going under the hood of language models, evaluating how closely they could keep track of objects that change position rapidly. Their findings show that engineers can control when language models use particular workarounds as a way to improve the systems’ predictive capabilities.

Shell games

The researchers analyzed the inner workings of these models using a clever experiment reminiscent of a classic shell game. Ever had to guess the final location of an object after it’s placed under a cup and shuffled with identical containers? The team used a similar test, where the model guessed the final arrangement of particular digits (also called a permutation). The models were given a starting sequence, such as “42135,” along with step-by-step instructions about when and where to move each digit (moving the “4” to the third position, for instance), without being shown the final result.
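
To make the setup concrete, here is a minimal sketch of that digit-shuffling task in Python. The move encoding is a hypothetical illustration rather than the paper's exact format; the point is simply that each instruction updates the arrangement, and the model has to predict the final state from the instructions alone.

```python
# Minimal sketch (hypothetical encoding) of the digit-shuffling task described above.
# Each instruction updates the arrangement; the model is asked to predict the
# final permutation from the starting sequence and the instructions alone.

def apply_move(state: str, digit: str, new_position: int) -> str:
    """Remove `digit` from the sequence and reinsert it at `new_position` (0-indexed)."""
    remaining = [d for d in state if d != digit]
    remaining.insert(new_position, digit)
    return "".join(remaining)

state = "42135"
instructions = [("4", 2), ("1", 0), ("5", 3)]  # e.g., "move the 4 to the third position," ...

for digit, position in instructions:
    state = apply_move(state, digit, position)

print(state)  # the final arrangement the model must predict
```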

In these experiments, transformer-based models gradually learned to predict the correct final arrangements. Instead of shuffling the digits based on the instructions they were given, though, the systems aggregated information between successive states (or individual steps within the sequence) and calculated the final permutation.

One go-to pattern the team observed, called the “Associative Algorithm,” essentially organizes nearby steps into groups and then calculates a final guess. You can think of this process as being structured like a tree, where the initial numerical arrangement is the “root.” As you move up the tree, adjacent steps are grouped into different branches and multiplied together. At the top of the tree is the final combination of numbers, computed by multiplying each resulting sequence on the branches together.
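
A rough sketch of that tree-structured shortcut, assuming each step is represented as a permutation of positions and that steps compose associatively, might look like this:

```python
# Rough sketch of the tree-structured ("associative") shortcut, assuming each
# step is represented as a permutation of positions. Adjacent steps are combined
# pairwise, halving the number of items at each level, like branches of a tree.

def compose(p, q):
    """Apply permutation p first, then q (both act on positions)."""
    return tuple(q[i] for i in p)

def tree_reduce(perms):
    """Combine a list of permutations level by level instead of one at a time."""
    while len(perms) > 1:
        next_level = [compose(perms[i], perms[i + 1]) for i in range(0, len(perms) - 1, 2)]
        if len(perms) % 2 == 1:  # an odd step out is carried up unchanged
            next_level.append(perms[-1])
        perms = next_level
    return perms[0]

# Three steps acting on five positions (the identity would be (0, 1, 2, 3, 4)).
steps = [(1, 0, 2, 3, 4), (0, 2, 1, 3, 4), (0, 1, 2, 4, 3)]
print(tree_reduce(steps))  # matches the result of applying the steps one at a time
```

Because composition is associative, this hierarchy gives the same answer as a step-by-step simulation, but with roughly log2(n) levels of grouping for n steps rather than n sequential updates.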

The other way language models guessed the final permutation was through a crafty mechanism called the “Parity-Associative Algorithm,” which essentially whittles down options before grouping them. It determines whether the final arrangement is the result of an even or odd number of rearrangements of individual digits. Then, the mechanism groups adjacent sequences from different steps before multiplying them, just like the Associative Algorithm.
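
The even-or-odd step can be illustrated with the standard sign-of-a-permutation calculation, sketched below; this shows only the parity idea, not the mechanism the model actually learns.

```python
# Sketch of the "parity" step only: is the final arrangement reachable in an
# even or an odd number of swaps? Counting inversions is one standard way to
# compute this; it illustrates the idea, not the model's internal mechanism.

def parity(perm):
    """Return 0 for an even permutation, 1 for an odd one, via inversion counting."""
    inversions = sum(
        1
        for i in range(len(perm))
        for j in range(i + 1, len(perm))
        if perm[i] > perm[j]
    )
    return inversions % 2

print(parity((0, 1, 2, 3, 4)))  # 0: the identity needs no swaps
print(parity((1, 0, 2, 3, 4)))  # 1: a single swap is odd
```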

“These behaviors tell us that transformers perform simulation by associative scan. Instead of following state changes step-by-step, the models organize them into hierarchies,” says MIT PhD student and CSAIL affiliate Belinda Li SM ’23, a lead author on the paper. “How do we encourage transformers to learn better state tracking? Instead of imposing that these systems form inferences about data in a human-like, sequential way, perhaps we should cater to the approaches they naturally use when tracking state changes.”

“One avenue of research has been to expand test-time computing along the depth dimension, rather than the token dimension — by increasing the number of transformer layers rather than the number of chain-of-thought tokens during test-time reasoning,” adds Li. “Our work suggests that this approach would allow transformers to build deeper reasoning trees.”

Through the looking glass

Li and her co-authors observed how the Associative and Parity-Associative algorithms worked using tools that allowed them to peer inside the “mind” of language models. 

They first used a method called “probing,” which shows what information flows through an AI system. Imagine you could look into a model’s brain to see its thoughts at a specific moment — in a similar way, the technique maps out the system’s mid-experiment predictions about the final arrangement of digits.
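
As a minimal sketch of what a probe can look like in practice, assuming hidden activations have already been collected from the model, one could fit a simple linear classifier to them; the array shapes and labels below are hypothetical placeholders (random data, so accuracy stays near chance).

```python
# Minimal sketch of a linear probe, assuming hidden activations have already
# been extracted from the model (one vector per example) along with a label for
# the intermediate state they should encode. Shapes and labels are hypothetical.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
hidden_states = rng.normal(size=(1000, 256))  # e.g., activations at one layer and position
labels = rng.integers(0, 5, size=1000)        # e.g., which digit currently sits in slot 1

X_train, X_test, y_train, y_test = train_test_split(hidden_states, labels, random_state=0)
probe = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# High held-out accuracy would suggest the model linearly encodes that piece of
# state at this layer; with the random placeholder data here, it stays near chance.
print(probe.score(X_test, y_test))
```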

A tool called “activation patching” was then used to show where the language model processes changes to a situation. It involves meddling with some of the system’s “ideas,” injecting incorrect information into certain parts of the network while keeping other parts constant, and seeing how the system will adjust its predictions.
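
A minimal sketch of that general recipe, using a toy two-layer network rather than an actual language model, is shown below; the model, inputs, and patched layer are illustrative stand-ins.

```python
# Sketch of activation patching on a toy network (not the models from the study):
# (1) run a "clean" input and cache an intermediate activation,
# (2) run a different ("corrupted") input while overwriting that activation,
# (3) check how much of the clean prediction is restored.

import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 5))
layer = model[1]  # the intermediate site we choose to patch

clean_x = torch.randn(1, 8)
corrupted_x = torch.randn(1, 8)
cache = {}

def save_hook(module, inputs, output):
    cache["clean_act"] = output.detach().clone()

def patch_hook(module, inputs, output):
    return cache["clean_act"]  # replace this layer's output with the cached clean one

handle = layer.register_forward_hook(save_hook)
clean_logits = model(clean_x)
handle.remove()

corrupted_logits = model(corrupted_x)

handle = layer.register_forward_hook(patch_hook)
patched_logits = model(corrupted_x)
handle.remove()

# Patching the full activation restores the clean prediction exactly; patching
# smaller pieces instead helps localize where the relevant information lives.
print(clean_logits.argmax().item(), corrupted_logits.argmax().item(), patched_logits.argmax().item())
```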

These tools revealed when the algorithms would make errors and when the systems “figured out” how to correctly guess the final permutations. They observed that the Associative Algorithm learned faster than the Parity-Associative Algorithm, while also performing better on longer sequences. Li attributes the latter’s difficulties with more elaborate instructions to an over-reliance on heuristics (or rules that allow us to compute a reasonable solution fast) to predict permutations.

“We’ve found that when language models use a heuristic early on in training, they’ll start to build these tricks into their mechanisms,” says Li. “However, those models tend to generalize worse than ones that don’t rely on heuristics. We found that certain pre-training objectives can deter or encourage these patterns, so in the future, we may look to design techniques that discourage models from picking up bad habits.”

The researchers note that their experiments were done on small-scale language models fine-tuned on synthetic data, but found that model size had little effect on the results. This suggests that fine-tuning larger language models, like GPT-4.1, would likely yield similar results. The team plans to examine their hypotheses more closely by testing language models of different sizes that haven’t been fine-tuned, evaluating their performance on dynamic real-world tasks such as tracking code and following how stories evolve.

Harvard University postdoc Keyon Vafa, who was not involved in the paper, says that the researchers’ findings could create opportunities to advance language models. “Many uses of large language models rely on tracking state: anything from providing recipes to writing code to keeping track of details in a conversation,” he says. “This paper makes significant progress in understanding how language models perform these tasks. This progress provides us with interesting insights into what language models are doing and offers promising new strategies for improving them.”

Li wrote the paper with MIT undergraduate student Zifan “Carl” Guo and senior author Jacob Andreas, who is an MIT associate professor of electrical engineering and computer science and CSAIL principal investigator. Their research was supported, in part, by Open Philanthropy, the MIT Quest for Intelligence, the National Science Foundation, the Clare Boothe Luce Program for Women in STEM, and a Sloan Research Fellowship.

The researchers presented their research at the International Conference on Machine Learning (ICML) this week.

What Americans actually think about taxes

Mon, 07/21/2025 - 12:00am

Doing your taxes can feel like a very complicated task. Even so, it might be less intricate than trying to make sense of what people think about taxes.

Several years ago, MIT political scientist Andrea Campbell undertook an expansive research project to understand public opinion about taxation. Her efforts have now come to fruition in a new book uncovering many complexities about attitudes toward taxes. Those complexities include a central tension: In the U.S., most people say they support the principle of progressive taxation — in which higher earners pay higher shares of their income. Yet people also say they prefer specific forms of taxes that are regressive, hitting lower- and middle-income earners relatively harder.

For instance, state sales taxes are considered regressive, since people who make less money spend a larger percentage of their incomes, meaning sales taxes eat up a larger proportion of their earnings. But a substantial portion of the public still finds them to be fair, partly because the wealthy cannot wriggle out of them.

“At an abstract or conceptual level, people say they like progressive tax systems more than flat or regressive tax systems,” Campbell says. “But when you look at public attitudes toward specific taxes, people’s views flip upside down. People say federal and state income taxes are unfair, but they say sales taxes, which are very regressive, are fair. Their attitudes on individual taxes are the opposite of what their overall commitments are.”

Now Campbell analyzes these issues in detail in her book, “Taxation and Resentment,” just published by Princeton University Press. Campbell is the Arthur and Ruth Sloan Professor of Political Science at MIT and a former head of MIT’s Department of Political Science.

Filling out the record

Campbell originally planned “Taxation and Resentment” as a strictly historically oriented look at the subject. But the absence of any one book compiling public-opinion data in this area was striking. So, she assembled data going back to the end of World War II, and even designed and ran a couple of her own public opinion surveys, which help undergird the book’s numbers.

“Political scientists write a lot about public attitudes toward spending in the United States, but not so much about attitudes toward taxes,” Campbell says. “The public-opinion record is very thin.”

The complexities of U.S. public opinion on taxes are plainly linked to the presence of numerous forms of taxes, including federal and state income taxes, sales taxes, payroll taxes, estate taxes, and capital gains taxes. The best-known, of course, is the federal income tax, whose quirks and loopholes seem to irk citizens.

“That really seizes people’s imaginations,” Campbell says. “Keeping the focus on federal income tax has been a clever strategy among those who want to cut it. People think it’s unfair because they look at all the tax breaks the rich get and think, ‘I don’t have access to those.’ Those breaks increase complexity, undermine people’s knowledge, heighten their anger, and of course are in there because they help rich people pay less. So, there ends up being a cycle.”

That same sense of unfairness does not translate to all other forms of taxation, however. Large majorities of people have supported lowering the estate tax, for example, even though the threshold at which the federal estate tax kicks in — $13.5 million — applies to very few families.

Then too, the public seems to perceive sales taxes as being fair because of the simplicity and lack of loopholes — an understandable view, but one that ignores the way that state sales taxes, as opposed to state income taxes, place a bigger burden on middle-class and lower-income workers.

“A regressive tax like a sales tax is more difficult to comprehend,” Campbell says. “We all pay the same rate, so it seems like a flat tax, but as your income goes up, the bite of that tax goes down. And that’s just very difficult for people to understand.”
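
A stylized example with made-up numbers shows the pattern Campbell describes: two households face the same rate at the register, but the tax claims a much larger share of the lower earner's income.

```python
# Stylized example (made-up numbers) of why a flat sales tax rate is regressive:
# lower-income households spend a larger share of what they earn, so the same
# rate takes a bigger bite when measured as a fraction of income.

sales_tax_rate = 0.07

households = {
    "lower-income":  {"income": 30_000,  "spending": 27_000},
    "higher-income": {"income": 300_000, "spending": 120_000},
}

for name, h in households.items():
    tax_paid = h["spending"] * sales_tax_rate
    effective_rate = tax_paid / h["income"]
    print(f"{name}: ${tax_paid:,.0f} in sales tax, {effective_rate:.1%} of income")

# Both pay the same 7% at the register, but the effective burden falls as income rises.
```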

Overall, as Campbell details, income levels do not have huge predictive value when it comes to tax attitudes. Party affiliation also has less impact than many people might suspect — Democrats and Republicans differ on taxes, though not as much, in some ways, as political independents, who often have the most anti-tax views of all.

Meanwhile, Campbell finds, white Americans with heightened concerns about the redistribution of public goods among varying demographic groups are more opposed to taxes than those who do not share those concerns. And Black and Hispanic Americans, who may wind up on the short end of regressive policies, also express significantly anti-tax views, albeit alongside more support for the state functions funded by taxation.

“There are so many factors and components of public opinion around taxes,” Campbell says. “Many political and demographic groups have their own reasons for disliking the status quo.”

How much does public opinion matter?

The research in “Taxation and Resentment” will be of high value to many kinds of scholars. However, as Campbell notes, political scientists do not have consensus about how much public opinion influences policy. Some experts contend that donors and lobbyists essentially determine policy while the larger public is ignored. But Campbell does not agree that public sentiment amounts to nothing. Consider, she says, the vigorous and successful public campaign to lower the estate tax in the first decade of the 2000s.

“If public opinion doesn’t matter, then why were there these PR campaigns to try to convince people the estate tax was bad for small businesses, farmers, and other groups?” Campbell asks. “Clearly it’s because public opinion does matter. It’s far easier to get these policies implemented if the public is on your side than if the public is in opposition. Public opinion is not the only factor in policymaking, but it’s a contributing factor.”

To be sure, even in the formation of public opinion, there are complexities and nuance, as Campbell notes in the book. A system of progressive taxation means the people taxed at the highest rate are the most motivated to oppose the system — and may heavily influence public opinion, in a top-down manner.

Scholars in the field have praised “Taxation and Resentment.” Martin Gilens, chair of the Department of Public Policy at the University of California at Los Angeles, has called it an “important and very welcome addition to the literature on public attitudes about public policies … with rich and often unexpected findings.” Vanessa Williamson, a senior fellow at the Brookings Institution, has said the book is “essential reading for anyone who wants to understand what Americans actually think about taxes. The scope of the data Campbell brings to bear on this question is unparalleled, and the depth of her analysis of public opinion across time and demography is a monumental achievement.”

For her part, Campbell says she hopes people in a variety of groups will read the book — including policymakers, scholars in multiple fields, and students. Certainly, she thinks, after studying the issue, more people could stand to know more about taxes.

“The tax system is complex,” Campbell says, “and people don’t always understand their own stakes. There is often a fog surrounding taxes.”

MIT launches a “moonshot for menstruation science”

Fri, 07/18/2025 - 9:50am

The MIT Health and Life Sciences Collaborative (MIT HEALS) has announced the establishment of the Fairbairn Menstruation Science Fund, supporting a bold, high-impact initiative designed to revolutionize women’s health research.

Established through a gift from Emily and Malcolm Fairbairn, the fund will advance groundbreaking research on the function of the human uterus and its impact on sex-based differences in human immunology that contribute to gynecological disorders such as endometriosis, as well as other chronic systemic inflammatory diseases that disproportionately affect women, such as Lyme disease and lupus. The Fairbairns, based in the San Francisco Bay Area, have committed $10 million, with a call to action for an additional $10 million in matching funds.

“I’m deeply grateful to Emily and Malcolm Fairbairn for their visionary support of menstruation science at MIT. For too long, this area of research has lacked broad scientific investment and visibility, despite its profound impact on the health and lives of over half the population,” says Anantha P. Chandrakasan, MIT provost who was chief innovation and strategy officer and dean of engineering at the time of the gift, and Vannevar Bush Professor of Electrical Engineering and Computer Science.

Chandrakasan adds: “Thanks to groundbreaking work from researchers like Professor Linda Griffith and her team at the MIT Center for Gynepathology Research (CGR), we have an opportunity to advance our understanding and address critical challenges in menstruation science.”

Griffith, professor of biological and mechanical engineering and director of CGR, says the Fairbairn Fund will permit the illumination of “the enormous sex-based differences in human immunity” and advance next-generation drug-discovery technologies.

One main thrust of the new initiative will further the development of “organs on chips,” living models of patients. Using living cells or tissues, such devices allow researchers to replicate and experiment with interactions that can occur in the body. Griffith and an interdisciplinary team of researchers have engineered a powerful microfluidic platform that supports chips that foster growth of tissues complete with blood vessels and circulating immune cells. The technology was developed for building endometriosis lesions from individual patients with known clinical characteristics. The chip allows the researchers to do preclinical testing of drugs on the human patient-derived endometriosis model rather than on laboratory animals, which often do not menstruate naturally and whose immune systems function differently from those of humans.

The Fairbairn Fund will build the infrastructure for a “living patient avatar” facility to develop such physiomimetic models for all kinds of health conditions.

“We acknowledge that there are some big-picture phenomenological questions that one can study in animals, but human immunology is so very different,” Griffith says. “Pharma and biotech realize that we need living models of patients and the computational models of carefully curated patient data if we are to move into greater success in clinical trials.”

The computational models of patient data that Griffith refers to are a key element in choosing how to design the patient avatars and determine which therapeutics to test on them. For instance, by using systems biology analysis of inflammation in patient abdominal fluid, Griffith and her collaborators identified an intracellular enzyme called jun kinase (JNK). They are now working with a biotech company to test specific inhibitors of JNK in their model. Griffith has also collaborated with Michal “Mikki” Tal, a principal scientist in MIT’s Department of Biological Engineering, on investigating a possible link between prior infection, such as by the Lyme-causing bacterium Borrelia, and a number of chronic inflammatory diseases in women. Automating assays of patient samples for higher throughput could systematically speed the generation of hypotheses guiding the development of patient model experimentation.

“This fund is catalytic,” Griffith says. “Industry and government, along with other foundations, will invest if the foundational infrastructure exists. They want to employ the technologies, but it is hard to get them developed to the point they are proven to be useful. This gets us through that difficult part of the journey.”

The fund will also support public engagement efforts to reduce stigma around menstruation and neglect of such conditions as abnormal uterine bleeding and debilitating anemia, endometriosis, and polycystic ovary syndrome — and in general bring greater attention to women’s health research. Endometriosis, for instance, in which tissue that resembles the uterine lining starts growing outside the uterus and causes painful inflammation, affects one in 10 women. It often goes undiagnosed for years, and can require repeated surgeries to remove its lesions. Meanwhile, little is known about what causes it, how to prevent it, or what could effectively stop it.

Women’s health research could further advance in many areas of medicine beyond conditions that disproportionately affect females. Griffith points out that the uterus, which sheds and regenerates its lining every month, demonstrates “scarless healing” that could warrant investigation. Also, deepened study of the uterus could shed light on immune tolerance for transplants, given that in a successful pregnancy an implanted fetus is not rejected, despite containing foreign material from the biological father.

For Emily Fairbairn, the fund is a critical step toward major advances in an often-overlooked area of medicine.

“My mission is to support intellectually honest, open-minded scientists who embrace risk, treat failure as feedback, and remain committed to discovery over dogma. This fund is a direct extension of that philosophy. It’s designed to fuel research into the biological realities of diseases that remain poorly understood, frequently dismissed, or disproportionately misdiagnosed in women,” Fairbairn says. “I’ve chosen to make this gift to MIT because Linda Griffith exemplifies the rare combination of scientific integrity and bold innovation — qualities essential for tackling the most neglected challenges in medicine.”

Fairbairn also refers to Griffith collaborator Michal Tal as being “deeply inspiring.”

“Her work embodies what’s possible when scientific excellence meets institutional courage. It is this spirit — bold, rigorous, and fearless — that inspired this gift and fuels our hope for the future of women’s health,” she says.

Fairbairn, who has suffered from both Lyme disease and endometriosis that required multiple surgeries, originally directed her philanthropy, including previous gifts to MIT, toward the study of Lyme disease and associated infections.

“My own experience with both Lyme and endometriosis deepened my conviction that science must better account for how female physiology, genetics, and psychology differ from men’s,” she says. “MIT stands out for treating women’s health not as a niche, but as a frontier. The Institute’s willingness to bridge immunology, neurobiology, bioengineering, and data science — alongside its development of cutting-edge platforms like human chips — offers a rare and necessary seriousness of purpose.”

For her part, Griffith refers to Fairbairn as “a citizen scientist who inspires us daily.”

“Her tireless advocacy for patients, especially women, who are dismissed and gas-lit, is priceless,” Griffith adds. “Emily has made me a better scientist, in service of humanity.”

Model predicts long-term effects of nuclear waste on underground disposal systems

Fri, 07/18/2025 - 12:00am

As countries across the world experience a resurgence in nuclear energy projects, the questions of where and how to dispose of nuclear waste remain as politically fraught as ever. The United States, for instance, has indefinitely stalled its only long-term underground nuclear waste repository. Scientists are using both modeling and experimental methods to study the effects of underground nuclear waste disposal and ultimately, they hope, build public trust in the decision-making process.

New research from scientists at MIT, Lawrence Berkeley National Lab, and the University of Orléans makes progress in that direction. The study shows that simulations of underground nuclear waste interactions, generated by new, high-performance-computing software, aligned well with experimental results from a research facility in Switzerland.

The study, which was co-authored by MIT PhD student Dauren Sarsenbayev and Assistant Professor Haruko Wainwright, along with Christophe Tournassat and Carl Steefel, appears in the journal PNAS.

“These powerful new computational tools, coupled with real-world experiments like those at the Mont Terri research site in Switzerland, help us understand how radionuclides will migrate in coupled underground systems,” says Sarsenbayev, who is first author of the new study.

The authors hope the research will improve confidence among policymakers and the public in the long-term safety of underground nuclear waste disposal.

“This research — coupling both computation and experiments — is important to improve our confidence in waste disposal safety assessments,” says Wainwright. “With nuclear energy re-emerging as a key source for tackling climate change and ensuring energy security, it is critical to validate disposal pathways.”

Comparing simulations with experiments

Disposing of nuclear waste in deep underground geological formations is currently considered the safest long-term solution for managing high-level radioactive waste. As such, much effort has been put into studying the migration behaviors of radionuclides from nuclear waste within various natural and engineered geological materials.

Since its founding in 1996, the Mont Terri research site in northern Switzerland has served as an important test bed for an international consortium of researchers interested in studying materials like Opalinus clay — a thick, water-tight claystone abundant in the tunneled areas of the mountain.

“It is widely regarded as one of the most valuable real-world experiment sites because it provides us with decades of datasets around the interactions of cement and clay, and those are the key materials proposed to be used by countries across the world for engineered barrier systems and geological repositories for nuclear waste,” explains Sarsenbayev.

For their study, Sarsenbayev and Wainwright collaborated with co-authors Tournassat and Steefel, who have developed high-performance computing software to improve modeling of interactions between the nuclear waste and both engineered and natural materials.

To date, several challenges have limited scientists’ understanding of how nuclear waste reacts with cement-clay barriers. For one thing, the barriers are made up of irregularly mixed materials deep underground. Additionally, the existing class of models commonly used to simulate radionuclide interactions with cement-clay does not take into account electrostatic effects associated with the negatively charged clay minerals in the barriers.

Tournassat and Steefel’s new software accounts for electrostatic effects, making it the only one that can simulate those interactions in three-dimensional space. The software, called CrunchODiTi, was developed from established software known as CrunchFlow and was most recently updated this year. It is designed to run in parallel across many high-performance computers.

For the study, the researchers looked at a 13-year-old experiment, with an initial focus on cement-clay rock interactions. Within the last several years, a mix of both negatively and positively charged ions was added to the borehole located near the center of the cement emplaced in the formation. The researchers focused on a 1-centimeter-thick zone between the radionuclides and cement-clay referred to as the “skin.” They compared their experimental results to the software simulation, finding the two datasets aligned.

“The results are quite significant because previously, these models wouldn’t fit field data very well,” Sarsenbayev says. “It’s interesting how fine-scale phenomena at the ‘skin’ between cement and clay, the physical and chemical properties of which change over time, could be used to reconcile the experimental and simulation data.”

The experimental results showed the model successfully accounted for electrostatic effects associated with the clay-rich formation and the interaction between materials in Mont Terri over time.

“This is all driven by decades of work to understand what happens at these interfaces,” Sarsenbayev says. “It’s been hypothesized that there is mineral precipitation and porosity clogging at this interface, and our results strongly suggest that.”

“This application requires millions of degrees of freedom because these multibarrier systems require high resolution and a lot of computational power,” Sarsenbayev says. “This software is really ideal for the Mont Terri experiment.”

Assessing waste disposal plans

The new model could now replace older models that have been used to conduct safety and performance assessments of underground geological repositories.

“If the U.S. eventually decides to dispose of nuclear waste in a geological repository, then these models could dictate the most appropriate materials to use,” Sarsenbayev says. “For instance, right now clay is considered an appropriate storage material, but salt formations are another potential medium that could be used. These models allow us to see the fate of radionuclides over millennia. We can use them to understand interactions at timespans that vary from months to years to many millions of years.”

Sarsenbayev says the model is reasonably accessible to other researchers and that future efforts may focus on the use of machine learning to develop less computationally expensive surrogate models.

Further data from the experiment will be available later this month. The team plans to compare those data to additional simulations.

“Our collaborators will basically get this block of cement and clay, and they’ll be able to run experiments to determine the exact thickness of the skin along with all of the minerals and processes present at this interface,” Sarsenbayev says. “It’s a huge project and it takes time, but we wanted to share initial data and this software as soon as we could.”

For now, the researchers hope their study leads to a long-term solution for storing nuclear waste that policymakers and the public can support.

“This is an interdisciplinary study that includes real-world experiments showing we’re able to predict radionuclides’ fate in the subsurface,” Sarsenbayev says. “The motto of MIT’s Department of Nuclear Science and Engineering is ‘Science. Systems. Society.’ I think this merges all three domains.”

Helping cities evolve

Thu, 07/17/2025 - 4:50pm

Growing up in Paris, Vincent Rollet was exposed to the world beyond France from an early age. His dad was an engineer who traveled around the globe to set up electrical infrastructure, and he moved the family to the United States for two years when Rollet was a small child. His father’s work sparked Rollet’s interest in international development and growth. “It made me want to see and learn how things work in other parts of the world,” he says.

Today, Rollet is a fifth-year PhD student in MIT’s Department of Economics, studying how cities evolve — and how they may become constrained by their past. “Cities constantly need to adapt to economic changes,” he explains. “For example, you might need more housing as populations grow, or want to transform manufacturing spaces into modern lab facilities. With the rise of remote work, many cities now have excess office space that could potentially become residential housing.” Ultimately, Rollet hopes his research can influence urban policymakers to better serve city residents.

A happy accident

Rollet’s first exposure to economics was almost accidental. As a teenager, he stumbled upon the lecture videos of a game theory course at Yale University. “I randomly clicked on the available courses,” he says, “and I watched the videos, and I found it interesting.”

In high school and college, he focused on math and physics. “It’s the kind of training you’re typically pushed to do in France,” he says. But at the end of his first year at École Polytechnique, a period of mandatory military training for all students, he remembered the Yale course that he had watched in high school. He had spent that year helping run a military service program for disadvantaged youth. “I was looking for an enjoyable way to start studying again,” he says. “So I went back to game theory.”

Rollet decided to take a game theory course with an economics professor, Pierre Boyer, who would play a key role in his academic path. Through conversations with Boyer, Rollet learned that economics could provide a rigorous, mathematical approach to understanding the topics around international development and international politics that had long fascinated him. Boyer introduced Rollet to two MIT-trained economists, professors Vincent Pons and Benjamin Marx, with whom he continues to collaborate today. A research visit to the U.S. in 2019 to work with them solidified his interest in pursuing graduate school. Shortly thereafter, he began his PhD at MIT.

Why cities get “stuck”

Rollet’s research explores why cities struggle to adapt their built environments as economic conditions shift, and why certain urban spaces become “stuck” in outdated patterns of development. He’s drawn to cities because they are a microcosm of different interacting systems in economics. “To understand cities, you need to understand how labor markets work, how the housing market works, and how transportation works,” he notes.

Rollet has spent most of his PhD focusing on New York City. By examining detailed data on building permits, real estate transactions, rents, and zoning changes, he has tracked the evolution of every building in the city over nearly two decades, studying when and why developers choose to demolish buildings and construct new ones, and how these decisions are influenced by economic, regulatory, and technological constraints. By combining computational theory and data — which often includes information on natural experiments (for example, what happens when a city changes a regulation?) — Rollet aims to reveal generalizable principles underlying how cities grow and evolve.

Originally shaped as a manufacturing hub with dense commercial centers and sprawling residential outskirts, New York’s physical structure has been largely frozen since zoning regulations were imposed in the 1960s. Despite dramatic shifts in population and economic activity, the city’s regulations have barely budged, creating profound mismatches: soaring housing costs, overcrowded residential areas, and underutilized commercial spaces. The buildings are expensive to replace, and regulations are notoriously hard to change once they are established.

Rollet’s findings reveal critical inefficiencies. In cities like New York or Boston, housing often sells for hundreds of thousands of dollars more than it costs to build. This large gap suggests that demand far outpaces supply: There simply aren’t enough homes being built. “When the housing supply is too constrained, we are effectively wasting resources, making housing unnecessarily expensive,” he explains.

But implementing any kind of action or policy to alleviate these inefficiencies has downstream effects. For example, it can have different impacts on different groups of people. “There will be winners and losers,” Rollet explains. “One reason is that you might directly care about the welfare of a certain group, like directly providing housing for lower-income households. Another reason is that if there are sufficiently many people who are losers of a certain policy, or if they’re sufficiently powerful, they’re going to be able to block the policy change, and this poses a political constraint.”

So what makes a city “stuck”? “Much of the time,” Rollet says, “it’s policy.” But the effects of policy changes take time to materialize and might be difficult for people to detect. Rollet cites Cambridge’s recent zoning reform allowing the construction of six-story buildings as a case in point. “These policy changes can benefit a lot of people, by reducing the housing prices a bit for everyone,” he says, “but individual people won’t know it. This makes collective action very hard.”

Economics, however, provides a toolkit to characterize and quantify these effects. “What economists can bring to the table is to give policymakers more information on the likely consequences of their policy actions,” Rollet says.

Striving to “improve things”

As Rollet enters the home stretch of his PhD, he’s grateful to his advisors in the economics department for helping him develop a foundation for the diverse set of tools necessary for his work. From professors Dave Donaldson and David Atkin, he learned how to adapt methods traditionally used in the study of international trade, to analyze the movement of people across neighborhoods and cities. From Professor Tobias Salz, he gained insights into modeling the behavior of firms over time, which he now applies to understanding the actions of real estate developers. “The training here pushes you to produce research that truly stands out,” he says. “The courses helped me discover a new set of fields and methods.”

Beyond research, Rollet actively contributes to his department, including serving as the co-president of the Graduate Economics Association. “MIT is truly the best place for economics, not just because of their courses, but because it’s a really friendly department where people help each other out,” he says. “The Graduate Economics Association helps to build that sense of community, and I wanted to be a part of that.” In addition, he is a member of a mental health and peer support group in the department.

Rollet also enjoys teaching. He has been a teaching assistant for microeconomics and international trade courses and has built an impressive writing repertoire explaining complex concepts in several fields. In high school, one of Rollet’s hobbies was writing quantum theory explainers on the internet for general audiences. Some publishers found his writing and contacted him about turning it into a book. The book was published, and has sold more than 14,000 copies. As a college student, Rollet worked on two books: one on game theory for general audiences, and an intro to economics textbook that two professors recruited him to co-author. It’s still the standard textbook at École Polytechnique today. “It was my Covid activity,” Rollet laughs.

Looking forward, Rollet aims to pursue a career in research and teaching. His immediate goal remains clear: develop research that meaningfully impacts policy, by shedding light on how cities can overcome constraints and evolve in ways that better serve their residents. He’s excited about how, in the future, more fine-grained and detailed data sources could shed light on how micro behavior can lead to macro outcomes.

"Housing and cities — these markets are failing in important ways in many parts of the world. There’s real potential for policy to improve things.”
