MIT Latest News


Meet the 2025 tenured professors in the School of Humanities, Arts, and Social Sciences

Thu, 09/18/2025 - 4:30pm

In 2025, six faculty were granted tenure in the MIT School of Humanities, Arts, and Social Sciences.

Sara Brown is an associate professor in the Music and Theater Arts Section. She develops stage designs for theater, opera, and dance by approaching the scenographic space as a catalyst for collective imagination. Her work is rooted in curiosity and interdisciplinary collaboration, and spans virtual environments, immersive performance installations, and evocative stage landscapes. Her recent projects include “Carousel” at the Boston Lyric Opera; the virtual dance performance “The Other Shore” at the Massachusetts Museum of Contemporary Art and Jacob’s Pillow; and “The Lehman Trilogy” at the Huntington Theatre Company. Her upcoming co-directed work, “Circlusion,” takes place within a fully immersive inflatable space and reimagines the female body’s response to power and violence. Her designs have been seen at the BAM Next Wave Festival in New York, the Festival d’Automne in Paris, and the American Repertory Theater in Cambridge.

Naoki Egami is a professor in the Department of Political Science. He is also a faculty affiliate of the MIT Institute for Data, Systems, and Society. Egami specializes in political methodology and develops statistical methods for questions in political science and the social sciences. His current research programs focus on three areas: external validity and generalizability; machine learning and AI for the social sciences; and causal inference with network and spatial data. His work has appeared in various academic journals in political science, statistics, and computer science, such as American Political Science Review, American Journal of Political Science, Journal of the American Statistical Association, Journal of the Royal Statistical Society (Series B), NeurIPS, and Science Advances. Before joining MIT, Egami was an assistant professor at Columbia University. He received a PhD from Princeton University (2020) and a BA from the University of Tokyo (2015).

Rachel Fraser is an associate professor in the Department of Linguistics and Philosophy. Before coming to MIT, Fraser taught at the University of Oxford, where she also completed her graduate work in philosophy. She has interests in epistemology, language, feminism, aesthetics, and political philosophy. At present, her main project is a book manuscript on the epistemology of narrative.

Brian Hedden PhD ’12 is a professor in the Department of Linguistics and Philosophy, with a shared appointment in the Department of Electrical Engineering and Computer Science in the MIT Schwarzman College of Computing. His research focuses on how we ought to form beliefs and make decisions. He works in epistemology, decision theory, and ethics, including the ethics of AI. He is the author of “Reasons without Persons: Rationality, Identity, and Time” (Oxford University Press, 2015) and of articles on topics including collective action problems, legal standards of proof, algorithmic fairness, and political polarization. Prior to joining MIT, he was a faculty member at the Australian National University and the University of Sydney, and a junior research fellow at Oxford. He received his BA from Princeton University in 2006 and his PhD from MIT in 2012.

Viola Schmitt is an associate professor in the Department of Linguistics and Philosophy. She is a linguist with a special interest in semantics. Much of her work focuses on trying to understand general constraints on human language meaning; that is, the principles regulating which meanings can be expressed by human languages and how languages can package meaning. Variants of this question were also central to grants she received from the Austrian and German research foundations. She earned her PhD in linguistics from the University of Vienna and worked as a postdoc or lecturer at the universities of Vienna, Graz, and Göttingen, and at the University of California at Los Angeles. Her most recent position was as a junior professor at Humboldt University in Berlin.

Miguel Zenón is an associate professor in the Music and Theater Arts Section. The Puerto Rican alto saxophonist, composer, band leader, music producer, and educator is a Grammy Award winner and the recipient of a Guggenheim Fellowship, a MacArthur Fellowship, and a Doris Duke Artist Award. He also holds an honorary doctorate in the arts from Universidad del Sagrado Corazón. Zenón has released 18 albums as a band leader and collaborated with some of the great musicians and ensembles of his time. As a composer, Zenón has been commissioned by Chamber Music America, the Logan Center for the Arts, the Hyde Park Jazz Festival, Miller Theater, the Hewlett Foundation, Peak Performances, and many of his peers. Zenón has given hundreds of lectures and master classes at institutions all over the world, and in 2011 he founded Caravana Cultural — a program that presents jazz concerts free of charge in rural areas of Puerto Rico.

Inflammation jolts “sleeping” cancer cells awake, enabling them to multiply again

Thu, 09/18/2025 - 3:40pm

Cancer cells have one relentless goal: to grow and divide. While most stick together within the original tumor, some rogue cells break away and travel to distant organs. There, they can lie dormant — undetectable and not dividing — for years, like landmines waiting to go off.

This migration of cancer cells, called metastasis, is especially common in breast cancer. For many patients, the disease can return months — or even decades — after initial treatment, this time in an entirely different organ.

Robert Weinberg, the Daniel K. Ludwig Professor for Cancer Research at MIT and a Whitehead Institute for Biomedical Research founding member, has spent decades unraveling the complex biology of metastasis and pursuing research that could improve survival rates among patients with metastatic breast cancer — or prevent metastasis altogether.

In his latest study, Weinberg, postdoc Jingwei Zhang, and colleagues ask a critical question: What causes these dormant cancer cells to erupt into a frenzy of growth and division? The group’s findings, published Sept. 1 in The Proceedings of the National Academy of Sciences (PNAS), point to a unique culprit.

This awakening of dormant cancer cells, they’ve discovered, isn’t a spontaneous process. Instead, the wake-up call comes from the inflamed tissue surrounding the cells. One trigger for this inflammation is bleomycin, a common chemotherapy drug that can scar and thicken lung tissue.

“The inflammation jolts the dormant cancer cells awake,” Weinberg says. “Once awakened, they start multiplying again, seeding new life-threatening tumors in the body.”

Decoding metastasis

There’s a lot that scientists still don’t know about metastasis, but this much is clear: Cancer cells must undergo a long and arduous journey to achieve it. The first step is to break away from their neighbors within the original tumor.

Normally, cells stick to one another using surface proteins that act as molecular “velcro,” but some cancer cells can acquire genetic changes that disrupt the production of these proteins and make them more mobile and invasive, allowing them to detach from the parent tumor. 

Once detached, they can penetrate blood vessels and lymphatic channels, which act as highways to distant organs.

While most cancer cells die at some point during this journey, a few persist. These cells exit the bloodstream and invade different tissues — lungs, liver, bone, and even the brain — to give birth to new, often more-aggressive tumors.

“Almost 90 percent of cancer-related deaths occur not from the original tumor, but when cancer cells spread to other parts of the body,” says Weinberg. “This is why it’s so important to understand how these ‘sleeping’ cancer cells can wake up and start growing again.”

Setting up shop in new tissue comes with changes in surroundings — the “tumor microenvironment” — to which the cancer cells may not be well-suited. These cells face constant threats, including detection and attack by the immune system. 

To survive, they often enter a protective state of dormancy that puts a pause on growth and division. This dormant state also makes them resistant to conventional cancer treatments, which often target rapidly dividing cells.

To investigate what makes this dormancy reversible months or years down the line, researchers in the Weinberg Lab injected human breast cancer cells into mice. These cancer cells were modified to produce a fluorescent protein, allowing the scientists to track their behavior in the body.

The group then focused on cancer cells that had lodged themselves in the lung tissue. By examining them for specific proteins — Ki67, ITGB4, and p63 — that act as markers of cell activity and state, the researchers were able to confirm that these cells were in a non-dividing, dormant state.
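
As a rough illustration of how such marker readouts translate into a state call, consider the minimal Python sketch below. The cutoff value and the single-marker decision rule are hypothetical, invented for illustration; the study's actual criteria combined Ki67, ITGB4, and p63 staining in ways the article does not detail.

```python
# Illustrative only: a toy classifier for calling a cell "dormant" from
# marker stains like those named in the study (Ki67, ITGB4, p63).
# The threshold and the Ki67-only logic are assumptions, not the
# authors' published criteria.
from dataclasses import dataclass

@dataclass
class CellStain:
    ki67: float   # proliferation-marker intensity (arbitrary units)
    itgb4: float  # integrin beta-4 intensity
    p63: float    # p63 intensity

def is_dormant(cell: CellStain, ki67_cutoff: float = 0.2) -> bool:
    """Call a cell dormant if its proliferation marker (Ki67) is below a
    chosen cutoff; a real pipeline would also fold in ITGB4 and p63."""
    return cell.ki67 < ki67_cutoff

cells = [CellStain(ki67=0.05, itgb4=0.8, p63=0.6),   # quiescent-looking
         CellStain(ki67=0.90, itgb4=0.3, p63=0.2)]   # actively dividing
print([is_dormant(c) for c in cells])  # [True, False]
```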

Previous work from the Weinberg Lab had shown that inflammation in organ tissue can provoke dormant breast cancer cells to start growing again. In this study, the team tested bleomycin, a chemotherapy drug that is known to cause lung inflammation and that can be given to patients after surgery to lower the risk of cancer recurrence.

The researchers found that lung inflammation from bleomycin was sufficient to trigger the once-dormant cancer cells to grow into large colonies in the lungs of treated mice — and to shift their character toward a more invasive and mobile state.

Zeroing in on the tumor microenvironment, the team identified a type of immune cell, the M2 macrophage, as a driver of this process. These macrophages release molecules called epidermal growth factor receptor (EGFR) ligands, which bind to receptors on the surface of dormant cancer cells. This activates a cascade of signals that provokes the dormant cancer cells to start multiplying rapidly.

But EGFR signaling is only the initial spark that ignites the fire. “We found that once dormant cancer cells are awakened, they retain what we call an ‘awakening memory,’” Zhang says. “They no longer require ongoing inflammatory signals from the microenvironment to stay active [growing and multiplying] — they remember the awakened state.”

While signals related to inflammation are necessary to awaken dormant cancer cells, exactly how much signaling is needed remains unclear. “This aspect of cancer biology is particularly challenging, because multiple signals contribute to the state change in these dormant cells,” Zhang says.

The team has already identified one key player in the awakening process, but understanding the full set of signals and how each contributes is far more complex — a question they are continuing to investigate in their new work. 

Studying these pivotal changes in the lives of cancer cells — such as their transition from dormancy to active growth — will deepen our scientific understanding of metastasis and, as researchers in the Weinberg Lab hope, lead to more effective treatments for patients with metastatic cancers.

Biogen groundbreaking stirs optimism in Kendall Square

Thu, 09/18/2025 - 1:30pm

Nearly 300 people gathered Tuesday to mark the ceremonial groundbreaking for Biogen’s new state-of-the-art facility in Kendall Square. The project is the first building to be constructed at MIT’s Kendall Common on the former Volpe federal site, and will serve as a consolidated headquarters for the pioneering biotechnology company, which has called Cambridge home for more than 40 years.

In marking the start of construction, Massachusetts Governor Maura Healey addressed the enthusiastic crowd, saying, “Massachusetts science saves lives — saves lives here, saves lives around the world. We celebrate that in Biogen today, we celebrate that in Kendall Common, and we celebrate that in this incredible ecosystem that extends all across our great state. Today, Biogen is not just building a new facility, they are building the future of medicine and innovation.”

Emceed by Kirk Taylor, president and CEO of the Massachusetts Life Sciences Center, the event featured a specially created Lego model of the new building and a historic timeline of Biogen’s origin story overlaid on Kendall Square’s transformation. The program’s theme — “Making breakthroughs happen in Kendall Square” — seemed to elicit a palpable sense of pride among the Biogen and MIT employees, business leaders, and public officials in attendance.

MIT President Sally Kornbluth reflected on the vibrancy of the local innovation ecosystem: “I sometimes say that Kendall Square’s motto might as well be ‘talent in proximity.’ By following that essential recipe, Biogen’s latest decision to intensify its presence here promises great things for the whole region.” Kornbluth described Biogen’s move as “a very important signal to the world right now.”

Biogen’s March 2025 announcement that it will centralize operations at 75 Broadway was lauded as a show of strength for the historic company and the life sciences sector. The 580,000-square-foot research and development headquarters, designed by Elkus Manfredi Architects, will optimize Biogen’s scientific discovery and clinical processes. The new facility is scheduled to open in 2028.

CEO Chris Viehbacher shared his thoughts on Biogen’s decision: “I am proud to stand here with so many individuals who have shaped our past and who are dedicated to our future in Kendall Square. … We decided to invest in the next chapter of Kendall Square because of what this community represents: talent, energy, ingenuity, and collaboration.” Biogen was founded in 1978 by Nobel laureates Phillip Sharp (an MIT Institute Professor and professor of biology emeritus) and Wally Gilbert, both of whom were not only present, but received an impromptu standing ovation, led by Viehbacher.

Kendall Common is being developed by MIT’s Investment Management Company (MITIMCo) and will ultimately include four commercial buildings, four residential buildings (including affordable housing), open space, retail, entertainment, and a community center. MITIMCo’s joint venture partner for the Biogen project is BioMed Realty, a Blackstone Real Estate portfolio company.

Senior Vice President Patrick Rowe, who oversees MITIMCo’s real estate group, says, “Biogen is such a critical anchor for the area. I’m excited for the impact that this project will have on Kendall Square, and for the way that the Kendall Common development can help to further advance our innovation ecosystem.”

Could a primordial black hole’s last burst explain a mysteriously energetic neutrino?

Thu, 09/18/2025 - 12:00am

The last gasp of a primordial black hole may be the source of the highest-energy “ghost particle” detected to date, a new MIT study proposes.

In a paper appearing today in Physical Review Letters, MIT physicists put forth a strong theoretical case that a recently observed, highly energetic neutrino may have been the product of a primordial black hole exploding outside our solar system.

Neutrinos are sometimes referred to as ghost particles, for their invisible yet pervasive nature: They are the most abundant particle type in the universe, yet they leave barely a trace. Scientists recently identified signs of a neutrino with the highest energy ever recorded, but the source of such an unusually powerful particle has yet to be confirmed.

The MIT researchers propose that the mysterious neutrino may have come from the inevitable explosion of a primordial black hole. Primordial black holes (PBHs) are hypothetical black holes that are microscopic versions of the much more massive black holes that lie at the center of most galaxies. PBHs are theorized to have formed in the first moments following the Big Bang. Some scientists believe that primordial black holes could constitute most or all of the dark matter in the universe today.

Like their more massive counterparts, PBHs should leak energy and shrink over their lifetimes, in a process known as Hawking radiation, which was predicted by the physicist Stephen Hawking. The more a black hole radiates, the hotter it gets and the more high-energy particles it releases. This is a runaway process that should produce an incredibly violent explosion of the most energetic particles just before a black hole evaporates away.

The MIT physicists calculate that, if PBHs make up most of the dark matter in the universe, then a small subpopulation of them would be undergoing their final explosions today throughout the Milky Way galaxy. And there should be a small but meaningful statistical chance that such an explosion could have occurred relatively close to our solar system. The explosion would have released a burst of high-energy particles, including neutrinos, one of which could have had a good chance of hitting a detector on Earth.

If such a scenario had indeed occurred, the recent detection of the highest-energy neutrino would represent the first observation of Hawking radiation, which has long been assumed, but has never been directly observed from any black hole. What’s more, the event might indicate that primordial black holes exist and that they make up most of dark matter — a mysterious substance that comprises 85 percent of the total matter in the universe, the nature of which remains unknown.

“It turns out there’s this scenario where everything seems to line up, and not only can we show that most of the dark matter [in this scenario] is made of primordial black holes, but we can also produce these high-energy neutrinos from a fluke nearby PBH explosion,” says study lead author Alexandra Klipfel, a graduate student in MIT’s Department of Physics. “It’s something we can now try to look for and confirm with various experiments.”

The study’s other co-author is David Kaiser, professor of physics and the Germeshausen Professor of the History of Science at MIT.

High-energy tension

In February, scientists at the Cubic Kilometer Neutrino Telescope, or KM3NeT, reported the detection of the highest-energy neutrino recorded to date. KM3NeT is a large-scale underwater neutrino detector located at the bottom of the Mediterranean Sea, where the environment is meant to mute the effects of any particles other than neutrinos.

The scientists operating the detector picked up signatures of a passing neutrino with an energy of over 100 peta-electron-volts (PeV). One PeV is equal to 1 quadrillion electron volts.

“This is an incredibly high energy, far beyond anything humans are capable of accelerating particles up to,” Klipfel says. “There’s not much consensus on the origin of such high-energy particles.”

Similarly high-energy neutrinos, though not as energetic as what KM3NeT observed, have been detected by the IceCube Observatory — a neutrino detector embedded deep in the ice at the South Pole. IceCube has detected about half a dozen such neutrinos, whose unusually high energies have also eluded explanation. Whatever their source, the IceCube observations enable scientists to work out a plausible rate at which neutrinos of those energies typically hit Earth. If this rate is correct, however, it would have been extremely unlikely for KM3NeT to see the ultra-high-energy neutrino it recently detected. The two detectors’ discoveries, then, seemed to be what scientists call “in tension.”

Kaiser and Klipfel, who had been working on a separate project involving primordial black holes, wondered: Could a PBH have produced both the KM3NeT neutrino and the handful of IceCube neutrinos, under conditions in which PBHs comprise most of the dark matter in the galaxy? If they could show a chance existed, it would raise an even more exciting possibility — that both observatories observed not only high-energy neutrinos but also the remnants of Hawking radiation.

“Our best chance”

The first step the scientists took in their theoretical analysis was to calculate how many particles would be emitted by an exploding black hole. All black holes should slowly radiate over time. The larger a black hole, the colder it is, and the lower the energy of the particles it emits as it slowly evaporates. Thus, any particles emitted as Hawking radiation from heavy stellar-mass black holes would be nearly impossible to detect. By the same token, however, much smaller primordial black holes would be very hot and emit high-energy particles in a process that accelerates as the black hole gets closer to disappearing entirely.
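
The inverse relationship between mass and temperature described here is the textbook Hawking formula, T = ħc³/(8πGMk_B). The short sketch below just evaluates that standard relation for two illustrative masses (the specific masses are assumptions, chosen only to show the scaling):

```python
# Minimal sketch of the textbook Hawking temperature,
# T = hbar * c^3 / (8 * pi * G * M * k_B): bigger black holes are colder.
import math

HBAR = 1.0546e-34   # J*s
C = 2.9979e8        # m/s
G = 6.674e-11       # m^3 kg^-1 s^-2
K_B = 1.3807e-23    # J/K

def hawking_temperature(mass_kg: float) -> float:
    return HBAR * C**3 / (8 * math.pi * G * mass_kg * K_B)

for label, mass in [("stellar-mass black hole (~10 suns)", 2e31),
                    ("small primordial black hole", 1e12)]:
    print(f"{label}: {hawking_temperature(mass):.2e} K")
# The stellar-mass case comes out around 1e-8 K, hopelessly undetectable;
# the 1e12-kg case is around 1e11 K, hot enough to emit energetic particles.
```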

“We don’t have any hope of detecting Hawking radiation from astrophysical black holes,” Klipfel says. “So if we ever want to see it, the smallest primordial black holes are our best chance.”

The researchers calculated the number and energies of particles that a black hole should emit, given its temperature and shrinking mass. They estimate that in its final nanosecond, once a black hole is smaller than an atom, it should emit a final burst of particles, including about 10²⁰ neutrinos, or about a hundred quintillion of the particles, with energies of about 100 peta-electron-volts (around the energy that KM3NeT observed).

They used this result to calculate the number of PBH explosions that would have to occur in a galaxy in order to explain the reported IceCube results. They found that, in our region of the Milky Way galaxy, about 1,000 primordial black holes should be exploding per cubic parsec per year. (A parsec is a unit of distance equal to about 3.26 light-years, or roughly 31 trillion kilometers.)

They then calculated the distance at which one such explosion in the Milky Way could have occurred, such that just a handful of the high-energy neutrinos could have reached Earth and produced the recent KM3NeT event. They find that a PBH would have to have exploded relatively close to our solar system — at a distance of about 2,000 times the distance between the Earth and the sun.

The particles emitted from such a nearby explosion would radiate in all directions. However, the team found there is a small, 8 percent chance that an explosion happens close enough to the solar system once every 14 years for enough ultra-high-energy neutrinos to hit the Earth.
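
As a back-of-envelope consistency check (not the authors' full calculation), one can fold the quoted explosion rate, the quoted distance, and the 14-year window into a simple Poisson estimate. Because the inputs are the article's rounded values, this only reproduces the order of magnitude of the quoted 8 percent:

```python
# Rough check: with ~1,000 PBH explosions per cubic parsec per year,
# how likely is at least one within ~2,000 AU of the sun over 14 years?
import math

RATE_PER_PC3_PER_YR = 1000.0
AU_PER_PARSEC = 206265.0

radius_pc = 2000.0 / AU_PER_PARSEC                   # ~0.0097 pc
volume_pc3 = (4.0 / 3.0) * math.pi * radius_pc**3    # nearby volume
expected = RATE_PER_PC3_PER_YR * volume_pc3 * 14.0   # expected count in 14 yr
prob = 1.0 - math.exp(-expected)                     # Poisson P(at least one)

print(f"expected nearby explosions in 14 yr: {expected:.3f}")
print(f"probability of at least one: {prob:.1%}")
# Prints roughly 5%, the same order as the 8 percent quoted from the
# paper's more careful treatment.
```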

“An 8 percent chance is not terribly high, but it’s well within the range for which we should take such chances seriously — all the more so because so far, no other explanation has been found that can account for both the unexplained very-high-energy neutrinos and the even more surprising ultra-high-energy neutrino event,” Kaiser says.

The team’s scenario seems to hold up, at least in theory. Confirming the idea will require many more detections of particles, including neutrinos at “insanely high energies.” Scientists could then build up better statistics regarding such rare events.

“In that case, we could use all of our combined experience and instrumentation, to try to measure still-hypothetical Hawking radiation,” Kaiser says. “That would provide the first-of-its-kind evidence for one of the pillars of our understanding of black holes — and could account for these otherwise anomalous high-energy neutrino events as well. That’s a very exciting prospect!”

In tandem, other efforts to detect nearby PBHs could further bolster the hypothesis that these unusual objects make up most or all of the dark matter.

This work was supported, in part, by the National Science Foundation, MIT’s Center for Theoretical Physics – A Leinweber Institute, and the U.S. Department of Energy.

New 3D bioprinting technique may improve production of engineered tissue

Wed, 09/17/2025 - 11:02am

The field of tissue engineering aims to replicate the structure and function of real biological tissues. This engineered tissue has potential applications in disease modeling, drug discovery, and implantable grafts.

3D bioprinting, which uses living cells, biocompatible materials, and growth factors to build three-dimensional tissue and organ structures, has emerged as a key tool in the field. To date, one of the most widely used approaches to bioprinting relies on additive manufacturing techniques and digital models: 2D layers of bio-ink, composed of cells in a soft gel, are deposited layer by layer into a support bath to build a 3D structure. While these techniques do enable fabrication of complex architectures with features that are not easy to build manually, current approaches have limitations.

“A major drawback of current 3D bioprinting approaches is that they do not integrate process control methods that limit defects in printed tissues. Incorporating process control could improve inter-tissue reproducibility and enhance resource efficiency, for example limiting material waste,” says Ritu Raman, the Eugene Bell Career Development Chair of Tissue Engineering and an assistant professor of mechanical engineering.

She adds, “Given the diverse array of available 3D bioprinting tools, there is a significant need to develop process optimization techniques that are modular, efficient, and accessible.”

This need motivated Raman to seek the expertise of Professor Bianca Colosimo of the Polytechnic University of Milan, also known as Polimi. Colosimo recently completed a sabbatical at MIT hosted by John Hart, Class of 1922 Professor, co-director of MIT’s Initiative for New Manufacturing, director of the Center for Advanced Production Technologies, and head of the Department of Mechanical Engineering.

“Artificial Intelligence and data mining are already reshaping our daily lives, and their impact will be even more profound in the emerging field of 3D bioprinting, and in manufacturing at large,” says Colosimo. During her MIT sabbatical, she collaborated with Raman and her team to co-develop a solution that represents a first step toward intelligent bioprinting.

“This solution is now available in both our labs at Polimi and MIT, serving as a twin platform to exchange data and results across different environments and paving the way for many new joint projects in the years to come,” Colosimo says.

A new paper by Raman, Colosimo, and lead authors Giovanni Zanderigo, a Rocca Fellow at Polimi, and Ferdows Afghah of MIT, published this week in the journal Device, presents a novel technique that addresses this challenge. The team built and validated a modular, low-cost, and printer-agnostic monitoring method that integrates a compact tool for layer-by-layer imaging. In their method, a digital microscope captures high-resolution images of tissues during printing and rapidly compares them to the intended design with an AI-based image analysis pipeline.
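
The article does not spell out the team's actual image analysis pipeline, but the core comparison step can be sketched in a few lines. In the illustrative Python below, the thresholding approach and the overfill/underfill metrics are assumptions, not the published method:

```python
# A minimal sketch, not the authors' pipeline: compare a thresholded image
# of a just-printed layer against its intended design mask, and report
# overfill (ink where the design is empty) and underfill (design left bare).
import numpy as np

def layer_defects(layer_img: np.ndarray, design_mask: np.ndarray,
                  ink_threshold: float = 0.5) -> dict:
    printed = layer_img > ink_threshold      # where bio-ink was detected
    overfill = printed & ~design_mask        # too much ink deposited
    underfill = design_mask & ~printed       # too little ink deposited
    total = design_mask.sum()
    return {
        "overfill_fraction": overfill.sum() / max(total, 1),
        "underfill_fraction": underfill.sum() / max(total, 1),
    }

design = np.zeros((64, 64), dtype=bool)
design[16:48, 16:48] = True                  # intended square feature
observed = design.astype(float) * 0.9        # clean print of the feature
observed[20:24, 10:16] = 0.9                 # plus a stray blob of ink
print(layer_defects(observed, design))       # nonzero overfill flags a defect
```

Flagging these per-layer fractions against a tolerance is one plausible way such a monitor could trigger the parameter adjustments the authors describe.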

“This method enabled us to quickly identify print defects, such as depositing too much or too little bio-ink, thus helping us identify optimal print parameters for a variety of different materials,” says Raman. “The approach is a low-cost (less than $500), scalable, and adaptable solution that can be readily implemented on any standard 3D bioprinter. Here at MIT, the monitoring platform has already been integrated into the 3D bioprinting facilities in The SHED. Beyond MIT, our research offers a practical path toward greater reproducibility, improved sustainability, and automation in the field of tissue engineering. This research could have a positive impact on human health by improving the quality of the tissues we fabricate to study and treat debilitating injuries and disease.”

The authors indicate that the new method is more than a monitoring tool. It also serves as a foundation for intelligent process control in embedded bioprinting. Because it enables real-time inspection, adaptive correction, and automated parameter tuning, the researchers anticipate that the approach can improve reproducibility, reduce material waste, and accelerate process optimization for real-world applications in tissue engineering.

A more precise way to edit the genome

Wed, 09/17/2025 - 11:00am

A genome-editing technique known as prime editing holds potential for treating many diseases by transforming faulty genes into functional ones. However, the process carries a small chance of inserting errors that could be harmful.

MIT researchers have now found a way to dramatically lower the error rate of prime editing, using modified versions of the proteins involved in the process. This advance could make it easier to develop gene therapy treatments for a variety of diseases, the researchers say.

“This paper outlines a new approach to doing gene editing that doesn’t complicate the delivery system and doesn’t add additional steps, but results in a much more precise edit with fewer unwanted mutations,” says Phillip Sharp, an MIT Institute Professor Emeritus, a member of MIT’s Koch Institute for Integrative Cancer Research, and one of the senior authors of the new study.

With their new strategy, the MIT team was able to lower the error rate of prime editors from about one error in seven edits to one in 101 for the most-used editing mode, and from one error in 122 edits to one in 543 for a high-precision mode.

“For any drug, what you want is something that is effective, but with as few side effects as possible,” says Robert Langer, the David H. Koch Institute Professor at MIT, a member of the Koch Institute, and one of the senior authors of the new study. “For any disease where you might do genome editing, I would think this would ultimately be a safer, better way of doing it.”

Koch Institute research scientist Vikash Chauhan is the lead author of the paper, which appears today in Nature.

The potential for error

The earliest forms of gene therapy, first tested in the 1990s, involved delivering new genes carried by viruses. Subsequently, gene-editing techniques that use enzymes such as zinc finger nucleases to correct genes were developed. These nucleases are difficult to engineer, however, so adapting them to target different DNA sequences is a very laborious process.

Many years later, the CRISPR genome-editing system was discovered in bacteria, offering scientists a potentially much easier way to edit the genome. The CRISPR system consists of an enzyme called Cas9 that can cut double-stranded DNA at a particular spot, along with a guide RNA that tells Cas9 where to cut. Researchers have adapted this approach to cut out faulty gene sequences or to insert new ones, following an RNA template.

In 2019, researchers at the Broad Institute of MIT and Harvard reported the development of prime editing: a new system, based on CRISPR, that is more precise and has fewer off-target effects. A recent study reported that prime editors were successfully used to treat a patient with chronic granulomatous disease (CGD), a rare genetic disease that affects white blood cells.

“In principle, this technology could eventually be used to address many hundreds of genetic diseases by correcting small mutations directly in cells and tissues,” Chauhan says.

One of the advantages of prime editing is that it doesn’t require making a double-stranded cut in the target DNA. Instead, it uses a modified version of Cas9 that cuts just one of the complementary strands, opening up a flap where a new sequence can be inserted. A guide RNA delivered along with the prime editor serves as the template for the new sequence.

Once the new sequence has been copied, however, it must compete with the old DNA strand to be incorporated into the genome. If the old strand outcompetes the new one, the extra flap of new DNA hanging off may accidentally get incorporated somewhere else, giving rise to errors.

Many of these errors might be relatively harmless, but it’s possible that some could eventually lead to tumor development or other complications. With the most recent version of prime editors, this error rate ranges from one per seven edits to one per 121 edits for different editing modes.

“The technologies we have now are really a lot better than earlier gene therapy tools, but there’s always a chance for these unintended consequences,” Chauhan says.

Precise editing

To reduce those error rates, the MIT team decided to take advantage of a phenomenon they had observed in a 2023 study. In that paper, they found that while Cas9 usually cuts in the same DNA location every time, some mutated versions of the protein show a relaxation of those constraints. Instead of always cutting the same location, those Cas9 proteins would sometimes make their cut one or two bases further along the DNA sequence.

This relaxation, the researchers discovered, makes the old DNA strands less stable, so they get degraded, making it easier for the new strands to be incorporated without introducing any errors.

In the new study, the researchers were able to identify Cas9 mutations that dropped the error rate to 1/20th of its original value. Then, by combining pairs of those mutations, they created a Cas9 editor that lowered the error rate even further, to 1/36th of the original amount.

To make the editors even more accurate, the researchers incorporated their new Cas9 proteins into a prime editing system that has an RNA binding protein that stabilizes the ends of the RNA template more efficiently. This final editor, which the researchers call vPE, had an error rate just 1/60th of the original, ranging from one in 101 edits to one in 543 edits for different editing modes. These tests were performed in mouse and human cells.

The MIT team is now working on further improving the efficiency of prime editors, through further modifications of Cas9 and the RNA template. They are also working on ways to deliver the editors to specific tissues of the body, which is a longstanding challenge in gene therapy.

They also hope that other labs will begin using the new prime editing approach in their research studies. Prime editors are commonly used to explore many different questions, including how tissues develop, how populations of cancer cells evolve, and how cells respond to drug treatment.

“Genome editors are used extensively in research labs,” Chauhan says. “So the therapeutic aspect is exciting, but we are really excited to see how people start to integrate our editors into their research workflows.”

The research was funded by the Life Sciences Research Foundation, the National Institute of Biomedical Imaging and Bioengineering, the National Cancer Institute, and the Koch Institute Support (core) Grant from the National Cancer Institute.

Working to make fusion a viable energy source

Wed, 09/17/2025 - 11:00am

George Tynan followed a nonlinear path to fusion.

After earning his undergraduate degree in aerospace engineering, Tynan worked in the industry, where he developed an interest in rocket propulsion technology. Because most methods of propulsion involve the manipulation of hot ionized matter, or plasmas, Tynan focused his attention on plasma physics.

It was then that he realized that plasmas could also drive nuclear fusion. “As a potential energy source, it could really be transformative, and the idea that I could work on something that could have that kind of impact on the future was really attractive to me,” he says.

That same ambition, to realize the promise of fusion by researching both plasma physics and fusion engineering, drives Tynan today. It’s work he will be pursuing as the Norman C. Rasmussen Adjunct Professor in the Department of Nuclear Science and Engineering (NSE) at MIT.

An early interest in fluid flow

Tynan’s enthusiasm for science and engineering traces back to his childhood. His electrical engineer father found employment in the U.S. space program and moved the family to Cape Canaveral in Florida.

“This was in the ’60s, when we were launching Saturn V to the moon, and I got to watch all the launches from the beach,” Tynan remembers. That experience was formative, and Tynan became fascinated with how fluids flow.

“I would stick my hand out the window and pretend it was an airplane wing and tilt it with oncoming wind flow and see how the force would change on my hand,” Tynan laughs. The interest eventually led to an undergraduate degree in aerospace engineering at California State Polytechnic University in Pomona.

The switch to a new career would happen after work in the private sector, when Tynan discovered an interest in the use of plasmas for propulsion systems. He moved to the University of California at Los Angeles for graduate school, and it was here that the realization that plasmas could also anchor fusion moved Tynan into this field.

This was in the ’80s, when climate change was not as much in the public consciousness as it is today. Even so, “I knew there’s not an infinite amount of oil and gas around, and that at some point we would have to have widespread adoption of nuclear-based sources,” Tynan remembers. He was also attracted by the sustained effort it would take to make fusion a reality.

Doctoral work

To create energy from fusion, it’s important to get an accurate measurement of the “energy confinement time,” a measure of how long the hot fuel takes to cool down when all heat sources are turned off. When Tynan started graduate school, this measure could only be estimated empirically. He decided to focus his research on the physics underlying the observed confinement time.
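
In the standard textbook picture, the plasma's stored energy decays roughly exponentially once heating is switched off, W(t) = W₀·exp(−t/τ_E), so the confinement time τ_E can be read off a log-linear fit. A minimal sketch with synthetic data (the numbers below are illustrative, not from any experiment):

```python
# Estimate the energy confinement time tau_E from the decay of stored
# energy W after the heating is turned off, assuming W(t) = W0*exp(-t/tau_E).
import numpy as np

tau_true = 0.05                          # "true" confinement time, seconds
t = np.linspace(0.0, 0.2, 50)            # time after heating turned off, s
W = 1.0 * np.exp(-t / tau_true)          # stored energy, arbitrary units

slope, _ = np.polyfit(t, np.log(W), 1)   # fit log W = -(1/tau_E)*t + const
tau_estimated = -1.0 / slope
print(f"estimated tau_E = {tau_estimated * 1e3:.1f} ms")  # ~50 ms
```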

It was during this doctoral research that Tynan was able to study the fundamental differences in the behavior of turbulence in plasma as compared to conventional fluids. Typically, when an ordinary fluid is stirred with increasing vigor, the fluid’s motion eventually becomes chaotic or turbulent. However, plasmas can act in a surprising way: when heated sufficiently strongly, confined plasmas spontaneously quench the turbulent transport at the plasma boundary.

An experiment in Germany had unexpectedly discovered this plasma behavior. While subsequent work on other experimental devices confirmed this surprising finding, all earlier experiments lacked the ability to measure the turbulence in detail.

Brian LaBombard, now a senior research scientist at MIT’s Plasma Science and Fusion Center (PSFC), was a postdoc at UCLA at the time. Under LaBombard’s direction, Tynan developed a set of Langmuir probes, which are reasonably simple diagnostics for plasma turbulence studies, to further investigate this unusual phenomenon. It formed the basis for his doctoral dissertation. “I happened to be at the right place at the right time so I could study this turbulence quenching phenomenon in much more detail than anyone else could, up until that time,” Tynan says.
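
One common way such probe measurements are reduced to a turbulence metric is the relative fluctuation level, the standard deviation of the ion saturation current divided by its mean. The sketch below uses a synthetic signal and illustrates only that generic metric, not Tynan's specific analysis:

```python
# Minimal sketch of a standard Langmuir-probe turbulence metric:
# std(I_sat) / mean(I_sat). A quenched plasma edge would show this
# relative fluctuation level dropping. The signal here is synthetic.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1e-3, 10_000)                 # 1 ms of samples
i_sat = 1.0 + 0.3 * np.sin(2 * np.pi * 50e3 * t)   # coherent 50 kHz mode
i_sat += 0.1 * rng.standard_normal(t.size)         # broadband fluctuations

fluctuation_level = i_sat.std() / i_sat.mean()
print(f"relative fluctuation level: {fluctuation_level:.2f}")
```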

As a PhD student and then postdoc, Tynan studied the phenomenon in depth, shuttling between research facilities in Germany, Princeton University’s Plasma Physics Laboratory, and UCLA.

Fusion at UCSD

After completing his doctorate and postdoctoral work, Tynan spent a few years at a startup before learning that the University of California at San Diego was launching a new fusion research group at its engineering school. When the university reached out, Tynan joined the faculty and built a research program focused on plasma turbulence and plasma-material interactions in fusion systems. Eventually, he became associate dean of engineering, and later, chair of the Department of Mechanical and Aerospace Engineering, serving in these roles for nearly a decade.

Tynan visited MIT on sabbatical in 2023, when his conversations with NSE faculty members Dennis Whyte, Zach Hartwig, and Michael Short excited him about the challenges the private sector faces in making fusion a reality. He saw opportunities to solve important problems at MIT that complemented his work at UC San Diego.

Tynan is excited to tackle what he calls “the big physics and engineering challenges of fusion plasmas” at NSE: how to remove the heat and exhaust generated by a burning plasma so that it doesn’t damage the walls of the fusion device and the plasma does not choke on the helium ash. He also hopes to explore robust engineering solutions for practical fusion energy, with a particular focus on developing better materials for use in fusion devices that will make them longer-lasting, while minimizing the production of radioactive waste.

“Ten or 15 years ago, I was somewhat pessimistic that I would ever see commercial exploitation of fusion in my lifetime,” Tynan says. But that outlook has changed, as he has seen collaborations between MIT and Commonwealth Fusion Systems (CFS) and other private-sector firms that seek to accelerate the timeline to the deployment of fusion in the real world.

In 2021, for example, MIT’s PSFC and CFS took a significant step toward commercial carbon-free power generation. They designed and built a high-temperature superconducting magnet, the strongest fusion magnet in the world.

The milestone was especially exciting because the promise of realizing the dream of fusion energy now felt closer. And being at MIT “seemed like a really quick way to get deeply connected with what’s going on in the efforts to develop fusion energy,” Tynan says.

In addition, “while on sabbatical at MIT, I saw how quickly research staff and students can capitalize on a suggestion of a new idea, and that intrigued me,” he adds.

Tynan brings a special blend of expertise to the table. In addition to extensive experience in plasma physics, he has spent considerable time on hard engineering issues such as materials. “The key is to integrate the whole thing into a workable and viable system,” Tynan says.

Q&A: David Whelihan on the challenges of operating in the Arctic

Wed, 09/17/2025 - 11:00am

To most, the Arctic can feel like an abstract place, difficult to imagine beyond images of ice and polar bears. But researcher David Whelihan of MIT Lincoln Laboratory's Advanced Undersea Systems and Technology Group is no stranger to the Arctic. Through Operation Ice Camp, a U.S. Navy–sponsored biennial mission to assess operational readiness in the Arctic region, he has traveled to this vast and remote wilderness twice over the past few years to test low-cost sensor nodes developed by the group to monitor loss in Arctic sea ice extent and thickness. The research team envisions establishing a network of such sensors across the Arctic that will persistently detect ice-fracturing events and correlate these events with environmental conditions to provide insights into why the sea ice is breaking up. Whelihan shared his perspectives on why the Arctic matters and what operating there is like.

Q: Why do we need to be able to operate in the Arctic?

A: Spanning approximately 5.5 million square miles, the Arctic is huge, and one of its salient features is that the ice covering much of the Arctic Ocean is decreasing in volume with every passing year. Melting ice opens up previously impassable areas, resulting in increasing interest from potential adversaries and allies alike for activities such as military operations, commercial shipping, and natural resource extraction. Through Alaska, the United States has approximately 1,060 miles of Arctic coastline that is becoming much more accessible because of reduced ice cover. So, U.S. operation in the Arctic is a matter of national security.  

Q: What are the technological limitations to Arctic operations?

A: The Arctic is an incredibly harsh environment. The cold kills battery life, so collecting sensor data at high rates over long periods of time is very difficult. The ice is dynamic and can easily swallow or crush sensors. In addition, most deployments involve "boots-on-the-ice," which is expensive and at times dangerous. One of the technological limitations is how to deploy sensors while keeping humans alive.

Q: How does the group's sensor node R&D work seek to support Arctic operations?

A: A lot of the work we put into our sensors pertains to deployability. Our ultimate goal is to free researchers from going onto the ice to deploy sensors. This goal will become increasingly necessary as the shrinking ice pack becomes more dynamic, unstable, and unpredictable. At the last Operation Ice Camp (OIC) in March 2024, we built and rapidly tested deployable and recoverable sensors, as well as novel concepts such as using UAVs (uncrewed aerial vehicles), or drones, as "data mules" that can fly out to and interrogate the sensors to see what they captured. We also built a prototype wearable system that cues automatic download of sensor data over Wi-Fi so that operators don't have to take off their gloves.
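
As a purely hypothetical illustration of the "data mule" idea, the drone-side logic might amount to polling each node it flies over and pulling whatever has been buffered. The node addresses, endpoint, and record format below are invented for illustration; the article does not describe the laboratory's actual interface:

```python
# Hypothetical sketch of a UAV "data mule" interrogating sensor nodes over
# Wi-Fi. Node URLs, the /records endpoint, and the JSON payload are all
# assumed for illustration only.
import requests

NODE_URLS = ["http://192.168.4.10/records", "http://192.168.4.11/records"]

def interrogate(url: str, timeout_s: float = 5.0) -> list[dict]:
    """Fetch buffered sensor records from one node; return [] if unreachable."""
    try:
        response = requests.get(url, timeout=timeout_s)
        response.raise_for_status()
        return response.json()    # e.g. [{"t": ..., "ice_event": ...}, ...]
    except requests.RequestException:
        return []                 # node asleep, frozen over, or out of range

for url in NODE_URLS:
    records = interrogate(url)
    print(f"{url}: {len(records)} records retrieved")
```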

Q: The Arctic Circle is the northernmost region on Earth. How do you reach this remote place?

A: We usually fly on commercial airlines from Boston to Seattle to Anchorage to Prudhoe Bay on the North Slope of Alaska. From there, the Navy flies us on small prop planes, like Single and Twin Otters, about 200 miles north and lands us on an ice runway built by the Navy's Arctic Submarine Lab (ASL). The runway is part of a temporary camp that ASL establishes on floating sea ice for their operational readiness exercises conducted during OIC.

Q: Think back to the first time you stepped foot in the Arctic. Can you paint a picture of what you experienced?

A: My first experience was at Prudhoe Bay, coming out of the airport, which is a corrugated metal building with a single gate. Before you open the door to the outside, a sign warns you to be on the lookout for polar bears. Walking out into the sheer desolation and blinding whiteness of everything made me realize I was experiencing something very new.

When I flew out onto the ice and stepped out of the plane, I was amazed that the area could somehow be even more desolate. Bright white snowy ice goes in every direction, broken up by pressure ridges that form when ice sheets collide. The sun is low, and seems to move horizontally only. It is very hard to tell the time. The air temperature is really variable. On our first trip in 2022, it really wasn't (relatively) that cold — only around minus 5 or 10 degrees during the day. On our second trip in 2024, we were hit by minus 30 almost every day, and with winds of 20 to 25 miles per hour. The last night we were on the ice that year, it warmed up a bit to minus 10 to 20, but the winds kicked up and started blowing snow onto the heaters attached to our tents. Those heaters started failing one by one as the blowing snow covered them, blocking airflow. After our heater failed, I asked myself, while warm in my bed, whether I wanted to go outside to the command tent for help or try to make it until dawn in my thick sleeping bag. I picked the first option, but mostly because the heater control was beeping loudly right next to my bunk, so I couldn’t sleep anyway. Shout-out to the ASL staff who ran around fixing heaters all night!

Q: How do you survive in a place generally inhospitable to humans?

A: In partnership with the native population, ASL brings a lot of gear — from insulated, heated tents and communications equipment to large snowblowers to keep the runway clear. A few months before OIC, participants attend training on the conditions they will be exposed to, how to protect themselves with appropriate clothing, and how to use survival gear in case of an emergency.

Q: Do you have plans to return to the Arctic?  

A: We are hoping to go back this winter as part of OIC 2026! We plan to test a through-ice communication device. Communicating through 4 to 12 feet of ice is pretty tricky but could allow us to connect underwater drones and stationary sensors under the ice to the rest of the world. To support the through-ice communication system, we will repurpose our sensor-node boxes deployed during OIC 2024. If this setup works, those same boxes could be used as control centers for all sorts of undersea systems and relay information about the under-ice world back home via satellite.

Q: What lessons learned will you bring to your upcoming trip, and any potential future trips?

A: After the first trip, I had a visceral understanding of how hard operating there is. Prototyping of systems becomes a different game. Prototypes are often fragile, but fragility doesn't go over too well on the ice. So, there is a robustification step, which can take some time.

On this last trip, I realized that you have to be really careful with your energy expenditure and pace yourself. While the average adult may require about 2,000 calories a day, an Arctic explorer may burn several times more than that exerting themselves (we do a lot of walking around camp) and keeping warm. Usually, we live on the same freeze-dried food that you would take on camping trips. Each package only has so many calories, so you find yourself eating several of those and supplementing with lots of snacks such as Clif Bars or, my favorite, Babybel cheeses (which I bring myself). You also have to be really careful of dehydration. Your body's reaction to extreme cold is to reduce blood flow to your skin, which generally results in less liquid in your body. We have to drink constantly — water, cocoa, and coffee — to avoid dehydration.

We only have access to the ice every two years with the Navy, so we try to make the most of our time. In the several-day lead-up to our field expedition, my research partner Ben and I were really pushing ourselves to ready our sensor nodes for deployment and probably not eating and drinking as regularly as we should. When we ventured to our sensor deployment site about 5 kilometers outside of camp, I had to learn to slow down so I didn't sweat under my gear, as sweating in the extremely cold conditions can quickly lead to hypothermia. I also learned to pay more attention to exposed places on my face, as I got a bit of frostnip around my goggles.

Operating in the Arctic is a fine balance: you can't spend too much time out there, but you also can't rush.
