MIT Latest News

MIT News is dedicated to communicating to the media and the public the news and achievements of the students, faculty, staff and the greater MIT community.

Credit where it’s due

Wed, 03/26/2025 - 12:00am

When most people buy cars, the sticker price is only part of the cost. The other part involves the loan, since folks usually borrow money for auto purchases. Therefore the interest rate, monthly payment size, and total repayment cost all matter too.

And yet, in the aggregate, people do more comparison shopping over car prices than over lenders, and they frequently settle for relatively expensive loans. What happens when the financing costs more? The answer is, people buy older cars with lower sticker prices.

“The car they’re driving right now could be a year older because of that,” says Christopher Palmer PhD ’14, an associate professor of finance at the MIT Sloan School of Management, who helped discover this phenomenon through a study examining millions of U.S. car loans. That research is like much of Palmer’s work: grounded in hard data and shining new light on issues, even familiar ones, about personal money management.

“I study household financial decision-making,” Palmer says. “Both how households make decisions and how those decisions are influenced by external factors. That covers a lot of things.”

It sure does. Palmer, often working with co-authors, has also discovered that people prefer to make monthly payments that are multiples of $100 — which can lead them to agree to worse financing terms. And since household finance includes housing, Palmer co-authored a high-profile study showing that people are far more likely to use housing vouchers and move to another neighborhood when they have a modest amount of assistance from a “navigator” who helps with the move.

But he isn’t just looking for behavioral quirks: Another Palmer study found that the Federal Reserve’s quantitative easing efforts after the financial crisis of 2008 helped cash-strapped people refinance their mortgages — though mostly those who had been able to make a down payment of 20 percent or more in the first place.

Overall, Palmer looks at big-picture economic scenarios in which people feel a financial crunch, and at consumer behavior, especially involving credit.

“If you look at whether someone can make a monthly payment, you need to understand their labor market, their expectations for the future, and more,” Palmer says. “Credit markets are interconnected to almost everything you might care about. Part of the reason I’m trying to shine a light on consumer credit markets is that they affect all kinds of human outcomes.”

For his research and teaching, Palmer earned tenure at MIT last year.

Useful intuition

Palmer grew up in the Boston area and enjoyed math in school, while always being interested in how people made financial decisions, especially about real estate. As an undergraduate at Brigham Young University, he soon recognized that he wanted to use his math skills to analyze everyday phenomena.

“I like the way you can take your intuition and have it be useful as you work through problems, along with this element of being able to observe what’s happening around you and being a listener in the world,” Palmer says.

As a student, though, that didn’t mean Palmer narrowed his interests. If anything, he saw the value in widening his studies.

“I also pretty quickly realized in college that I wanted to double major in econ and math,” Palmer says. “And that became the pipeline to get a PhD.”

After graduating from BYU, Palmer entered the doctoral program at MIT in 2008. In addition to taking classes, he immediately started working as a research assistant on a study of rent control along with professors David Autor — his eventual advisor — and Parag Pathak. That research eventually turned into a couple of high-profile papers. But while rent control is a kind of household-finance issue, the subject of household finance wasn’t really an established subdiscipline at the time.

It soon would be, however. Indeed, Palmer’s graduate-school career is almost a case study in how academic research broadens and evolves over time. Just as Palmer enrolled at MIT, the subprime-lending implosion helped trigger the financial-markets crash of 2008, and both became focal points for academic research. Suddenly the topics that had been percolating in Palmer’s mind were in pressing need of study.

“All of a sudden mortgages and household finance were front and center,” Palmer says. “That allowed me the space to write a dissertation about how distressed-income households make mortgage decisions. There was an appetite for that.”

After receiving his PhD, Palmer joined the faculty at the University of California at Berkeley, at the Haas School of Business, and then moved back to MIT in 2017.

“Household finance as a field is small, so you have to intersect it with something else if you want your question to make a difference in the world,” Palmer says. “For me, that might be macroeconomics, labor economics, corporate finance, or banking. This is partly why MIT is an amazing place to be, because it’s so easy to get exposure to all of those fields.”

Keeping a list of questions at hand

With a wide-ranging research portfolio, Palmer has to be nimble about identifying topics he can study in depth. That means looking for good data related to household finance and consumer credit, and shaping his studies around meaningful questions.

“I think a good microeconomist is always on the hunt for things,” Palmer says.

“I’ve always wanted to be question-driven,” he adds. “I try to have a list of questions in mind, so that if somebody says, ‘I have an interesting data set, what can we do with it?’ I might have ideas about what in the data we can look at.”

Take the massive study on auto loans, which arose after a co-author approached Palmer and said, more or less, that he had identified an interesting data set and was wondering what to do with it. One unresolved question was: How much do people search for the best car price or the best loan terms?

As a graduate student, Palmer recalls, “I remembered [MIT professor] Glenn Ellison once saying in class that the subject of search is a really juicy topic. Consumers face tricky decisions, and companies do not want to make it easy for people to comparison-shop. And no one had done much about search in household finance.”

So, Palmer and his colleagues based the auto-loan study partly around the search issue. The work analyzes the geographic locations of millions of buyers, and the number of lenders within a 20-minute drive of them, and examines how thoroughly consumers hunt for the best deals. The study includes credit scores, auto prices, and loan terms, illuminating the full dynamics of credit and auto purchases.

Best behavior

Some of Palmer’s work, meanwhile, takes the form of experiments. The paper he co-authored about what helps people move was one such case. It was set in Seattle, and the research team collaborated with local policymakers to construct an experiment on the subject.

It turns out that in Seattle, among people granted housing vouchers to move to new neighborhoods, the percentage actually utilizing the vouchers jumped from 15 percent to 53 percent — an eye-opening change — when they were given slightly more information and resources, and most of all a “navigator” helping with basic logistics.

Studying how people manage money means Palmer’s work yields plenty of insights in the mode of behavioral economics, the subfield that studies irrationalities — or lack thereof — in finance. Palmer thinks such findings are important, while emphasizing that he is not principally on a hunt for irrationality. Instead he always seeks to link the study of behavior to major economic and policy matters: how we borrow, what we can afford, and how we respond to economic stress.

“When a study of behavior is motivated by a tight connection to public policy, it satisfies the is-this-important hurdle right away,” Palmer says. “I’m always aiming to produce work that a large community of scholars would find important and that the broader world would find impactful.”

Women’s swimming and diving wins first NCAA Division III National Championship

Tue, 03/25/2025 - 3:00pm

The MIT women's swimming and diving team won the program's first national championship, erasing a 20-point deficit to jump ahead of New York University as the Engineers finished with 497 points at the 2025 NCAA Women's Swimming and Diving National Championships, hosted by the Old Dominion Athletic Conference March 19-22 at the Greensboro Aquatic Center in Greensboro, North Carolina.

MIT entered the event ranked as the top team in the country. Overall, MIT won three individual national titles and four relay titles. The head coach, Meg Sisson French, was named the College Swimming and Diving Coaches Association of America Women’s Swim Coach of the Year. 

On day 1 of the championships, the 400 medley relay team of senior Kate Augustyn (Eau Claire, Wisconsin), first-year Sarah Bernard (Brookline, Massachusetts), sophomore Sydney Smith (Atlanta, Georgia), and graduate student Alexandra Turvey (Vancouver, British Columbia) touched the wall first in 3:38.48, just beating the NYU team by 0.8 seconds and setting a new school record.

Day 2 highlights included Smith posting a winning time of 53.96 in the 100 fly, beating out Nicole Ranile of NYU by under a second. The 200 freestyle relay team of Turvey, Smith, sophomore Ella Roberson (Midland, Michigan) and junior Annika Naveen (Wynnewood, Pennsylvania) held off Pomona-Pitzer for the gold as Naveen brought the title home and gave the Engineers a national record time of 1:30.00. 

MIT opened day 3 with another national title, this time in the 200 medley relay. Augustyn led off, followed by Bernard and Naveen. Ella Roberson brought the title home for MIT as she completed her anchor leg in 22.02, which gave the team a combined time of 1:39.51. Roberson was able to hold off a late charge by Kenyon College, which finished second in 1:40.26 as the Engineers set another national record. Augustyn later defended her title in the 100 backstroke as she clocked in with a time of 53.41, tying her own national record. 

The final day of action saw MIT pull ahead of NYU with two more national titles. In the 200 backstroke, Augustyn held the lead through most of the event, but Sophia Verkleeren of Williams College caught up to the defending champion in the last half of the race. With just 25 yards left, Augustyn pulled away to defeat Verkleeren with a time of 1:55.85. Augustyn shaved almost 2 seconds off her preliminary time and fell just short of the national record time of 1:55.67. With the win, the Engineers pulled to within one point of NYU for the top spot. 

The Engineers sealed the overall national championship by winning their fourth relay of the championship, besting the team from NYU. Turvey set the pace with her lead-off, followed by Smith and Augustyn. Roberson, swimming the anchor leg, held off Kaley McIntyre of NYU, who earlier set the national record in the 100 freestyle, to give MIT the win with a time of 3:19.03 as the Violets took second in 3:19.36.   

Augustyn defended her title in the 200 backstroke, completing a sweep of the 100 and 200 backstroke national titles in consecutive years. She concludes her career as one of the most decorated swimmers in program history, with four individual national championships, four relay national championships, and a program-record 27 all-America honors.

A new way to make graphs more accessible to blind and low-vision readers

Tue, 03/25/2025 - 1:20pm

Bar graphs and other charts provide a simple way to communicate data, but are, by definition, difficult to translate for readers who are blind or low-vision. Designers have developed methods for converting these visuals into “tactile charts,” but guidelines for doing so are extensive (for example, the Braille Authority of North America’s 2022 guidebook is 426 pages long). The process also requires understanding different types of software, as designers often draft their chart in programs like Adobe Illustrator and then translate it into Braille using another application.

Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have now developed an approach that streamlines the design process for tactile chart designers. Their program, called “Tactile Vega-Lite,” can take data from something like an Excel spreadsheet and turn it into both a standard visual chart and a touch-based one. Design standards are hardwired as default rules within the program to help educators and designers automatically create accessible tactile charts.

The tool could make it easier for blind and low-vision readers to understand many graphics, such as a bar chart comparing minimum wages across states or a line graph tracking countries’ GDPs over time. To bring your designs into the real world, you can tweak your chart in Tactile Vega-Lite and then send its file to a Braille embosser (which renders the chart as raised, touch-readable dots).

This spring, the researchers will present Tactile Vega-Lite in a paper at the Association for Computing Machinery Conference on Human Factors in Computing Systems. According to lead author Mengzhu “Katie” Chen SM ’25, the tool strikes a balance between the precision that design professionals want for editing and the efficiency educators need to create tactile charts quickly.

“We interviewed teachers who wanted to make their lessons accessible to blind and low-vision students, and designers experienced in putting together tactile charts,” says Chen, a recent CSAIL affiliate and master's graduate in electrical engineering and computer science and the Program in System Design and Management. “Since their needs differ, we designed a program that’s easy to use, provides instant feedback when you want to make tweaks, and implements accessibility guidelines.”

Data you can feel

The researchers’ program builds on their 2017 visualization tool Vega-Lite by automatically encoding both a flat, standard chart and a tactile one. Senior author and MIT postdoc Jonathan Zong SM ’20, PhD ’24 points out that the program makes intuitive design decisions so users don’t have to.

“Tactile Vega-Lite has smart defaults to ensure proper spacing, layout, texture, and Braille conversion, following best practices to create good touch-based reading experiences,” says Zong, who is also a fellow at the Berkman Klein Center for Internet and Society at Harvard University and an incoming assistant professor at the University of Colorado. “Building on existing guidelines and our interviews with experts, the goal is for teachers or visual designers without a lot of tactile design expertise to quickly convey data in a clear way for tactile readers to explore and understand.”

Tactile Vega-Lite’s code editor allows users to customize axis labels, tick marks, and other elements. Different features within the chart are represented by abstractions — or summaries of a longer body of code — that can be modified. These shortcuts allow you to write brief phrases that tweak the design of your chart. For example, if you want to change how the bars in your graph are filled out, you could change the code in the “Texture” section from “dottedFill” to “verticalFill” to replace small circles with upward lines.
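Concretely, since Vega-Lite specifications are JSON documents, a tactile spec plausibly takes a similar form. The sketch below, written as a Python dict, is a hypothetical illustration only: the “dottedFill” and “verticalFill” texture names come from the example above, while every other field name is an assumption rather than the tool’s documented syntax.

```python
# Hypothetical Tactile Vega-Lite specification, written as a Python dict.
# Only "dottedFill"/"verticalFill" come from the article's example; all
# other field names here are illustrative assumptions, not documented syntax.
tactile_bar_chart = {
    "data": {"url": "minimum_wages.csv"},  # e.g., data exported from a spreadsheet
    "mark": "bar",
    "encoding": {
        "x": {"field": "state", "type": "nominal"},
        "y": {"field": "minimum_wage", "type": "quantitative"},
    },
    # Tactile-specific section: swapping "dottedFill" for "verticalFill"
    # replaces the bars' small raised circles with upward lines.
    "tactile": {
        "texture": "verticalFill",
        "braille": {"translateLabels": True},  # assumed option for label conversion
    },
}
```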

To understand how these abstractions work, the researchers added a gallery of examples, each pairing a phrase with the change that code produces. Still, the team is looking to refine Tactile Vega-Lite’s user interface to make it more accessible to users less familiar with coding; instead of making edits through abstractions, users could simply click buttons.

Chen says she and her colleagues are hoping to add machine-specific customizations to their program. This would allow users to preview how their tactile chart would look before it’s fabricated by an embossing machine and make edits according to the device’s specifications.

While Tactile Vega-Lite can streamline the many steps it usually takes to make a tactile chart, Zong emphasizes that it doesn’t replace an expert doing a final check-over for guideline compliance. The researchers are continuing to incorporate Braille design rules into their program, but caution that human review will likely remain the best practice.

“The ability to design tactile graphics efficiently, particularly without specialized software, is important for providing equal access of information to tactile readers,” says Stacy Fontenot, owner of Font to Dot, who wasn’t involved in the research. “Graphics that follow current guidelines and standards are beneficial for the reader as consistency is paramount, especially with complex, data-filled graphics. Tactile Vega-Lite has a straightforward interface for creating informative tactile graphics quickly and accurately, thereby reducing the design time in providing quality graphics to tactile readers.”

Chen and Zong wrote the paper with Isabella Pineros ’23, MEng ’24 and MIT Associate Professor Arvind Satyanarayan. The researchers’ work was supported by a National Science Foundation grant.

The CSAIL team also incorporated input from Rich Caloggero from MIT’s Disability and Access Services, as well as the Lighthouse for the Blind, which let them observe technical design workflows as part of the project.

Technology developed by MIT engineers makes pesticides stick to plant leaves

Tue, 03/25/2025 - 10:00am

Reducing the amount of agricultural sprays used by farmers — including fertilizers, pesticides and herbicides — could cut down the amount of polluting runoff that ends up in the environment while at the same time reducing farmers’ costs and perhaps even enhancing their productivity. A classic win-win-win.

A team of researchers at MIT and a spinoff company they launched has developed a system to do just that. Their technology adds a thin coating around droplets as they are being sprayed onto a field, greatly reducing their tendency to bounce off leaves and end up wasted on the ground. Instead, the coated droplets stick to the leaves as intended.

The research is described today in the journal Soft Matter, in a paper by recent MIT alumni Vishnu Jayaprakash PhD ’22 and Sreedath Panat PhD ’23, graduate student Simon Rufer, and MIT professor of mechanical engineering Kripa Varanasi.

A recent study found that if farmers didn’t use pesticides, they would lose 78 percent of fruit, 54 percent of vegetable, and 32 percent of cereal production. Despite their importance, a lack of technology that monitors and optimizes sprays has forced farmers to rely on personal experience and rules of thumb to decide how to apply these chemicals. As a result, these chemicals tend to be over-sprayed, leading to runoff and chemicals ending up in waterways or building up in the soil.

Pesticides take a significant toll on global health and the environment, the researchers point out. A recent study found that 31 percent of agricultural soils around the world were at high risk from pesticide pollution. And agricultural chemicals are a major expense for farmers: In the U.S., they spend $16 billion a year just on pesticides.

Making spraying more efficient is one of the best ways to make food production more sustainable and economical. Agricultural spraying essentially boils down to mixing chemicals into water and spraying water droplets onto plant leaves, which are often inherently water-repellent. “Over more than a decade of research in my lab at MIT, we have developed fundamental understandings of spraying and the interaction between droplets and plants — studying when they bounce and all the ways we have to make them stick better and enhance coverage,” Varanasi says.

The team had previously found a way to reduce the amount of sprayed liquid that bounces away from the leaves it strikes, which involved using two spray nozzles instead of one and spraying mixtures with opposite electrical charges. But they found that farmers were reluctant to take on the expense and effort of converting their spraying equipment to a two-nozzle system. So, the team looked for a simpler alternative.

They discovered they could achieve the same improvement in droplet retention using a single-nozzle system that can be easily adapted to existing sprayers. Instead of giving the droplets of pesticide an electric charge, they coat each droplet with a vanishingly thin layer of an oily material.

In their new study, they conducted lab experiments with high-speed cameras. When they sprayed droplets with no special treatment onto a water-repelling (hydrophobic) surface similar to that of many plant leaves, the droplets initially spread out into a pancake-like disk, then rebounded back into a ball and bounced away. But when the researchers coated the surface of the droplets with a tiny amount of oil — making up less than 1 percent of the droplet’s liquid — the droplets spread out and then stayed put. The treatment improved the droplets’ “stickiness” by as much as a hundredfold.

“When these droplets are hitting the surface and as they expand, they form this oil ring that essentially pins the droplet to the surface,” Rufer says. The researchers tried a wide variety of conditions, he says, explaining that they conducted hundreds of experiments, “with different impact velocities, different droplet sizes, different angles of inclination, all the things that fully characterize this phenomenon.” Though different oils varied in their effectiveness, all of them were effective. “Regardless of the impact velocity and the oils, we saw that the rebound height was significantly lower,” he says.

The effect works with remarkably small amounts of oil. In their initial tests the researchers used oil equal to 1 percent of the water, then tried 0.1 percent, and even 0.01 percent. The improvement in how well droplets stuck to the surface persisted at 0.1 percent but began to break down beyond that. “Basically, this oil film acts as a way to trap that droplet on the surface, because oil is very attracted to the surface and sort of holds the water in place,” Rufer says.

In the researchers’ initial tests they used soybean oil for the coating, figuring this would be a familiar material for the farmers they were working with, many of whom were growing soybeans. But it turned out that though they were producing the beans, the oil was not part of their usual supply chain for use on the farm. In further tests, the researchers found that several chemicals that farmers were already routinely using in their spraying, called surfactants and adjuvants, could be used instead, and that some of these provided the same benefits in keeping the droplets stuck on the leaves.

“That way,” Varanasi says, “we’re not introducing a new chemical or changed chemistries into their field, but they’re using things they’ve known for a long time.”

Varanasi and Jayaprakash formed a company called AgZen to commercialize the system. In order to prove how much their coating system improves the amount of spray that stays on the plant, they first had to develop a system to monitor spraying in real time. That system, which they call RealCoverage, has been deployed on farms ranging in size from a few dozen acres to hundreds of thousands of acres, and on many different crop types, and has saved farmers 30 to 50 percent on their pesticide expenditures, just by improving the controls on the existing sprays. That system is being deployed across 920,000 acres of crops in 2025, the company says, including some in California, Texas, the Midwest, France, and Italy. Adding the cloaking system using new nozzles, the researchers say, should yield at least another doubling of efficiency.

“You could give back a billion dollars to U.S. growers if you just saved 6 percent of their pesticide budget,” says Jayaprakash, lead author of the research paper and CEO of AgZen. “In the lab we got 300 percent more product on the plant. So that means we could get orders of magnitude reductions in the amount of pesticides that farmers are spraying.”

Farmers had already been using these surfactant and adjuvant chemicals to enhance spraying effectiveness, but they were mixing them into a water solution. For them to have any effect, farmers had to use much more of these materials, risking burns to the plants. The new coating system reduces the amount of these materials needed, while improving their effectiveness.

In field tests conducted by AgZen, “we doubled the amount of product on kale and soybeans just by changing where the adjuvant was,” from mixed in to being a coating, Jayaprakash says. It’s convenient for farmers because “all they’re doing is changing their nozzle. They’re getting all their existing chemicals to work better, and they’re getting more product on the plant.”

And it’s not just for pesticides. “The really cool thing is this is useful for every chemistry that’s going on the leaf, be it an insecticide, a herbicide, a fungicide, or foliar nutrition,” Varanasi says. This year, they plan to introduce the new spray system on about 30,000 acres of cropland.

Varanasi says that with projected world population growth, “the amount of food production has got to double, and we are limited in so many resources, for example we cannot double the arable land. … This means that every acre we currently farm must become more efficient and able to do more with less.” These improved spraying technologies, for both monitoring the spraying and coating the droplets, are, Varanasi says, “fundamentally changing agriculture.”

AgZen has recently raised $10 million in venture financing to support rapid commercial deployment of these technologies that can improve the control of chemical inputs into agriculture. “The knowledge we are gathering from every leaf, combined with our expertise in interfacial science and fluid mechanics, is giving us unparalleled insights into how chemicals are used and developed — and it’s clear that we can deliver value across the entire agrochemical supply chain,” Varanasi says. “Our mission is to use these technologies to deliver improved outcomes and reduced costs for the ag industry.”

Decoding a medieval mystery manuscript

Tue, 03/25/2025 - 12:00am

Two years ago, MIT professor of literature Arthur Bahr had one of the best days of his life. Sitting in the British Library, he was allowed to page through the Pearl-Manuscript, a singular bound volume from the 1300s containing the earliest versions of the masterly medieval poem “Pearl,” the famous tale “Sir Gawain and the Green Knight,” and two other poems.

Today, “Sir Gawain and the Green Knight” is commonly read in high school English classes. But like the other works in the same volume, it probably would have been lost to history had the Pearl-Manuscript not survived. As it stands, no one knows who authored these texts. But one thing is clear: the surviving manuscript is a carefully crafted volume, with bespoke illustrations and skilled use of parchment. This book is its own work of art.

“The Pearl-Manuscript is just as extraordinary and unusual and unexpected as the poems it contains,” Bahr says of the document, whose formal name is “British Library MS Cotton Nero A X/2.”

Bahr explores these ideas in a new book, “Chasing the Pearl-Manuscript: Speculation, Shapes, Delight,” published this month by the University of Chicago Press. In it, Bahr combines his deep knowledge of the volume’s texts with detailed examination of its physical qualities — thanks to technologies such as spectroscopy, which has revealed some manuscript secrets, as well as the good, old-fashioned scrutiny Bahr gave the book in person.

“My argument is that this physical object adds up to more than the sum of its parts, through its creative interplay of text, image, and materials,” Bahr says. “It is a coherent volume that evokes the concerns of the poems themselves. Most manuscripts are constructed in utilitarian ways, but not this one.”

Ode to the most beautiful poem

Bahr first encountered “Pearl” as an undergraduate at Amherst College, in a course taught by medievalist Howell D. Chickering. The poem is an intricate examination of Christian ethics; a father, whose daughter has died, dreams he is discussing the meaning of life with her.

“It is the most beautiful poem I have ever read,” Bahr says. “It blew me away, for its formal complexity, and for the really poignant human drama.” He adds: “It’s in some sense why I’m a medievalist.”

And since Bahr’s first book, “Fragments and Assemblages,” studies how medieval bound volumes were often collections of disparate documents, it was natural for him to apply this scholarly lens to the Pearl-Manuscript as well.

Most scholars think the Pearl-Manuscript has a single author — although we cannot be certain. After beginning with “Pearl,” the manuscript follows with two other poems, “Cleanness” and “Patience.” Closing the volume, “Sir Gawain and the Green Knight” is an eerie, surreal tale of courage and chivalry set in the (possibly fictional) court of King Arthur.

In the book, Bahr finds the four texts to be thematically linked, analyzing the “connective tissue” through which the “manuscript starts to cohere into a wrought, imperfect, temporally layered whole,” as he writes. Some of these links are broad, including recurring “challenges to our speculative faculties”; the works are full of seeming paradoxes and dreamscapes that test the reader’s interpretive capacity.

There are other ways the texts seem aligned. “Pearl” and “Sir Gawain and the Green Knight” each have 101 stanzas. The texts have numerically consistent structures, in the case of “Pearl” based around the number 12. All but one of its stanzas have 12 lines (and Bahr suspects this imperfection is intentional, like a fine rug with a deliberate flaw, as may also be true of the “extra” 101st stanza). There are 36 lines per page. And from examining the manuscript in person, Bahr found 48 places with decorated initials, though we do not know who added them.

“The more you look, the more you find,” Bahr says.

Materiality matters

Some of our knowledge about the Pearl-Manuscript is quite new: Spectroscopy has revealed that the volume originally had simple line drawings, which were later filled in with colored ink.

But there is no substitute for reading books in person. That took Bahr to London in 2023, where he was permitted an extended look at the Pearl-Manuscript in the flesh. Far from being a formality, that gave Bahr new insights.

For instance: The Pearl-Manuscript is written on parchment, which is animal skin. At a key point in the “Patience” poem, a reworking of the tale of Jonah and the whale, the parchment has been reversed, so that the “hair” side of the material faces up, rather than the “flesh” side; it is the only case of this in the manuscript.

“When you’re reading about Jonah being swallowed by the whale, you feel the hair follicles when you wouldn’t expect to,” Bahr says. “At precisely the moment when the poem is thematizing an unnatural reversal of inside and outside, you are feeling the other side of another animal.”

He adds: “The act of touching the Pearl-Manuscript really changed how I think this poem would have worked for the medieval reader.” In this vein, he says, “Materiality matters. Screens are enabling, and without the digital facsimile I could not have written this book, but they cannot ever replace the original. The ‘Patience’ chapter reinforces that.”

Ultimately, Bahr thinks the Pearl-Manuscript buttresses his view in the “Fragments and Assemblages” book, that the medieval reading experience was often bound up with the way volumes were physically constructed.

“My argument in ‘Fragments and Assemblages’ was that medieval readers and book constructors thought in a serious and often sophisticated way about how the material construction and the selection of the texts into a physical object made a difference — mattered — and had the potential to change the meanings of the texts,” he says.

Good grade on the group project

“Chasing the Pearl-Manuscript” has received praise from other scholars. Jessica Brantley, professor and chair of the English Department at Yale University, has said that Bahr “offers an adventurous multilayered reading of both text and book and provides an important reinterpretation of the codex and its poems.”

Daniel Wakelin of Oxford University has said that Bahr “sets out an authoritative reading of these poems” and presents “a bold model for studying material texts and literary works together.”

For his part, Bahr hopes to appeal to an array of readers, just as his courses on medieval literature appeal to students with an array of intellectual interests. In the making of his book, Bahr also credits two MIT students, Kelsey Glover and Madison Sneve, who helped the project through the Undergraduate Research Opportunities Program (UROP), studying the illustrations and distinctive manuscript markings, among other things.

“It’s a very MIT kind of poem in the sense that not only is the author, or authors, obsessed with math and geometry and numbers and proportion, they are also obsessed with artifact construction, with architectural details and physical craft,” Bahr says. “There’s a very ‘mens et manus’ quality to the poems that’s reflected in the manuscript,” he says, referring to MIT’s motto, “mind and hand.” “I think it helps explain why these extraordinary MIT students helped me so much.”

Scene at MIT: Artfinity brings artistic celebration to campus

Tue, 03/25/2025 - 12:00am

The MIT campus came alive with artistic energy on March 13 as Artfinity — the Institute's new festival celebrating creativity and community — took over multiple venues with interactive experiences, exhibitions, and performances.

Artfinity participants created their own paths through interconnected artistic encounters across campus, exploring everything from augmented reality (AR) experiences in the Infinite Corridor to innovative musical performances at the Media Lab. The events were designed to build upon each other, allowing visitors to flow naturally between locations while experiencing a range of creative expressions.

Daytime offerings included several exhibitions: Coloring with Wide Tim at the Welcome Center; “Golden Cargo: Conquest of the Tropics” at the ACT Gallery, examining the complex history of the United Fruit Company; two exhibitions at the List Visual Arts Center — “List Projects 31: Kite” and “Pedro Gómez-Egaña: The Great Learning”; and “Mission Control” at the Media Lab. Throughout the day, the “Layers of Place” AR experience revealed hidden histories and perspectives on the pillars of Building 7, “The Alchemist” sculpture, and the Infinite Corridor.

The MIT Museum served as the hub for the evening with its After Dark series, featuring a talk on technology in art by the Media Lab’s Critical Matter group director and award-winning designer Behnaz Farahi (whose large projection on MIT's dome, “Gaze to the Stars,” was on view later that evening), alongside galleries showcasing faculty works, including Rania Ghosn's “Cosmograph,” Azra Akšamija's “Hallucinating Traditions,” and other new exhibitions featuring work from the Media Lab. Throughout the museum, visitors engaged with interactive activities ranging from flash portrait sessions to textile design.

As evening progressed, the campus transformed with performances and installations. The Media Lab hosted Moving Music, premiering two unusual works: “Here...NOW” by Ana Schon and “MAICE” by Tod Machover, a new piece for renowned marimba player Ji Hye Jung. Large-scale projections also illuminated campus buildings, including “Creative Lumens,” where students transformed the exteriors of the new Linde Music Building, the MIT Chapel, and Zesiger Center with vibrant projections.

Additional events that evening included Argus Installation, exploring the interplay of light and hand-blown glass at the MIT Museum Studio; the Welcome Center's speed networking for artists and creatives followed by All Our Relations, where MIT's Indigenous community brought native and non-native people together for song, dance, and story; and a film screening at the Open Space Screen, offering a behind-the-scenes look at Laura Anderson Barbata's “Intervention: Ocean Blues.”

Attendance topped 1,000 on campus that evening, with many more viewing the large-scale art projections as passersby. Artfinity continues through May 2; by its close, it will have featured more than 80 free performing and visual arts events celebrating creativity and community at MIT.

Basketball analytics investment is key to NBA wins and other successes

Tue, 03/25/2025 - 12:00am

If you filled out a March Madness bracket this month, you probably faced the same question with each college match-up: What gives one team an edge over another? Is it a team’s record through the regular season? Or the chemistry among its players? Maybe it’s the experience of its coaching staff or the buzz around a top scorer.

All of these factors play some role in a team’s chance to advance. But according to a new study by MIT researchers, there’s one team member who consistently boosts a team’s performance: the data analyst.

The new study, which was published this month in the Journal of Sports Economics, quantifies the influence of basketball analytics investment on team performance. The study’s authors looked in particular at professional basketball and compared the investment in data analytics on each NBA team with the team’s record of wins over 12 seasons. They found that indeed, teams that hired more analytics staff, and invested more in data analysis in general, tended to win more games.

Analytics department headcount had a positive and statistically significant effect on team wins even when accounting for other factors such as a team’s roster salary, the experience and chemistry among its players, the consistency of its coaching staff, and player injuries through each season. Even with all of these influences, the researchers found that the depth of a team’s data analytics bench, so to speak, was a consistent predictor of the team’s wins.

What’s more, they were able to quantify the value of basketball analytics, based on its impact on team wins. They found that a team gains one additional win in a season for roughly every four-fifths of a data analyst it employs. Interestingly, a team can also gain one additional win by increasing its roster salary by $9.6 million. One way to read this is that one data analyst’s impact is worth at least $9 million.

“I don’t know of any analyst who’s being paid $9 million,” says study author Henry Wang, a graduate student in the MIT Sports Lab. “There is still a gap between how the player is being valued and how the analytics are being valued.”

While the study focuses on professional basketball, the researchers say the findings are relevant beyond the NBA. They speculate that college teams that make use of data analytics may have an edge over those who don’t. (Take note, March Madness fans.) And the same likely goes for sports in general, along with any competitive field.

“This paper hits nicely not just in sports but beyond, with this question of: What is the tangible impact of big data analytics?” says co-author Arnab Sarker PhD ’25, a recent doctoral graduate of MIT’s Institute for Data, Systems and Society (IDSS). “Sports are a really nice, controlled place for analytics. But we’re also curious to what extent we can see these effects in general organizational performance.”

The study is also co-authored by Anette “Peko” Hosoi, the Pappalardo Professor of Mechanical Engineering at MIT.

Data return

Across the sports world, data analysts have grown in number and scope over the years. Sports analytics’ role in using data and stats to improve team performance was popularized in 2011 with the movie “Moneyball,” based on the 2003 book “Moneyball: The Art of Winning an Unfair Game” by Michael Lewis, who chronicled the 2002 Oakland Athletics and general manager Billy Beane’s use of baseball analytics to win games against wealthier Major League Baseball teams.

Since then, data analysis has expanded to many other sports, in an effort to make use of the varied and fast-paced sources of data, measurements, and statistics available today. In basketball, analysts can take on many roles, using data, for instance, to optimize a player’s health and minimize injury risk, and to predict a player’s performance to inform draft selection, free agency acquisition, and contract negotiations.

A data analyst’s work can also influence in-game strategy. Case in point: Over the last decade, NBA teams have strategically chosen to shift to shooting longer-range three-pointers, since Philadelphia 76ers President of Basketball Operations Daryl Morey SM ’00 determined that statistically, shooting more three-pointers wins more games. Today, each of the 30 NBA teams employs at least one basketball analytics staffer. And yet, while a data analyst’s job is entirely based on data, there is not much data on the impact of analysts themselves.

“Teams and leagues are spending millions of dollars on embracing analytical tools without a real sense of return-on-investment,” Wang notes.

Numbers value

The MIT researchers aimed in their new study to quantify the influence of NBA team analysts, specifically on winning games. To do so, they looked to major sources of sports data such as ESPN.com and NBAstuffer.com, a website that hosts a huge trove of stats on NBA games and teams, including hired basketball analytics staff, which the site’s managers compile from publicly available sources such as official team press releases and staff directories, LinkedIn and X profiles, and news and industry reports.

For their new study, Wang and his colleagues gathered data on each of the 30 NBA teams, over a period from 2009 to 2023, 2009 being the year that NBAstuffer.com started compiling team data. For every team in each season during this period, the researchers recorded an “analyst headcount,” meaning the number of basketball operations analytics staff employed by a team. They counted as analysts those with technical titles such as data analyst, software engineer, sports scientist, and director of research, but also staff members who aren’t formally analysts yet are known to be particularly active in the basketball analytics community. They found that in 2009, a total of 10 data analysts were working across the NBA. By 2023, that number had ballooned to 132, with some teams employing more analysts than others.

“What we’re trying to measure is a team’s level of investment in basketball analytics,” Wang explains. “The best measure would be if every team told us exactly how much money they spent every year on their R&D and data infrastructure and analysts. But they’re not going to do that. So headcount is the next best thing.”

In addition to analytics headcount, the researchers also compiled data on other win-influencing variables, such as roster salary (Does a higher-paid team win more games?), roster experience (Does a team with more veterans win more games?), consistent coaching (Did a new coach shake up a team’s win record?) and season injuries (How did a team’s injuries affect its wins?). The researchers also noted “road back-to-backs,” or the number of times a team had to play consecutive away games (Does the wear and tear of constant travel impact wins?).

The researchers plugged all this data into a “two-way fixed effects” model to estimate the relative effect that each variable has on the number of additional games a team can win in a season.
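The article does not give the study’s exact specification, but a two-way fixed-effects regression of this kind is straightforward to sketch. In the Python snippet below, the file name and column names (analyst_headcount, roster_salary, and so on) are illustrative assumptions based on the variables described above, not the authors’ actual code.

```python
# Minimal sketch of a two-way fixed-effects model of team wins, assuming
# one row per team-season. Column names are illustrative, not the study's.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nba_team_seasons.csv")  # hypothetical panel, 2009-2023

# C(team) and C(season) are the "two-way" fixed effects: they absorb stable
# team-level differences and league-wide season shocks, so the remaining
# coefficients reflect within-team, within-season variation.
model = smf.ols(
    "wins ~ analyst_headcount + roster_salary + roster_experience"
    " + new_coach + injuries + road_back_to_backs"
    " + C(team) + C(season)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["team"]})

print(model.params["analyst_headcount"])  # estimated wins added per analyst
```

Under the numbers reported above, that headcount coefficient would come out near 1.25 wins per analyst (one win per four-fifths of an analyst), against roughly one win per $9.6 million of roster salary.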

“The model learns all these effects, so we can see, for instance, the tradeoff between analyst and roster salary when contributing to win total,” Wang explains.

Their finding that teams with a higher analytics headcount tended to win more games wasn’t entirely surprising.

“We’re still at a point where the analyst is undervalued,” Wang says. “There probably is a sweet spot, in terms of headcount and wins. You can’t hire 100 analysts and expect to go 82-and-0 next season. But right now a lot of teams are still below that sweet spot, and this competitive advantage that analytics offers has yet to be fully harvested.”

Mathematicians uncover the logic behind how people walk in crowds

Mon, 03/24/2025 - 3:00pm

Next time you cross a crowded plaza, crosswalk, or airport concourse, take note of the pedestrian flow. Are people walking in orderly lanes, single-file, to their respective destinations? Or is it a haphazard tangle of personal trajectories, as people dodge and weave through the crowd?

MIT instructor Karol Bacik and his colleagues studied the flow of human crowds and developed a first-of-its-kind way to predict when pedestrian paths will transition from orderly to entangled. Their findings may help inform the design of public spaces that promote safe and efficient thoroughfares.

In a paper appearing this week in the Proceedings of the National Academy of Sciences, the researchers consider a common scenario in which pedestrians navigate a busy crosswalk. The team analyzed the scenario through mathematical analysis and simulations, considering the many angles at which individuals may cross and the dodging maneuvers they may make as they attempt to reach their destinations while avoiding bumping into other pedestrians along the way.

The researchers also carried out controlled crowd experiments and studied how real participants walked through a crowd to reach certain locations. Through their mathematical and experimental work, the team identified a key measure that determines whether pedestrian traffic is ordered, such that clear lanes form in the flow, or disordered, in which there are no discernible paths through the crowd. Called “angular spread,” this parameter describes how widely the walking directions within a crowd vary.

If a crowd has a relatively small angular spread, this means that most pedestrians walk in opposite directions and meet the oncoming traffic head-on, such as in a crosswalk. In this case, more orderly, lane-like traffic is likely. If, however, a crowd has a larger angular spread, such as in a concourse, it means there are many more directions that pedestrians can take to cross, with more chance for disorder.

In fact, the researchers calculated the point at which a moving crowd can transition from order to disorder. That point, they found, was an angular spread of around 13 degrees: if the average pedestrian veers off at an angle larger than 13 degrees rather than walking straight across, the crowd can tip into disordered flow.
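As a rough illustration, that criterion is simple to compute. The article does not give the paper’s precise definition of angular spread, so the Python sketch below assumes it is the mean deviation of pedestrian headings from the crossing axis, with oncoming walkers folded onto the same axis; the function name and the synthetic data are illustrative.

```python
# Toy order/disorder check for a crosswalk, assuming "angular spread" means
# the mean absolute deviation of headings from the crossing axis (degrees).
import numpy as np

def angular_spread(headings_deg: np.ndarray) -> float:
    # Walking at 0 or 180 degrees is "straight across" from either side,
    # so measure each heading against the nearer of the two directions.
    folded = np.abs(headings_deg) % 180
    return float(np.minimum(folded, 180 - folded).mean())

CRITICAL_SPREAD_DEG = 13.0  # transition point reported by the researchers

rng = np.random.default_rng(0)
headings = rng.normal(loc=0.0, scale=10.0, size=500)  # mostly straight walkers
spread = angular_spread(headings)
print("lanes likely" if spread < CRITICAL_SPREAD_DEG else "disorder likely")
```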

“This all is very commonsense,” says Bacik, who is an instructor of applied mathematics at MIT. “The question is whether we can tackle it precisely and mathematically, and where the transition is. Now we have a way to quantify when to expect lanes — this spontaneous, organized, safe flow — versus disordered, less efficient, potentially more dangerous flow.”

The study’s co-authors include Grzegorz Sobota and Bogdan Bacik of the Academy of Physical Education in Katowice, Poland, and Tim Rogers at the University of Bath in the United Kingdom.

Right, left, center

Bacik, who is trained in fluid dynamics and granular flow, came to study pedestrian flow during 2021, when he and his collaborators looked into the impacts of social distancing, and ways in which people might walk among each other while maintaining safe distances. That work inspired them to look more generally into the dynamics of crowd flow.

In 2023, he and his collaborators explored “lane formation,” a phenomenon by which particles, grains, and, yes, people have been observed to spontaneously form lanes, moving in single-file when forced to cross a region from two opposite directions. In that work, the team identified the mechanism by which such lanes form, which Bacik sums up as “an imbalance of turning left versus right.” Essentially, they found that as soon as something in a crowd starts to look like a lane, individuals around that fledgling lane either join it or are forced to one side of it, walking parallel to the original lane in paths that others can follow. In this way, a crowd can spontaneously organize into regular, structured lanes.

“Now we’re asking, how robust is this mechanism?” Bacik says. “Does it only work in this very idealized situation, or can lane formation tolerate some imperfections, such as some people not going perfectly straight, as they might do in a crowd?”

Lane change

For their new study, the team looked to identify a key transition in crowd flow: When do pedestrians switch from orderly, lane-like traffic, to less organized, messy flow? The researchers first probed the question mathematically, with an equation that is typically used to describe fluid flow, in terms of the average motion of many individual molecules.

“If you think about the whole crowd flowing, rather than individuals, you can use fluid-like descriptions,” Bacik explains. “It’s this art of averaging, where, even if some people may cross more assertively than others, these effects are likely to average out in a sufficiently large crowd. If you only care about the global characteristics like, are there lanes or not, then you can make predictions without detailed knowledge of everyone in the crowd.”

Bacik and his colleagues used equations of fluid flow, and applied them to the scenario of pedestrians flowing across a crosswalk. The team tweaked certain parameters in the equation, such as the width of the fluid channel (in this case, the crosswalk), and the angle at which molecules (or people) flowed across, along with various directions that people can “dodge,” or move around each other to avoid colliding.

Based on these calculations, the researchers found that pedestrians in a crosswalk are more likely to form lanes, when they walk relatively straight across, from opposite directions. This order largely holds until people start veering across at more extreme angles. Then, the equation predicts that the pedestrian flow is likely to be disordered, with few to no lanes forming.

The researchers were curious to see whether the math bears out in reality. For this, they carried out experiments in a gymnasium, where they recorded the movements of pedestrians using an overhead camera. Each volunteer wore a paper hat, depicting a unique barcode that the overhead camera could track.

In their experiments, the team assigned volunteers various start and end positions along opposite sides of a simulated crosswalk, and tasked them with simultaneously walking across the crosswalk to their target location without bumping into anyone. They repeated the experiment many times, each time having volunteers assume different start and end positions. In the end, the researchers were able to gather visual data of multiple crowd flows, with pedestrians taking many different crossing angles.

When they analyzed the data and noted when lanes spontaneously formed, and when they did not, the team found that, much like the equation predicted, the angular spread mattered. Their experiments confirmed that the transition from ordered to disordered flow occurred somewhere around the theoretically predicted 13 degrees. That is, if an average person veered more than 13 degrees away from straight ahead, the pedestrian flow could tip into disorder, with little lane formation. What’s more, they found that the more disorder there is in a crowd, the less efficiently it moves.

The team plans to test their predictions on real-world crowds and pedestrian thoroughfares.

“We would like to analyze footage and compare that with our theory,” Bacik says. “And we can imagine that, for anyone designing a public space, if they want to have a safe and efficient pedestrian flow, our work could provide a simpler guideline, or some rules of thumb.”

This work is supported, in part, by the Engineering and Physical Sciences Research Council of UK Research and Innovation.

Biogen to consolidate operations in MIT’s first Kendall Common building

Mon, 03/24/2025 - 7:30am

Over the course of nearly five decades, Biogen has played a major role in catalyzing and shaping Kendall Square in Cambridge, Massachusetts, now heralded as the “most innovative square mile on the planet.” Today, Biogen announced its decision to centralize operations in a new facility at 75 Broadway in MIT’s Kendall Common development. The move, which will take place in 2028, highlights the company’s commitment to Cambridge and the regional innovation ecosystem — a wellspring of biomedical advances.

“It’s fitting that Biogen — a company with such close ties to people at MIT — will make Kendall Common’s first building its new home,” says MIT President Sally Kornbluth. “The motto of Kendall Square might as well be ‘talent in proximity’ and Biogen’s decision to intensify its presence here promises great things for the whole ecosystem. To achieve this milestone on the occasion of the company’s 50th anniversary is especially meaningful. We are grateful to Chris Viehbacher, president and chief executive officer of Biogen, for his keen vision of the future and his ongoing commitment to Cambridge and Kendall Square.”

The approximately 580,000-square-foot facility will integrate Biogen’s research and development teams together with its global and North American commercialization organizations. The building will incorporate advanced conservation, efficiency, and sustainable design elements.

“Biogen’s story in Kendall Square is unlike any other,” says Anantha Chandrakasan, MIT’s chief innovation and strategy officer. “Institute Professor Phil Sharp’s early work in genetics and molecular biology and his co-founding of Biogen in 1978 set life sciences on a bold trajectory in the region — and in the world. MIT’s intertwined history with Biogen has benefited society through significant research advancements — from classroom and lab to market — in treating multiple sclerosis, Parkinson’s disease, and other neuromuscular disorders. I’m so pleased that our fruitful partnership will continue.”

The new building, designed by Elkus-Manfredi Architects, will activate the corner at 75 Broadway, and protect and accentuate the abutting 6th Street Walkway — a favorite tree-lined path for residents and Kendall employees alike. A joint venture partnership between the MIT Investment Management Company and BioMed Realty, a Blackstone Real Estate portfolio company, is facilitating advancement of the project.

“Helping to ensure that Biogen stays in Cambridge was very important to us,” says Patrick Rowe, senior vice president in MIT’s real estate group, which is part of the Institute’s investment management company. “The company’s nearly 50-year history is a foundational component of the Kendall Square innovation ecosystem.”

“We are thrilled to partner with MIT in the development and activation of this world-class lab and office asset in the heart of Kendall Square,” says Bill Kane, BioMed Realty’s president of East Coast and U.K. markets. “75 Broadway will provide mission-critical infrastructure to Biogen that enables the development of the next generation of life-saving medicines and therapies.”

Ultimately, the 10-acre Kendall Common development will include eight buildings for residential, office, lab, retail, and community uses. The project’s 10-year review process and federal agreement led to the recent opening of the MIT-built John A. Volpe National Transportation Systems Center.

MIT scientists engineer starfish cells to shape-shift in response to light

Mon, 03/24/2025 - 6:00am

Life takes shape with the motion of a single cell. In response to signals from certain proteins and enzymes, a cell can start to move and shake, leading to contractions that cause it to squeeze, pinch, and eventually divide. As daughter cells follow suit down the generational line, they grow, differentiate, and ultimately arrange themselves into a fully formed organism.

Now MIT scientists have used light to control how a single cell jiggles and moves during its earliest stage of development. The team studied the motion of egg cells produced by starfish — an organism that scientists have long used as a classic model for understanding cell growth and development.

The researchers focused on a key enzyme that triggers a cascade of motion within a starfish egg cell. They genetically engineered a light-sensitive version of the enzyme, which they injected into egg cells, and then stimulated the cells with different patterns of light.

They found that the light successfully triggered the enzyme, which in turn prompted the cells to jiggle and move in predictable patterns. For instance, the scientists could stimulate cells to exhibit small pinches or sweeping contractions, depending on the pattern of light they applied. They could even shine light at specific points around a cell to stretch its shape from a circle to a square.

Their results, appearing today in the journal Nature Physics, provide scientists with a new optical tool for controlling cell shape in its earliest developmental stages. Such a tool, they envision, could guide the design of synthetic cells, such as therapeutic “patch” cells that contract in response to light signals to help close wounds, or drug-delivering “carrier” cells that release their contents only when illuminated at specific locations in the body. Overall, the researchers see their findings as a new way to probe how life takes shape from a single cell.

“By revealing how a light-activated switch can reshape cells in real time, we’re uncovering basic design principles for how living systems self-organize and evolve shape,” says the study’s senior author, Nikta Fakhri, associate professor of physics at MIT. “The power of these tools is that they are guiding us to decode all these processes of growth and development, to help us understand how nature does it.”

The study’s MIT authors include first author Jinghui Liu, Yu-Chen Chao, and Tzer Han Tan; along with Tom Burkart, Alexander Ziepke, and Erwin Frey of Ludwig Maximilian University of Munich; John Reinhard of Saarland University; and S. Zachary Swartz of the Whitehead Institute for Biomedical Research.

Cell circuitry

Fakhri’s group at MIT studies the physical dynamics that drive cell growth and development. She is particularly interested in symmetry, and the processes that govern how cells follow or break symmetry as they grow and divide. The five-limbed starfish, she says, is an ideal organism for exploring such questions of growth, symmetry, and early development.

“A starfish is a fascinating system because it starts with a symmetrical cell and becomes a bilaterally symmetric larva at early stages, and then develops into pentameral adult symmetry,” Fakhri says. “So there’s all these signaling processes that happen along the way to tell the cell how it needs to organize.”

Scientists have long studied the starfish and its various stages of development. Among many revelations, researchers have discovered a key “circuitry” within a starfish egg cell that controls its motion and shape. This circuitry involves an enzyme, GEF, that naturally circulates in a cell’s cytoplasm. When this enzyme is activated, it induces a change in a protein, called Rho, that is known to be essential for regulating cell mechanics.

When the GEF enzyme stimulates Rho, it causes the protein to switch from an essentially free-floating state to a state that binds the protein to the cell’s membrane. In this membrane-bound state, the protein then triggers the growth of microscopic, muscle-like fibers that thread out across the membrane and subsequently twitch, enabling the cell to contract and move. 

In previous work, Fakhri’s group showed that a cell’s movements can be manipulated by varying the cell’s concentrations of GEF enzyme: The more enzyme they introduced into a cell, the more contractions the cell would exhibit.

“This whole idea made us think whether it’s possible to hack this circuitry, to not just change a cell’s pattern of movements but get a desired mechanical response,” Fakhri says.

Lights and action

To precisely manipulate a cell’s movements, the team looked to optogenetics — an approach in which cells and cellular components, such as proteins and enzymes, are genetically engineered to activate in response to light.

Using established optogenetic techniques, the researchers developed a light-sensitive version of the GEF enzyme, then produced the mRNA that encodes it — essentially, the genetic blueprint for building the enzyme. They injected this blueprint into egg cells that the team harvested from a single starfish ovary, which can hold millions of unfertilized cells. The cells, infused with the new mRNA, then began to produce light-sensitive GEF enzymes on their own.

In experiments, the researchers then placed each enzyme-infused egg cell under a microscope and shone light onto the cell in different patterns and from different points along the cell’s periphery. They took videos of the cell’s movements in response.

They found that when they aimed the light at specific points, the GEF enzyme became activated and recruited Rho protein to the light-targeted sites. There, the protein set off its characteristic cascade of muscle-like fibers that pulled or pinched the cell at the same light-stimulated spots. Much like pulling the strings of a marionette, they were able to control the cell’s movements, for instance directing it to morph into various shapes, including a square.

Surprisingly, they also found they could stimulate the cell to undergo sweeping contractions by shining light on just a single spot, provided the stimulus recruited enough enzyme to exceed a certain threshold concentration.

“We realized this Rho-GEF circuitry is an excitable system, where a small, well-timed stimulus can trigger a large, all-or-nothing response,” Fakhri says. “So we can either illuminate the whole cell, or just a tiny place on the cell, such that enough enzyme is recruited to that region so the system gets kickstarted to contract or pinch on its own.”
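
This kind of all-or-nothing threshold behavior is the hallmark of an excitable system, and it can be captured with very simple mathematics. The sketch below uses the classic FitzHugh-Nagumo equations (a generic model of excitable dynamics, standing in as an illustration, not the study's actual reaction scheme) to show how a brief stimulus either dies away or triggers the same full-sized response:

    import numpy as np

    def peak_response(stimulus, T=100.0, dt=0.01, eps=0.08, a=0.7, b=0.8):
        """Integrate the FitzHugh-Nagumo equations and return the peak of
        the activity variable u following a brief stimulus pulse."""
        u, v = -1.2, -0.625                             # resting state
        peak = u
        for i in range(int(T / dt)):
            t = i * dt
            I = stimulus if 10.0 <= t < 12.0 else 0.0   # short input pulse
            du = u - u**3 / 3.0 - v + I
            dv = eps * (u + a - b * v)
            u, v = u + dt * du, v + dt * dv
            peak = max(peak, u)
        return peak

    # A weak stimulus decays quietly; anything past threshold fires the
    # same large excursion regardless of strength -- "all or nothing."
    for s in (0.1, 0.4, 0.8):
        print(f"stimulus {s:.1f} -> peak activity {peak_response(s):+.2f}")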

The researchers compiled their observations and derived a theoretical framework to predict how a cell’s shape will change, given how it is stimulated with light. The framework, Fakhri says, opens a window into “the ‘excitability’ at the heart of cellular remodeling, which is a fundamental process in embryo development and wound healing.”

She adds: “This work provides a blueprint for designing ‘programmable’ synthetic cells, letting researchers orchestrate shape changes at will for future biomedical applications.”

This work was supported, in part, by the Sloan Foundation and the National Science Foundation.

Engineers develop a better way to deliver long-lasting drugs

Mon, 03/24/2025 - 6:00am

MIT engineers have devised a new way to deliver certain drugs in higher doses with less pain, by injecting them as a suspension of tiny crystals. Once under the skin, the crystals assemble into a drug “depot” that could last for months or years, eliminating the need for frequent drug injections.

This approach could prove useful for delivering long-lasting contraceptives or other drugs that need to be given for extended periods of time. Because the drugs are dispersed in a suspension before injection, they can be administered through a narrow needle that is easier for patients to tolerate.

“We showed that we can have very controlled, sustained delivery, likely for multiple months and even years through a small needle,” says Giovanni Traverso, an associate professor of mechanical engineering at MIT, a gastroenterologist at Brigham and Women’s Hospital (BWH), an associate member of the Broad Institute, and the senior author of the study.

The lead authors of the paper, which appears today in Nature Chemical Engineering, are former MIT and BWH postdoc Vivian Feig, who is now an assistant professor of mechanical engineering at Stanford University; MIT graduate student Sanghyun Park; and Pier Rivano, a former visiting research scholar in Traverso’s lab.

Easier injections

This project began as part of an effort funded by the Gates Foundation to expand contraceptive options, particularly in developing nations.

“The overarching goal is to give women access to a lot of different formats for contraception that are easy to administer, compatible with being used in the developing world, and have a range of different timeframes of durations of action,” Feig says. “In our particular project, we were interested in trying to combine the benefits of long-acting implants with the ease of self-administrable injectables.”

Injectable suspensions are already marketed in the United States and other countries, but these drugs disperse throughout the tissue after injection, so they work for only about three months. Other injectable products have been developed that can form longer-lasting depots under the skin, but these typically require the addition of precipitating polymers that can make up 23 to 98 percent of the solution by weight, which can make the drug more difficult to inject.

The MIT and BWH team wanted to create a formulation that could be injected through a small-gauge needle and last for at least six months and up to two years. They began working with a contraceptive drug called levonorgestrel, a hydrophobic molecule that can form crystals. The team discovered that suspending these crystals in a particular organic solvent caused the crystals to assemble into a highly compact implant after injection. Because this depot could form without needing large amounts of polymer, the drug formulation could still be easily injected through a narrow-gauge needle.

The solvent, benzyl benzoate, is biocompatible and has been previously used as an additive to injectable drugs. The team found that the solvent’s poor ability to mix with biological fluids is what allows the solid drug crystals to self-assemble into a depot under the skin after injection.

“The solvent is critical because it allows you to inject the fluid through a small needle, but once in place, the crystals self-assemble into a drug depot,” Traverso says.

By altering the density of the depot, the researchers can tune the rate at which the drug molecules are released into the body. In this study, the researchers showed they could change the density by adding small amounts of a polymer such as polycaprolactone, a biodegradable polyester.

“By incorporating a very small amount of polymers — less than 1.6 percent by weight — we can modulate the drug release rate, extending its duration while maintaining injectability. This demonstrates the tunability of our system, which can be engineered to accommodate a broader range of contraceptive needs as well as tailored dosing regimens for other therapeutic applications,” Park says.
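
As a toy illustration of that tunability, the sketch below assumes diffusion-limited, Higuchi-type square-root-of-time release and a made-up mapping from polymer content to release rate; none of these numbers come from the study, and they are meant only to make the idea concrete:

    import numpy as np

    def fraction_released(t_days, polymer_wt_frac):
        """Toy Higuchi-type release profile: f(t) = k * sqrt(t).
        The rate constant k and its dependence on polymer content
        are hypothetical, not fitted to the study's data."""
        k0 = 0.016                                # 1/sqrt(day), made up
        k = k0 / (1.0 + 20.0 * polymer_wt_frac)   # denser depot, slower release
        return np.minimum(1.0, k * np.sqrt(t_days))

    for frac in (0.0, 0.008, 0.016):              # 0 to 1.6 percent by weight
        released = fraction_released(90, frac) * 100
        print(f"{frac * 100:.1f}% polymer: {released:.0f}% released at 90 days")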

Stable drug depots

The researchers tested their approach by injecting the drug solution subcutaneously in rats and showed that the drug depots could remain stable and release drug gradually for three months. After the three-month study ended, about 85 percent of the drug remained in the depots, suggesting that they could continue releasing the drugs for a much longer period of time.
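
A rough back-of-the-envelope extrapolation supports that reading: if about 15 percent of the drug was released over the first three months, then at a roughly constant (zero-order) rate, an idealization, since release may slow as the depot shrinks, the depot would take on the order of

$$ t_{\text{depot}} \approx \frac{3\ \text{months}}{0.15} \approx 20\ \text{months} $$

to empty, comfortably beyond a year.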

“We anticipate that the depots could last for more than a year, based on our post-analysis of preclinical data. Follow-up studies are underway to further validate their efficacy beyond this initial proof-of-concept,” Park says.

Once the drug depots form, they are compact enough to be retrievable, allowing for surgical removal if treatment needs to be halted before the drug is fully released.

This approach could also lend itself to delivering drugs to treat neuropsychiatric conditions as well as HIV and tuberculosis, the researchers say. They are now moving toward assessing its translation to humans by conducting advanced preclinical studies to evaluate self-assembly in a more clinically relevant skin environment. “This is a very simple system in that it’s basically a solvent, the drug, and then you can add a little bit of bioresorbable polymer. Now we’re considering which indications do we go after: Is it contraception? Is it others? These are some of the things that we’re starting to look into as part of the next steps toward translation to humans,” Traverso says.

The research was funded, in part, by the Gates Foundation, the Karl van Tassel Career Development Professorship, the MIT Department of Mechanical Engineering, a Schmidt Science Fellows postdoctoral fellowship, the Rhodes Trust, a Takeda Fellowship, a Warren M. Rohsenow Fellowship, and a Kwangjeong Educational Foundation Fellowship.

Device enables direct communication among multiple quantum processors

Fri, 03/21/2025 - 6:00am

Quantum computers have the potential to solve complex problems that would be impossible for the most powerful classical supercomputer to crack.

Just like a classical computer has separate, yet interconnected, components that must work together, such as a memory chip and a CPU on a motherboard, a quantum computer will need to communicate quantum information between multiple processors.

Current architectures used to interconnect superconducting quantum processors are “point-to-point” in connectivity, meaning they require a series of transfers between network nodes, with compounding error rates.

To overcome this challenge, MIT researchers developed a new interconnect device that can support scalable, “all-to-all” communication, such that all superconducting quantum processors in a network can communicate directly with each other.

They created a network of two quantum processors and used their interconnect to send microwave photons back and forth on demand in a user-defined direction. Photons are particles of light that can carry quantum information.

The device includes a superconducting wire, or waveguide, that shuttles photons between processors and can be routed as far as needed. The researchers can couple any number of modules to it, efficiently transmitting information between a scalable network of processors.

They used this interconnect to demonstrate remote entanglement, a type of correlation between quantum processors that are not physically connected. Remote entanglement is a key step toward developing a powerful, distributed network of many quantum processors.

“In the future, a quantum computer will probably need both local and nonlocal interconnects. Local interconnects are natural in arrays of superconducting qubits. Ours allows for more nonlocal connections. We can send photons at different frequencies, times, and in two propagation directions, which gives our network more flexibility and throughput,” says Aziza Almanakly, an electrical engineering and computer science graduate student in the Engineering Quantum Systems group of the Research Laboratory of Electronics (RLE) and lead author of a paper on the interconnect.

Her co-authors include Beatriz Yankelevich, a graduate student in the EQuS Group; senior author William D. Oliver, the Henry Ellis Warren (1894) Professor of Electrical Engineering and Computer Science (EECS) and professor of physics, director of the Center for Quantum Engineering, and associate director of RLE; and others at MIT and Lincoln Laboratory. The research appears today in Nature Physics.

A scalable architecture

The researchers previously developed a quantum computing module, which enabled them to send information-carrying microwave photons in either direction along a waveguide.

In the new work, they took that architecture a step further by connecting two modules to a waveguide in order to emit photons in a desired direction and then absorb them at the other end.

Each module is composed of four qubits, which serve as an interface between the waveguide carrying the photons and the larger quantum processors.

The qubits coupled to the waveguide emit and absorb photons, which are then transferred to nearby data qubits.

The researchers use a series of microwave pulses to add energy to a qubit, which then emits a photon. Carefully controlling the phase of those pulses enables a quantum interference effect that allows them to emit the photon in either direction along the waveguide. Reversing the pulses in time enables a qubit in another module any arbitrary distance away to absorb the photon.

“Pitching and catching photons enables us to create a ‘quantum interconnect’ between nonlocal quantum processors, and with quantum interconnects comes remote entanglement,” explains Oliver.

“Generating remote entanglement is a crucial step toward building a large-scale quantum processor from smaller-scale modules. Even after that photon is gone, we have a correlation between two distant, or ‘nonlocal,’ qubits. Remote entanglement allows us to take advantage of these correlations and perform parallel operations between two qubits, even though they are no longer connected and may be far apart,” Yankelevich explains.

However, transferring a photon between two modules is not enough to generate remote entanglement. The researchers need to prepare the qubits and the photon so the modules “share” the photon at the end of the protocol.

Generating entanglement

The team did this by halting the photon emission pulses halfway through their duration. In quantum mechanical terms, the photon is both retained and emitted. Classically, one can think that half-a-photon is retained and half is emitted.

Once the receiver module absorbs that “half-photon,” the two modules become entangled.
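
Schematically, and idealizing perfect emission and absorption, halting the pulse halfway leaves sender qubit A and the photon mode in an equal superposition, and absorption by module B then swaps the photonic part onto qubit B:

$$ \tfrac{1}{\sqrt{2}}\big(|e\rangle_A |0\rangle_{\mathrm{ph}} + |g\rangle_A |1\rangle_{\mathrm{ph}}\big) \;\xrightarrow{\text{absorption by }B}\; \tfrac{1}{\sqrt{2}}\big(|e\rangle_A |g\rangle_B + |g\rangle_A |e\rangle_B\big), $$

which is, up to phases, a maximally entangled Bell state shared between the two modules.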

But as the photon travels, joints, wire bonds, and connections in the waveguide distort the photon and limit the absorption efficiency of the receiving module.

To generate remote entanglement with high enough fidelity, or accuracy, the researchers needed to maximize how often the photon is absorbed at the other end.

“The challenge in this work was shaping the photon appropriately so we could maximize the absorption efficiency,” Almanakly says.

They used a reinforcement learning algorithm to “predistort” the photon. The algorithm optimized the protocol pulses in order to shape the photon for maximal absorption efficiency.
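
The paper’s exact algorithm and experimental interface are not described here, but the idea can be sketched as a closed-loop search: propose pulse shapes, measure the resulting absorption efficiency, and shift the proposal distribution toward the best performers. In the sketch below everything is hypothetical, including the pulse parameterization and the synthetic stand-in for the measured efficiency:

    import numpy as np

    rng = np.random.default_rng(0)

    def absorption_efficiency(pulse):
        """Stand-in for the real measurement: returns an efficiency in
        [0, 1] that peaks when the pulse matches a hidden ideal shape."""
        target = np.hanning(len(pulse))
        return float(np.exp(-10.0 * np.mean((pulse - target) ** 2)))

    # Cross-entropy-style policy search over a 32-point pulse envelope.
    n_points, population, n_elite = 32, 64, 8
    mean = np.full(n_points, 0.5)          # initial guess: flat envelope
    std = np.full(n_points, 0.3)

    for generation in range(60):
        candidates = rng.normal(mean, std, (population, n_points)).clip(0, 1)
        scores = [absorption_efficiency(c) for c in candidates]
        elite = candidates[np.argsort(scores)[-n_elite:]]
        mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-3

    print(f"predistorted-pulse efficiency: {absorption_efficiency(mean):.3f}")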

When they implemented this optimized absorption protocol, they were able to show photon absorption efficiency greater than 60 percent.

This absorption efficiency is high enough to prove that the resulting state at the end of the protocol is entangled, a major milestone in this demonstration.

“We can use this architecture to create a network with all-to-all connectivity. This means we can have multiple modules, all along the same bus, and we can create remote entanglement among any pair of our choosing,” Yankelevich says.

In the future, they could improve the absorption efficiency by optimizing the path over which the photons propagate, perhaps by integrating modules in 3D instead of having a superconducting wire connecting separate microwave packages. They could also make the protocol faster so there are fewer chances for errors to accumulate.

“In principle, our remote entanglement generation protocol can also be expanded to other kinds of quantum computers and bigger quantum internet systems,” Almanakly says.

This work was funded, in part, by the U.S. Army Research Office, the AWS Center for Quantum Computing, and the U.S. Air Force Office of Scientific Research. 

AI tool generates high-quality images faster than state-of-the-art approaches

Fri, 03/21/2025 - 12:00am

The ability to generate high-quality images quickly is crucial for producing realistic simulated environments that can be used to train self-driving cars to avoid unpredictable hazards, making them safer on real streets.

But the generative artificial intelligence techniques increasingly being used to produce such images have drawbacks. One popular type of model, called a diffusion model, can create stunningly realistic images but is too slow and computationally intensive for many applications. On the other hand, the autoregressive models that power LLMs like ChatGPT are much faster, but they produce poorer-quality images that are often riddled with errors.

Researchers from MIT and NVIDIA developed a new approach that brings together the best of both methods. Their hybrid image-generation tool uses an autoregressive model to quickly capture the big picture and then a small diffusion model to refine the details of the image.

Their tool, known as HART (short for hybrid autoregressive transformer), can generate images that match or exceed the quality of state-of-the-art diffusion models, and it can do so about nine times faster.

The generation process consumes fewer computational resources than typical diffusion models, enabling HART to run locally on a commercial laptop or smartphone. A user only needs to enter one natural language prompt into the HART interface to generate an image.

HART could have a wide range of applications, such as helping researchers train robots to complete complex real-world tasks and aiding designers in producing striking scenes for video games.

“If you are painting a landscape, and you just paint the entire canvas once, it might not look very good. But if you paint the big picture and then refine the image with smaller brush strokes, your painting could look a lot better. That is the basic idea with HART,” says Haotian Tang SM ’22, PhD ’25, co-lead author of a new paper on HART.

He is joined by co-lead author Yecheng Wu, an undergraduate student at Tsinghua University; senior author Song Han, an associate professor in the MIT Department of Electrical Engineering and Computer Science (EECS), a member of the MIT-IBM Watson AI Lab, and a distinguished scientist of NVIDIA; as well as others at MIT, Tsinghua University, and NVIDIA. The research will be presented at the International Conference on Learning Representations.

The best of both worlds

Popular diffusion models, such as Stable Diffusion and DALL-E, are known to produce highly detailed images. These models generate images through an iterative process where they predict some amount of random noise on each pixel, subtract the noise, then repeat the process of predicting and “de-noising” multiple times until they generate a new image that is completely free of noise.

Because the diffusion model de-noises all pixels in an image at each step, and there may be 30 or more steps, the process is slow and computationally expensive. But because the model has multiple chances to correct details it got wrong, the images are high-quality.
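
In rough pseudocode form, with a trivial stand-in for the trained noise-prediction network, the sampling loop looks like this; the slowness comes from touching every pixel on every one of the 30 or more steps:

    import numpy as np

    rng = np.random.default_rng(0)

    def predict_noise(image, step):
        """Stand-in for a trained noise-prediction network."""
        return 0.1 * image + 0.01 * rng.standard_normal(image.shape)

    image = rng.standard_normal((64, 64, 3))   # start from pure noise
    for step in reversed(range(30)):           # 30+ steps is typical
        noise = predict_noise(image, step)     # predict noise at every pixel
        image = image - noise                  # subtract it, then repeat
    print("generated image tensor:", image.shape)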

Autoregressive models, commonly used for predicting text, can generate images by predicting patches of an image sequentially, a few pixels at a time. They can’t go back and correct their mistakes, but the sequential prediction process is much faster than diffusion.

These models use representations known as tokens to make predictions. An autoregressive model uses an autoencoder to compress raw image pixels into discrete tokens, and to reconstruct the image from predicted tokens. While this boosts the model’s speed, the information loss that occurs during compression causes errors when the model generates a new image.

With HART, the researchers developed a hybrid approach that uses an autoregressive model to predict compressed, discrete image tokens, then a small diffusion model to predict residual tokens. Residual tokens compensate for the model’s information loss by capturing details left out by discrete tokens.
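
The division of labor can be sketched as follows; every component below is a tiny, hypothetical stand-in, shown only to make the data flow concrete:

    import numpy as np

    rng = np.random.default_rng(0)

    def ar_sample(prompt, n_tokens=256, vocab=1024):
        """Autoregressive pass (stand-in for a transformer): 'predict'
        discrete tokens one at a time to capture the big picture."""
        return np.array([rng.integers(0, vocab) for _ in range(n_tokens)])

    def residual_diffusion(tokens, num_steps=8):
        """Small diffusion pass: refine continuous residual tokens in
        ~8 steps instead of the 30+ a full diffusion model would need."""
        residual = rng.standard_normal(tokens.shape)
        for _ in range(num_steps):
            residual -= 0.1 * residual          # stand-in de-noising update
        return residual

    tokens = ar_sample("a red barn at sunset").astype(float)
    latent = tokens + residual_diffusion(tokens)    # discrete + residual
    print("decoded-image latent:", latent.reshape(16, 16).shape)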

“We can achieve a huge boost in terms of reconstruction quality. Our residual tokens learn high-frequency details, like edges of an object, or a person’s hair, eyes, or mouth. These are places where discrete tokens can make mistakes,” says Tang.

Because the diffusion model only predicts the remaining details after the autoregressive model has done its job, it can accomplish the task in eight steps, instead of the 30 or more a standard diffusion model requires to generate an entire image. Because the additional diffusion model adds minimal overhead, HART retains the speed advantage of the autoregressive model while significantly enhancing its ability to generate intricate image details.

“The diffusion model has an easier job to do, which leads to more efficiency,” he adds.

Outperforming larger models

During the development of HART, the researchers encountered challenges in effectively integrating the diffusion model to enhance the autoregressive model. They found that incorporating the diffusion model in the early stages of the autoregressive process resulted in an accumulation of errors. Instead, their final design of applying the diffusion model to predict only residual tokens as the final step significantly improved generation quality.

Their method, which uses a combination of an autoregressive transformer model with 700 million parameters and a lightweight diffusion model with 37 million parameters, can generate images of the same quality as those created by a diffusion model with 2 billion parameters, but it does so about nine times faster. It uses about 31 percent less computation than state-of-the-art models.

Moreover, because HART uses an autoregressive model to do the bulk of the work — the same type of model that powers LLMs — it is more compatible with the new class of unified vision-language generative models. In the future, one could interact with a unified vision-language generative model, perhaps by asking it to show the intermediate steps required to assemble a piece of furniture.

“LLMs are a good interface for all sorts of models, like multimodal models and models that can reason. This is a way to push the intelligence to a new frontier. An efficient image-generation model would unlock a lot of possibilities,” he says.

In the future, the researchers want to go down this path and build vision-language models on top of the HART architecture. Since HART is scalable and generalizable to multiple modalities, they also want to apply it for video generation and audio prediction tasks.

This research was funded, in part, by the MIT-IBM Watson AI Lab, the MIT and Amazon Science Hub, the MIT AI Hardware Program, and the U.S. National Science Foundation. The GPU infrastructure for training this model was donated by NVIDIA. 

SeaPerch: A robot with a mission

Thu, 03/20/2025 - 3:40pm

The SeaPerch underwater robot is a popular educational tool for students in grades 5 to 12. Building and piloting SeaPerch, a remotely operated vehicle (ROV), involves a variety of hand fabrication processes, electronics techniques, and STEM concepts. Through the SeaPerch program, educators and students explore structures, electronics, and underwater dynamics.

“SeaPerch has had a tremendous impact on the fields of ocean science and engineering,” says Andrew Bennett ’85, PhD ’97, MIT SeaGrant education administrator and senior lecturer in the Department of Mechanical Engineering (MechE).

The original SeaPerch project was launched by MIT Sea Grant in 2003. In the decades that followed, it quickly spread across the country and overseas, creating a vibrant community of builders. Now under the leadership of RoboNation, SeaPerch continues to thrive with competitions around the world. These competitions introduce challenging real-world problems to foster creative solutions. Some recent topics have included deep sea mining and collecting data on hydrothermal vents.

SeaPerch II, which has been in development at MIT Sea Grant since 2021, builds on the original program by adding robotics and elements of marine and climate science. It remains a “do-it-yourself” maker project with objectives that are achievable by middle and high school students. Bennett says he hopes SeaPerch II will enable an even greater impact by providing an approachable path to learning more about sensors, robotics, climate science, and more.

“What I think is most valuable about it is that it uses hardware store components that need to be cut, waterproofed, connected, soldered, or somehow processed before becoming part of the robot or controller,” says Diane Brancazio ME ’90, K-12 maker team leader for the MIT Edgerton Center, who co-leads the MIT SeaPerch initiative with Bennett. “[It’s] kind of like making a cake from scratch, instead of from a mix — you see what goes into the final product and how it all comes together.”

SeaPerch II is a family of modules that allow students and educators to create educational adventures tailored to their particular wants or requirements. Offerings include a pressure and temperature sensing module that can be used on its own; an autonomy module that the students can use to construct a closed-loop automatic depth control system for their SeaPerch; and a lesson module for soft robotic “fingers” that can be configured into grippers, distance sensors, and bump sensors.
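
To give a flavor of what the autonomy module builds toward, here is a minimal closed-loop depth-hold sketch; the controller gains, toy dynamics, and numbers are illustrative, not taken from the SeaPerch II kit:

    def depth_hold(target_m=1.5, dt=0.05, steps=600):
        """Toy PID depth controller for an ROV: drive vertical thrust
        so the simulated depth settles at the target."""
        kp, ki, kd = 4.0, 0.8, 1.5             # hypothetical PID gains
        depth, velocity = 0.0, 0.0             # start at the surface
        integral, prev_error = 0.0, target_m
        for _ in range(steps):
            error = target_m - depth
            integral += error * dt
            derivative = (error - prev_error) / dt
            thrust = kp * error + ki * integral + kd * derivative
            thrust = max(-1.0, min(1.0, thrust))           # motor limits
            # Toy vertical dynamics: thrust versus drag and buoyancy.
            velocity += (2.0 * thrust - 0.8 * velocity - 0.1) * dt
            depth += velocity * dt
            prev_error = error
        return depth

    print(f"depth after 30 seconds: {depth_hold():.2f} m")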

The basic SeaPerch is a PVC pipe structure with three motors and a tether to a switch box. Through the building process, students learn about buoyancy, structural design, hand fabrication, and electric circuits. SeaPerch II leverages technologies that are more advanced, less expensive, and more accessible than they were when SeaPerch was first conceived. Bennett says SeaPerch II is meant to extend the original SeaPerch program without invalidating any of the existing system.

Teagan Sullivan, a third-year student in mechanical engineering, first became involved with the project in January 2023 through an Undergraduate Research Opportunities Program project with MIT Sea Grant. Initially, she continued development of the soft robotics portion of the project, before switching to a more general focus where she worked on frame design for SeaPerch II, making sure components could fit and that stability could be maintained. Later she helped run outreach programs, taking feedback from the students she worked with to help modify designs and make them “more robust and kid-friendly.”

“I have been able to see the impact of SeaPerch II on a small scale by working directly with students,” Sullivan says. “I have seen how it encourages creativity, and how it has taught kids that collaboration is the best road to success. SeaPerch II teaches the basics of electronics, coding, and manufacturing, but its best strength is the ability to challenge the way people think and encourage critical thinking.”

The team’s vision is to create opportunities for young people to engage in authentic science investigations and engineering challenges, developing a passion for engineering, science, and the aquatic environment. MIT Sea Grant is continuing to develop new SeaPerch II modules, including incorporating land-water communication, salinity and dissolved oxygen sensors, and fluorometers.

Sullivan says she hopes the program will reach more students and inspire them to take an interest in engineering while teaching the skills they need to be the next generation of problem-solvers. Brancazio says she hopes this project inspires and prepares young people to work on climate change issues.

“Robots are supposed to help people do things they couldn’t otherwise do,” Brancazio says. “SeaPerch is a robot with a mission.”

Professor Emeritus Lee Grodzins, pioneer in nuclear physics, dies at 98

Thu, 03/20/2025 - 3:00pm

Nuclear physicist and MIT Professor Emeritus Lee Grodzins died on March 6 at his home in the Maplewood Senior Living Community at Weston, Massachusetts. He was 98.   

Grodzins was a pioneer in nuclear physics research. He was perhaps best known for the highly influential experiment determining the helicity of the neutrino, which led to a key understanding of what's known as the weak interaction. He was also the founder of Niton Corp. and the nonprofit Cornerstones of Science, and was a co-founder of the Union of Concerned Scientists.

He retired in 1999 after serving as an MIT physics faculty member for 40 years. As a member of the Laboratory for Nuclear Science (LNS), he initiated the relativistic heavy-ion physics program. He published over 170 scientific papers and held 64 U.S. patents.

“Lee was a very good experimental physicist, especially with his hands making gadgets,” says Heavy Ion Group and Francis L. Friedman Professor Emeritus Wit Busza PhD ’64. “His enthusiasm for physics spilled into his enthusiasm for how physics was taught in our department.”

Industrious son of immigrants

Grodzins was born July 10, 1926, in Lowell, Massachusetts, the middle child of Eastern European Jewish immigrants David and Taube Grodzins. He grew up in Manchester, New Hampshire. His two sisters were Ethel Grodzins Romm, journalist, author, and businesswoman who later ran his company, Niton Corp.; and Anne Lipow, who became a librarian and library science expert.

His father, who ran a gas station and a used-tire business, died when Lee was 15. To help support his family, Lee sold newspapers, a business he grew into the second-largest newspaper distributor in Manchester.

At 17, Grodzins attended the University of New Hampshire, graduating in less than three years with a degree in mechanical engineering.  However, he decided to be a physicist after disagreeing with a textbook that used the word “never.”

“I was pretty good in math and was undecided about my future,” Grodzins said in a 1958 New York Daily News article. “It wasn’t until my senior year that I unexpectedly realized I wanted to be a physicist. I was reading a physics text one day when suddenly this sentence hit me: ‘We will never be able to see the atom.’ I said to myself that that was as stupid a statement as I’d ever read. What did he mean ‘never!’ I got so annoyed that I started devouring other writers to see what they had to say and all at once I found myself in the midst of modern physics.”

He wrote his senior thesis on “Atomic Theory.”

After graduating in 1946, he approached potential employers by saying, “I have a degree in mechanical engineering, but I don’t want to be one. I’d like to be a physicist, and I’ll take anything in that line at whatever you will pay me.”

He accepted an offer from General Electric’s Research Laboratory in Schenectady, New York, where he worked in fundamental nuclear research building cosmic ray detectors, while also pursuing his master’s degree at Union College. “I had a ball,” he recalled. “I stayed in the lab 12 hours a day. They had to kick me out at night.”

Brookhaven

After earning his PhD from Purdue University in 1954, he spent a year as a lecturer there, before becoming a researcher at Brookhaven National Laboratory (BNL) with Maurice Goldhaber’s nuclear physics group, probing the properties of the nuclei of atoms.

In 1957, working with Goldhaber and Andy Sunyar, he used a simple table-top experiment to measure the helicity of the neutrino. Helicity characterizes the alignment of a particle’s intrinsic spin vector with that particle’s direction of motion.

The research provided striking evidence that the principle of conservation of parity — which had been accepted for 30 years as a basic law of nature until it was disproven the year before, a discovery that led to the 1957 Nobel Prize in Physics — does not apply to the behavior of some subatomic particles.

The experiment took about 10 days to complete, followed by a month of checks and rechecks. They submitted a letter on “Helicity of Neutrinos” to Physical Review on Dec. 11, 1957, and a week later, Goldhaber told a Stanford University audience that the neutrino is left-handed, suggesting that the weak interaction was probably a single, universal force. This work proved crucial to our understanding of the weak interaction, the force that governs nuclear beta decay.

“It was a real upheaval in our understanding of physics,” says Grodzins’ longtime colleague Stephen Steadman. The breakthrough was commemorated in 2008, with a conference at BNL on “Neutrino Helicity at 50.” 

Steadman also recalls Grodzins’ story about one night at Brookhaven, when he was working on an experiment that involved a radioactive source inside a chamber. Lee noticed that a vacuum pump wasn’t working, so he tinkered with it a while before heading home. Later that night, he got a call from the lab. “They said, ‘Don't go anywhere!’” recalls Steadman. It turned out the radioactive source in the lab had exploded, and the pump had filled the lab with radiation. “They were actually able to trace his radioactive footprints from the lab to his home,” says Steadman. “He kind of shrugged it off.”

The MIT years       

Grodzins joined the faculty of MIT in 1959, where he taught physics for four decades. He inherited Robley Evans’ Radiation Laboratory, which used radioactive sources to study properties of nuclei, and led the Relativistic Heavy Ion Group, which was affiliated with the LNS.

In 1972, he launched a program at BNL using the then-new Tandem Van de Graaff accelerator to study interactions of heavy ions with nuclei. “As the BNL tandem was getting commissioned, we started a program, together with Doug Cline at the University of Rochester, to investigate Coulomb-nuclear interference,” says Steadman, a senior research scientist at LNS. “The experimental results were decisive but somewhat controversial at the time. We clearly detected the interference effect.” The experimental work was published in Physical Review Letters.

Grodzins’ team looked for super-heavy elements using the Lawrence Berkeley National Laboratory SuperHILAC, investigated heavy-ion fission and other heavy-ion reactions, and explored heavy-ion transfer reactions. The latter research showed in precise detail the underlying statistical behavior of the transfer of nucleons between the heavy-ion projectile and target, using surprisal analysis, a theoretical statistical model developed by Rafi Levine and his graduate student. Steadman recalls, “These results were both outstanding in their precision and initially controversial in interpretation.”

In 1985, he carried out the first computed axial tomography experiment using synchrotron radiation, and in 1987, his group was involved in the first run of Experiment 802, a collaborative experiment with about 50 scientists from around the world that studied relativistic heavy-ion collisions at Brookhaven. The MIT responsibility was to build the drift chambers and design the bending magnet for the experiment.

“He made significant contributions to the initial design and construction phases, where his broad expertise and knowledge of small area companies with unique capabilities was invaluable,” says George Stephans, physics senior lecturer and senior research scientist at MIT.

Professor emeritus of physics Rainer Weiss ’55, PhD ’62 recalls working on a Mossbauer experiment to establish if photons changed frequency as they traveled through bright regions. “It was an idea held by some to explain the ‘apparent’ red shift with distance in our universe,” says Weiss. “We became great friends in the process, and of course, amateur cosmologists.”

“Lee was great for developing good ideas,” Steadman says. “He would get started on one idea, but then get distracted with another great idea. So, it was essential that the team would carry these experiments to their conclusion: they would get the papers published.”

MIT mentor

Before retiring in 1999, Lee supervised 21 doctoral dissertations and was an early proponent of women graduate students in physics. He also oversaw the undergraduate thesis of Sidney Altman, who decades later won the Nobel Prize in Chemistry. For many years, he helped teach the Junior Lab required of all undergraduate physics majors. He got his favorite student evaluation, however, for a different course, billed as offering a “superficial overview” of nuclear physics. The comment read: “This physics course was not superficial enough for me.”

“He really liked to work with students,” says Steadman. “They could always go into his office anytime. He was a very supportive mentor.”

“He was a wonderful mentor, avuncular and supportive of all of us,” agrees Karl van Bibber ’72, PhD ’76, now at the University of California at Berkeley. He recalls handing his first paper to Grodzins for comments. “I was sitting at my desk expecting a pat on the head. Quite to the contrary, he scowled, threw the manuscript on my desk and scolded, ‘Don't even pick up a pencil again until you've read a Hemingway novel!’ … The next version of the paper had an average sentence length of about six words; we submitted it, and it was immediately accepted by Physical Review Letters.”

Van Bibber has since taught the “Grodzins Method” in his graduate seminars on professional orientation for scientists and engineers, including passing around a few anthologies of Hemingway short stories. “I gave a copy of one of the dog-eared anthologies to Lee at his 90th birthday lecture, which elicited tears of laughter.”

Early in George Stephans’ MIT career as a research scientist, he worked with Grodzins’ newly formed Relativistic Heavy Ion Group. “Despite his wide range of interests, he paid close attention to what was going on and was always very supportive of us, especially the students. He was a very encouraging and helpful mentor to me, as well as being always pleasant and engaging to work with. He actively pushed to get me promoted to principal research scientist relatively early, in recognition of my contributions.”

“He always seemed to know a lot about everything, but never acted condescending,” says Stephans. “He seemed happiest when he was deeply engaged digging into the nitty-gritty details of whatever unique and unusual work one of these companies was doing for us.”

Al Lazzarini ’74, PhD ’78 recalls Grodzins’ investigations using proton-induced X-ray emission (PIXE) as a sensitive tool to measure trace elemental amounts. “Lee was a superb physicist,” says Lazzarini. “He gave an enthralling seminar on an investigation he had carried out on a lock of Napoleon’s hair, looking for evidence of arsenic poisoning.”

Robert Ledoux ’78, PhD ’81, a former professor of physics at MIT who is now program director of the U.S. Advanced Research Projects Agency with the Department of Energy, worked with Grodzins as both a student and colleague. “He was a ‘nuclear physicist’s physicist’ — a superb experimentalist who truly loved building and performing experiments in many areas of nuclear physics. His passion for discovery was matched only by his generosity in sharing knowledge.”

The research funding crisis starting in 1969 led Grodzins to become concerned that his graduate students would not find careers in the field. He helped form the Economic Concerns Committee of the American Physical Society, for which he produced a major report on the “Manpower Crisis in Physics” (1971), and presented his results before the American Association for the Advancement of Science, and at the Karlsruhe National Lab in Germany.   

Grodzins played a significant role in bringing the first Chinese graduate students to MIT in the 1970s and 1980s.

One of the students he welcomed was Huan Huang PhD ’90. “I am forever grateful to him for changing my trajectory,” says Huang, now at the University of California at Los Angeles. “His unwavering support and ‘go do it’ attitude inspired us to explore physics at the beginning of a new research field of high energy heavy ion collisions in the 1980s. I have been trying to be a ‘nice professor’ like Lee all my academic career.”

Even after he left MIT, Grodzins remained available for his former students. “Many tell me how much my lifestyle has influenced them, which is gratifying,” Huang says. “They’ve been a central part of my life. My biography would be grossly incomplete without them.”

Niton Corp. and post-MIT work

Grodzins liked what he called “tabletop experiments,” like the one used in his 1957 neutrino experiment, which involved a few people building a device that could fit on a tabletop. “He didn’t enjoy working in large collaborations, which nuclear physics embraced,” says Steadman. “I think that’s why he ultimately left MIT.”

In the 1980s, he launched what amounted to a new career in detection technology. In 1987, after developing a scanning proton-induced X-ray microspectrometer for measuring elemental concentrations in air, he founded the Niton Corp., which developed, manufactured, and marketed test kits and instruments to measure radon gas in buildings, detect lead-based paint, and perform other nondestructive testing. (“Niton” is an obsolete term for radon.)

“At the time, there was a big scare about radon in New England, and he thought he could develop a radon detector that was inexpensive and easy to use,” says Steadman. “His radon detector became a big business.”

He later developed devices to detect explosives, drugs, and other contraband in luggage and cargo containers. Handheld devices used X-ray fluorescence to determine the composition of metal alloys and to detect other materials. The handheld XL Spectrum Analyzer could detect lead both at the surface of painted walls and buried beneath layers of paint, helping to protect children living in older homes. Three Niton X-ray fluorescence analyzers earned R&D 100 awards.

“Lee was very technically gifted,” says Steadman.

In 1999, Grodzins retired from MIT and devoted his energies to industry, including directing the R&D group at Niton.

His sister Ethel Grodzins Romm was the president and CEO of Niton, followed by his son Hal. Many of Niton’s employees were MIT graduates. In 2005, he and his family sold Niton to Thermo Fisher Scientific, where Lee remained as a principal scientist until 2010.

In the 1990s, he was vice president of American Science and Engineering, and between the ages of 70 and 90, he was awarded three patents a year. 

“Curiosity and creativity don’t stop after a certain age,” Grodzins said to UNH Today. “You decide you know certain things, and you don’t want to change that thinking. But thinking outside the box really means thinking outside your box.”

“I miss his enthusiasm,” says Steadman. “I saw him about a couple of years ago and he was still on the move, always ready to launch a new effort, and he was always trying to pull you into those efforts.”

A better world

In the 1950s, Grodzins and other Brookhaven scientists joined the American delegation at the Second United Nations International Conference on the Peaceful Uses of Atomic Energy in Geneva.

Early on, he joined several Manhattan Project alums at MIT in their concern about the consequences of nuclear bombs. In 1969, during the Vietnam War era, Grodzins co-founded the Union of Concerned Scientists, which calls for scientific research to be directed away from military technologies and toward solving pressing environmental and social problems. He served as its chair in 1970 and 1972. He also chaired committees for the American Physical Society and the National Research Council.

As vice president for advanced products at American Science and Engineering, which made homeland security equipment, he became a consultant on airport security, especially following the 9/11 attacks. As an expert witness, he testified at the celebrated trial to determine whether Pan Am was negligent for the bombing of Flight 103 over Lockerbie, Scotland, and he took part in a weapons inspection trip on the Black Sea. He also was frequently called as an expert witness on patent cases.

In 1999, Grodzins founded the nonprofit Cornerstones of Science, a public library initiative to improve public engagement with science. Based originally at the Curtis Memorial Library in Brunswick, Maine, Cornerstones now partners with libraries in Maine, Arizona, Texas, Massachusetts, North Carolina, and California. Among its initiatives is one that has helped supply telescopes to libraries and astronomy clubs around the country.

“He had a strong sense of wanting to do good for mankind,” says Steadman.

Awards

Grodzins authored more than 170 technical papers and held more than 60 U.S. patents. His numerous accolades included being named a Guggenheim Fellow in 1964 and 1971, and a senior von Humboldt fellow in 1980. He was a fellow of the American Physical Society and the American Academy of Arts and Sciences, and received an honorary doctor of science degree from Purdue University in 1998.

In 2021, the Denver X-ray Conference gave Grodzins the Birks Award in X-ray Fluorescence Spectrometry, for having introduced “a handheld XRF unit which expanded analysis to in-field applications such as environmental studies, archeological exploration, mining, and more.”

Personal life

One evening in 1955, shortly after starting his work at Brookhaven, Grodzins decided to take a walk and explore the BNL campus. He found just one building that had lights on and was open, so he went in. Inside, a group was rehearsing a play. He was immediately smitten with one of the actors, Lulu Anderson, a young biologist. “I joined the acting company, and a year-and-a-half later, Lulu and I were married,” Grodzins had recalled. They were happily married for 62 years, until Lulu’s death in 2019.

They raised two sons, Dean, now of Cambridge, Massachusetts, and Hal Grodzins, who lives in Maitland, Florida. Lee and Lulu owned a succession of beloved huskies, most of them named after physicists.

After living in Arlington, Massachusetts, the Grodzins family moved to Lexington, Massachusetts, in 1972 and bought a second home a few years later in Brunswick, Maine. Starting around 1990, Lee and Lulu spent every weekend, year-round, in Brunswick. In both places, they were avid supporters of their local libraries, museums, theaters, symphonies, botanical gardens, public radio, and TV stations.

Grodzins took his family along to conferences, fellowships, and other invitations. They all lived in Denmark for two sabbaticals, in 1964-65 and 1971-72, while Lee worked at the Niels Bohr Institute. They also traveled together to China for a month in 1975, and for two months in 1980. As part of the latter trip, they were among the first American visitors to Tibet since the 1940s. Lee and Lulu also traveled the world, from Antarctica to the Galapagos Islands to Greece.

His homes had basement workshops well-stocked with tools. His sons enjoyed a playroom he built for them in their Arlington home. He also once constructed his own high-fidelity record player, patched his old Volvo with fiberglass, changed his own oil, and put on the winter tires and chains himself. He was an early adopter of the home computer.

“His work in science and technology was part of a general love of gadgets and of fixing and making things,” his son, Dean, wrote in a Facebook post.

Lee is survived by Dean, his wife, Nora Nykiel Grodzins, and their daughter, Lily; and by Hal and his wife Cathy Salmons. 

A remembrance and celebration for Lee Grodzins is planned for this summer. Donations in his name may be made to Cornerstones of Science.

Drawing inspiration from ancient chemical reactions

Thu, 03/20/2025 - 12:00am

To help find solutions to the planet’s climate crisis, MIT Associate Professor Daniel Suess is looking to Earth’s ancient past.

Early in the evolution of life, cells gained the ability to perform reactions such as transferring electrons from one atom to another. These reactions, which help cells to build carbon-containing or nitrogen-containing compounds, rely on specialized enzymes with clusters of metal atoms.

By learning more about how those enzymes work, Suess hopes to eventually devise new ways to perform fundamental chemical reactions that could help capture carbon from the atmosphere or enable the development of alternative fuels.

“We have to find some way of rewiring society so that we are not just relying on vast reserves of reduced carbon, fossil fuels, and burning them using oxygen,” he says. “What we’re doing is we’re looking backward, up to a billion years before oxygen and photosynthesis came along, to see if we can identify the chemical principles that underlie processes that aren’t reliant on burning carbon.”

His work could also shed light on other important cellular reactions such as the conversion of nitrogen gas to ammonia, which is also the key step in the production of synthetic fertilizer.

Exploring chemistry

Suess, who grew up in Spokane, Washington, became interested in math at a young age, but ended up majoring in chemistry and English at Williams College, which he chose based on its appealing selection of courses.

“I was interested in schools that were more focused on the liberal arts model, Williams being one of those. And I just thought they had the right combination of really interesting courses and freedom to take classes that you wanted,” he says. “I went in not expecting to major in chemistry, but then I really enjoyed my chemistry classes and chemistry teachers.”

In his classes, he explored all aspects of chemistry and found them all appealing.

“I liked organic chemistry, because there’s an emphasis on making things. And I liked physical chemistry because there was an attempt to have at least a semiquantitative way of understanding the world. Physical chemistry describes some of the most important developments in science in the 20th century, including quantum mechanics and its application to atoms and molecules,” he says.

After college, Suess came to MIT for graduate school and began working with chemistry professor Jonas Peters, who had recently arrived from Caltech. A couple of years later, Peters ended up moving back to Caltech, and Suess followed, continuing his PhD thesis research on new ways to synthesize inorganic molecules.

His project focused on molecules that consist of a metal such as iron or cobalt bound to a nonmetallic group known as a ligand. Within these molecules, the metal atom typically pulls in electrons from the ligand. However, the molecules Suess worked on were designed so that the metal would give up its own electrons to the ligand. Such molecules can be used to speed up difficult reactions that require breaking very strong bonds, like the nitrogen-nitrogen triple bond in N2.

During a postdoc at the University of California at Davis, Suess switched gears and began working on biomolecules — specifically, metalloproteins. These are protein enzymes that have metals tucked into their active sites, where they help to catalyze reactions.

Suess studied how cells synthesize the metal-containing active sites in these proteins, focusing on an enzyme called iron-iron hydrogenase. This enzyme, found mainly in anaerobic bacteria, including some that live in the human digestive tract, catalyzes reactions involving the transfer of protons and electrons. Specifically, it can combine two protons and two electrons to make H2, or can perform the reverse reaction, breaking H2 into protons and electrons.
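
In reaction form, the interconversion that iron-iron hydrogenase catalyzes is simply:

$$ \mathrm{2\,H^{+} + 2\,e^{-} \;\rightleftharpoons\; H_{2}} $$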

“That enzyme is really important because a lot of cellular metabolic processes either generate excess electrons or require excess electrons. If you generate excess electrons, they have to go somewhere, and one solution is to put them on protons to make H2,” Suess says.

Global scale reactions

Since joining the MIT faculty in 2017, Suess has continued his investigations of metalloproteins and the reactions that they catalyze.

“We’re interested in global-scale chemical reactions, meaning they’re occurring on the microscopic scale but happening on a huge scale,” he says. “They impact the planet and have determined what the molecular composition of the biosphere is and what it’s going to be.”

Photosynthesis, which emerged around 2.4 billion years ago, has had the biggest impact on the atmosphere, filling it with oxygen, but Suess focuses on reactions that cells began using even earlier, when the atmosphere lacked oxygen and cell metabolism could not be driven by respiration.

Many of these ancient reactions, which are still used by cells today, involve a class of metalloproteins called iron-sulfur proteins. These enzymes, which are found in all kingdoms of life, are involved in catalyzing many of the most difficult reactions that occur in cells, such as forming carbon radicals and converting nitrogen to ammonia.

To study the metalloenzymes that catalyze these reactions, Suess’s lab takes two different approaches. In one, they create synthetic versions of the proteins that may contain fewer metal atoms, which allows for greater control over the composition and shape of the protein, making them easier to study.

In another approach, they use the natural version of the protein but substitute one of the metal atoms with an isotope that makes it easier to use spectroscopic techniques to analyze the protein’s structure.

“That allows us to study both the bonding in the resting state of an enzyme, as well as the bonding and structures of reaction intermediates that you can only characterize spectroscopically,” Suess says.

Understanding how enzymes perform these reactions could help researchers find new ways to remove carbon dioxide from the atmosphere by combining it with other molecules to create larger compounds. Finding alternative ways to convert nitrogen gas to ammonia could also have a big impact on greenhouse gas emissions, as the Haber-Bosch process now used to synthesize fertilizer requires huge amounts of energy.

“Our primary focus is on understanding the natural world, but I think that as we’re looking at different ways to wire biological catalysts to do efficient reactions that impact society, we need to know how that wiring works. And so that is what we’re trying to figure out,” he says.

At the core of problem-solving

Wed, 03/19/2025 - 4:40pm

As director of the MIT BioMicro Center (BMC), Stuart Levine ’97 wholeheartedly embraces the variety of challenges he tackles each day. One of over 50 core facilities providing shared resources across the Institute, the BMC supplies integrated high-throughput genomics, single-cell and spatial transcriptomic analysis, bioinformatics support, and data management to researchers across MIT.

“Every day is a different day,” Levine says. “There are always new problems, new challenges, and the technology is continuing to move at an incredible pace.” After more than 15 years in the role, Levine is grateful that the breadth of his work allows him to seek solutions for so many scientific problems.

By combining bioinformatics expertise with biotech relationships and a focus on maximizing the impact of the center’s work, Levine brings the broad range of skills required to match the diversity of questions asked by researchers in MIT’s Department of Biology.

Expansive expertise

Biology first appealed to Levine as an MIT undergraduate taking class 7.012 (Introduction to Biology), thanks to the charisma of instructors Professor Eric Lander and Amgen Professor Emerita Nancy Hopkins. After earning his PhD in biochemistry from Harvard University and Massachusetts General Hospital, Levine returned to MIT for postdoctoral work with Professor Richard Young, core member at the Whitehead Institute for Biomedical Research.

In the Young Lab, Levine found his calling as an informaticist and ultimately decided to stay at MIT. Here, his work has a wide-ranging impact: the BMC serves over 100 labs annually, including the Computer Science and Artificial Intelligence Laboratory and the departments of Brain and Cognitive Sciences; Earth, Atmospheric and Planetary Sciences; Chemical Engineering; Mechanical Engineering; and, of course, Biology.

“It’s a fun way to think about science,” Levine says, noting that he applies his knowledge and streamlines workflows across these many disciplines by “truly and deeply understanding the instrumentation complexities.”

This depth of understanding and experience allows Levine to lead what longtime colleague Professor Laurie Boyer describes as “a state-of-the-art core that has served so many faculty and provides key training opportunities for all.” He and his team work with cutting-edge, finely tuned scientific instruments that generate vast amounts of bioinformatics data, then use powerful computational tools to store, organize, and visualize the data collected, contributing to research on topics ranging from host-parasite interactions to proposed tools for NASA’s planetary protection policy.

Staying ahead of the curve

With a scientist directing the core, the BMC aims to enable researchers to “take the best advantage of systems biology methods,” says Levine. These methods use advanced research technologies to do things like prepare large sets of DNA and RNA for sequencing, read DNA and RNA sequences from single cells, and localize gene expression to specific tissues.

Levine presents a lightweight, clear rectangle about the width of a cell phone and the length of a VHS cassette.

“This is a flow cell that can do 20 human genomes to clinical significance in two days — 8 billion reads,” he says. “There are newer instruments with several times that capacity available as well.”
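
Those throughput figures square with quick arithmetic. The back-of-envelope sketch below assumes 2 x 150-base paired-end reads and a roughly 3.1-billion-base human genome — both assumptions, neither stated in the quote:

    # Back-of-envelope check of the quoted flow-cell throughput.
    # Assumed (not from the article): paired-end 2 x 150 bp reads and a
    # ~3.1-billion-base human genome.
    reads = 8e9                # "8 billion reads" per two-day run
    bases_per_read = 2 * 150   # assumed paired-end chemistry
    genomes = 20               # genomes per run, per the quote
    genome_size = 3.1e9        # approximate human genome, in bases

    coverage = reads * bases_per_read / (genomes * genome_size)
    print(f"~{coverage:.0f}x coverage per genome")  # ~39x, above the ~30x
                                                    # often cited for clinical-
                                                    # grade whole-genome sequencing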

The vast majority of research labs do not need that kind of power, but the Institute, and its researchers as a whole, certainly do. Levine emphasizes that “the ROI [return on investment] for supporting shared resources is extremely high because whatever support we receive impacts not just one lab, but all of the labs we support. Keeping MIT’s shared resources at the bleeding edge of science is critical to our ability to make a difference in the world.”

To stay at the edge of research technology, Levine maintains relationships with biotech companies, while his scientific understanding allows him to educate researchers on what is possible in the space of modern systems biology. Altogether, these attributes enable Levine to help his researcher clients “push the limits of what is achievable.”

The man behind the machines

Each core facility operates like a small business, offering specialized services to a diverse client base across academic and industry research, according to Amy Keating, Jay A. Stein (1968) Professor of Biology and head of the Department of Biology. She explains that “the PhD-level education and scientific and technological expertise of MIT’s core directors are critical to the success of life science research at MIT and beyond.” 

While Levine clearly has the education and expertise, the success of the BMC “business” is also in part due to his tenacity and focus on results for the core’s users.

He was recognized by the Institute with the MIT Infinite Mile Award in 2015 and the MIT Excellence Award in 2017, for which one nominator wrote, “What makes Stuart’s leadership of the BMC truly invaluable to the MIT community is his unwavering dedication to producing high-quality data and his steadfast persistence in tackling any type of troubleshooting needed for a project. These attributes, fostered by Stuart, permeate the entire culture of the BMC.”      

“He puts researchers and their research first, whether providing education, technical services, general tech support, or networking to collaborators outside of MIT,” says Noelani Kamelamela, lab manager of the BMC. “It’s all in service to users and their projects.”

Tucked into the far back corner of the BMC lab space, Levine’s office is a fitting symbol of his humility. While his guidance and knowledge sit at the center of what elevates the BMC beyond technical support, he himself sits away from the spotlight, resolutely supporting others to advance science.

“Stuart has always been the person, often behind the scenes, that pushes great science, ideas, and people forward,” Boyer says. “His knowledge and advice have truly allowed us to be at the leading edge in our work.”

A software platform streamlines emergency response

Wed, 03/19/2025 - 4:30pm

Wildfires set acres ablaze. Earthquakes reduce towns to rubble. People go missing in mountains and bodies of water. Coronavirus cases surge globally.

When disaster strikes, timely, cohesive emergency response is crucial to saving lives, reducing property and resource loss, and protecting the environment. Large-scale incidents can call into action thousands of first responders from multiple jurisdictions and agencies, national and international. To effectively manage response, relief, and recovery efforts, they must work together to collect, process, and distribute accurate information that is often locked in disparate, incompatible systems. This lack of interoperability can hinder coordination and ultimately result in significant failures in disaster response.

MIT Lincoln Laboratory developed the Next-Generation Incident Command System (NICS) to enable first responders across different jurisdictions, agencies, and countries to effectively coordinate during emergencies of any scale. Originally intended to help U.S. firefighters respond to wildfires, NICS has since evolved from an R&D prototype into an open-source operational platform adopted by emergency-response agencies worldwide, not only for natural disaster response but also for search-and-rescue operations, health crisis management, public event security, and aviation safety. The global community of users cultivated by NICS, and spinouts inspired by NICS, have maximized its impact.

At the core of the web-based NICS software tool is an incident map overlaying aggregated data from various external and internal sources such as first responders on the ground, airborne imaging sensors, weather and traffic reports, census data, and satellite-based maps; virtually any data source can be added. Emergency personnel upload the content directly on a computer or mobile app and communicate in real time through voice and chat functions. Role-based collaboration rooms are available for user-defined subsets of first responders to focus on a particular activity — such as air drop support, search and rescue, and wildlife rescue — while maintaining access to the comprehensive operational picture.
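
In software terms, this is a pluggable feed-aggregation pattern. The sketch below is purely illustrative — every name in it is invented for exposition, and none comes from the NICS codebase:

    from dataclasses import dataclass, field
    from typing import Callable

    # Hypothetical sketch of the aggregation pattern described above: an
    # incident map overlays items from arbitrary registered data sources.
    @dataclass
    class IncidentMap:
        feeds: dict = field(default_factory=dict)

        def register_feed(self, name: str, fetch: Callable):
            self.feeds[name] = fetch  # virtually any data source can be added

        def overlay(self):
            # Merge every feed into one operational picture, tagging each
            # item with the source it came from.
            return [dict(item, source=name)
                    for name, fetch in self.feeds.items()
                    for item in fetch()]

    incident = IncidentMap()
    incident.register_feed("weather", lambda: [{"type": "wind", "mph": 40}])
    incident.register_feed("ground", lambda: [{"type": "position", "unit": 7}])
    print(incident.overlay())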

With its open-standards architecture, NICS interoperates with organizations' existing systems and allows internal data to be shared externally for enhanced visibility and awareness among users as a disaster unfolds. The modular design of NICS facilitates system customization for diverse user needs and changing mission requirements. The system archives all aspects of a created incident and can generate reports for post-incident analysis to inform future response planning. 

Partnering with first responders

As a federally funded research and development (R&D) center, Lincoln Laboratory has a long history of conducting R&D of architectures for information sharing, situational awareness, and decision-making in support of the U.S. Department of Defense and other federal entities. Recognizing that aspects of these architectures are relevant to disaster response, Lincoln Laboratory's Technology Office initiated a study in 2007 focused on wildfire response in California. A laboratory-led research team partnered with the California Department of Forestry and Fire Protection (CAL FIRE), which annually responds to thousands of wildfires in collaboration with police, medical, and other services.

"CAL FIRE provided firsthand insight into what information is critical during emergency response and how may be best to view and share this information," says NICS co-developer Gregory Hogan, now associate leader of the laboratory's Advanced Sensors and Techniques Group.

With this insight, the laboratory developed and demonstrated a prototype of NICS. Noting the utility of such a system, the U.S. Department of Homeland Security (DHS) Science and Technology Directorate (S&T) began funding the R&D of NICS in 2010. Over the next several years, the laboratory team refined NICS, soliciting input from an organically formed users' group comprising more than 450 organizations across fire, law, medical, emergency services and management, border patrol, industry, utilities, nongovernmental organizations, and tribal partners. Thousands of training exercises and real emergencies employed NICS to coordinate diverse emergency-response activities spanning disaster management, law enforcement, and special security.

In 2014, CAL FIRE — which had been using NICS to respond to wildfires, mudslides, and floods — officially adopted NICS statewide. That same year, the Emergency Management Directorate of the Australian state of Victoria implemented NICS (as the Victorian Information Network for Emergencies, or VINE) after a worldwide search for a system to manage large-scale crises like bush fires.

In 2015, NICS was transferred to the California Governor's Office of Emergency Services, which deployed it statewide in 2016 as the Situation Awareness and Collaboration Tool (SCOUT) for emergency responders and law enforcement officials.

Creating an open-source community

NICS also spawned an initial spinout company formed by personnel from CAL FIRE, the Worldwide Incident Command Services (WICS), which received a license for the system's software code in early 2015. WICS is a California-incorporated nonprofit public benefit corporation and the official DHS S&T Technology Transition Partner created to transition the NICS R&D project to a robust operational platform, which was named Raven. Later that year, DHS S&T made NICS available worldwide at no cost to first responder and emergency management agencies through an open-source release of the software code base on GitHub.

Sponsorship of NICS by DHS S&T is ongoing, with contributions over the years from the U.S. Coast Guard (USCG) Research and Development Center and the NATO Science for Peace and Security (SPS) Program. In 2015, the USCG funded the development of the cross-platform mobile app Portable Handset Integrated NICS (PHINICS), which enables first responders to access NICS with or without cellular coverage.

In 2016, Lincoln Laboratory and DHS S&T launched a four-year partnership with the NATO SPS Program to extend NICS to Bosnia and Herzegovina (BiH), Croatia, North Macedonia, and Montenegro for enhanced emergency collaboration among and within these Western Balkan nations. Under this Advanced Regional Civil Emergency Coordination Pilot, NICS was demonstrated in dozens of field exercises and applied to real-life incidents, including wildfires in BiH and a 6.2-magnitude earthquake in Croatia. In 2019, North Macedonia adopted NICS as its official crisis management system. And, when Covid-19 struck, NICS entered a new application space: public health. In North Macedonia, emergency institutions used NICS not only to coordinate emergency response but also to inform residents about infection cases and health resource locations. The laboratory team worked with North Macedonia's Crisis Management Center to enable national public access to NICS.

Increasing global impact

NICS' reach continues to grow. In 2021, the Massachusetts Department of Transportation Aeronautics Division and the U.S. Department of Transportation Volpe National Transportation Systems Center collaborated with Lincoln Laboratory using the baseline NICS system to field a new web-based tool: the Commonwealth aiRspace and Information Sharing Platform (CRISP). Integrating sensor feeds, airspace information, and resource data, CRISP enables a robust counter–small uncrewed aircraft systems mission for the safety and security of aviation and aviation-related activities throughout the Commonwealth of Massachusetts.

"The NICS project has demonstrated the power of collaborative development, in which each partner lends their expertise, resulting in a meaningful contribution to the global disaster response community," says Stephanie Foster, who was the lead developer and program manager of NICS.

In 2023, Foster co-founded the spinout company Generation NYX to increase access to NICS, renamed NYX DEFENDER, and create a community of users who work together to advance its capabilities. Generation NYX offers services to existing users established during the laboratory's R&D work, and provides a software-as-a-service solution for all new users. NYX DEFENDER improves the ability of local emergency management organizations to manage events such as parades and festivals; supports decision-making during floods and other natural disasters; and expands awareness among community stakeholders such as police, fire, and state officials. 

"NYX DEFENDER offers an innovative tool for local emergency management and public safety agencies and departments to create a common operating picture and foster interoperability, improve communications, and develop and maintain situational awareness during preplanned and no-notice events," says Clara Decerbo, director at the Providence Emergency Management Agency. "Our use of NYX DEFENDER during major City of Providence events has allowed us to integrate situational awareness between multiple public safety entities, private security, and event organizers and assisted us in ensuring our teams have the information they need to provide well-organized and coordinated public safety services to members of our community and visitors."

Generation NYX was recently subcontracted to provide support for a new three-year project that NATO SPS and DHS S&T kicked off earlier this year with the laboratory to establish NICS as the national disaster management platform in BiH. Foster has experience in this area, as she not only led the laboratory technical team who successfully adapted and deployed NICS in the Western Balkans under the 2016 SPS pilot, but also coordinated teams across the four nations. Though BiH participated in the 2016 SPS pilot, this latest effort seeks to expand NICS' adoption more broadly across the country, working within its complex multilevel government structure. NATO SPS is funding a second project, which began in October 2024, that will bring NICS to Albania and Georgia for use in search and rescue, particularly in response to chemical, biological, radiological, and nuclear events. For both projects, the laboratory team will enhance the open-source NICS code to operate on the edge (i.e., in disconnected communication scenarios) and integrate wearables for monitoring the health of first responders.

Since the open-source release on GitHub, NICS' worldwide usage has continued to grow for a wide range of applications. NICS has been used to locate missing persons in the Miljacka and Bosna Rivers in BiH; to direct ambulances to hypothermic runners at the Los Angeles Marathon; and to provide situational awareness among the National Guard for the Fourth of July celebration in Boston, Massachusetts. NICS has also proven its utility in mine and unexploded ordnance detection and clearance activities; in BiH, an estimated 80,000 explosive remnants of war pose a direct threat to the country's residents. Envisioned applications of NICS include monitoring of critical infrastructure such as utilities.

In recognition of its broader humanitarian impact, NICS was awarded a 2018 Excellence in Technology Transfer Award, Northeast Region, from the Federal Laboratory Consortium and a 2019 IEEE Innovation in Societal Infrastructure Award.

"NICS is a mature product, so what we are thinking about now is outside-the-box use cases for the technology," says the laboratory's Bioanalytics Systems and Technologies Group Leader Kajal Claypool, who is supervising the ongoing NATO SPS and DHS S&T projects. "That is where I see Lincoln Laboratory can bring innovation to bear."

Security scheme could protect sensitive data during cloud computation

Wed, 03/19/2025 - 12:00am

A hospital that wants to use a cloud computing service to perform artificial intelligence data analysis on sensitive patient records needs a guarantee those data will remain private during computation. Homomorphic encryption is a special type of security scheme that can provide this assurance.

The technique encrypts data in such a way that anyone can perform computations on it without decrypting it, preventing others from learning anything about the underlying patient records. However, there are only a few ways to achieve homomorphic encryption, and they are so computationally intensive that it is often infeasible to deploy them in the real world.

MIT researchers have developed a new theoretical approach to building homomorphic encryption schemes that is simple and relies on computationally lightweight cryptographic tools. Their technique combines two tools so they become more powerful than either would be on its own. The researchers leverage this to construct a “somewhat homomorphic” encryption scheme — that is, it enables users to perform a limited number of operations on encrypted data without decrypting it, as opposed to fully homomorphic encryption that can allow more complex computations.

This somewhat homomorphic technique can capture many applications, such as private database lookups and private statistical analysis.

While this scheme is still theoretical, and much work remains before it could be used in practice, its simpler mathematical structure could make it efficient enough to protect user data in a wider range of real-world scenarios.

“The dream is that you type your ChatGPT prompt, encrypt it, send the encrypted message to ChatGPT, and then it can produce outputs for you without ever seeing what you are asking it,” says Henry Corrigan-Gibbs, the Douglas Ross Career Development Professor of Software Technology in the MIT Department of Electrical Engineering and Computer Science (EECS) and a co-author of a paper on this security scheme. “We are a long way from getting there, in part because these schemes are so inefficient. In this work, we wanted to try to build homomorphic encryption schemes that don’t use the standard tools, since different approaches can often lead to more efficient, more practical constructions.”

His co-authors include Alexandra Henzinger, an EECS graduate student; Yael Kalai, an Ellen Swallow Richards (1873) Professor and professor of EECS; and Vinod Vaikuntanathan, the Ford Professor of Engineering and a principal investigator in the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). The research will be presented at the International Conference on the Theory and Applications of Cryptographic Techniques.

Balancing security and flexibility

MIT researchers began theorizing about homomorphic encryption in the 1970s. But designing the mathematical structure needed to securely embed a message in a manner flexible enough to enable computation proved to be enormously challenging. The first fully homomorphic encryption scheme wasn’t designed until 2009.

“These two requirements are very much in tension. On the one hand, we need security, but on the other hand, we need this flexibility in the homomorphism. We have very few mathematical pathways to get there,” says Henzinger.

Essentially, homomorphic schemes add noise to a message to encrypt it. As algorithms and machine-learning models perform operations on that encrypted message, the noise inevitably grows. If one computes for too long, the noise can eventually overshadow the message.

“If you run a deep neural network on these encrypted data, for instance, by the time you get to the end of the computation, the noise might be a billion times larger than the message and you can’t actually figure out what the message says,” Corrigan-Gibbs explains.

There are two main ways to get around this problem. A user could keep operations to a minimum, but this restricts how the encrypted data can be used. On the other hand, a user could add extra steps to reduce noise, but these techniques require a massive amount of additional computation.

Somewhat homomorphic encryption seeks to meet users somewhere in the middle. They can use the technique to perform secure operations on encrypted data using a specific class of functions that keep the noise from growing out of hand.

These functions, known as bounded polynomials, are designed to prevent excessively complex operations. For instance, the functions allow many additions, but only a few multiplications on encrypted data to avoid generating too much extra noise.
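
A tiny simulation makes that asymmetry concrete. This is not the researchers’ scheme — just a sketch, under the assumption that adding ciphertexts adds their noise while multiplying ciphertexts multiplies it:

    # Toy noise-budget simulation (not the actual scheme). Additions grow
    # noise linearly; multiplications grow it multiplicatively, so only a
    # bounded number of multiplications fit under the decryption ceiling.
    FRESH = 4         # assumed noise in a freshly encrypted ciphertext
    CEILING = 10**9   # assumed noise level at which decryption fails

    noise = FRESH
    for _ in range(30):          # 30 additions barely move the noise
        noise += FRESH
    print("after 30 additions:", noise)              # 124

    noise, depth = FRESH, 0
    while noise < CEILING:       # each multiplication compounds the noise
        noise *= FRESH
        depth += 1
    print("multiplications before failure:", depth)  # 14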

Greater than the sum of their parts

The researchers built their scheme by combining two simple cryptographic tools. They started with a linear homomorphic encryption scheme, which can only perform additions on encrypted data, and added one theoretical assumption to it.

This cryptographic assumption “lifts” the linear scheme into a somewhat homomorphic one that can operate with a broader class of more complex functions.

“On its own, this assumption doesn’t give us much. But when we put them together, we get something much more powerful. Now, we can do additions and some bounded number of multiplications,” Henzinger says.

The process for performing homomorphic operations is quite simple. The researchers’ scheme encrypts each piece of data into a matrix in such a way that the matrix provably hides the underlying data. Then, to perform additions or multiplications on those encrypted data, one only needs to add or multiply the corresponding matrices.
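
The actual construction is more subtle, but a deliberately insecure toy shows those mechanics. In the sketch below — invented for illustration, not taken from the paper — each value sits on a matrix diagonal, so matrix addition adds the hidden values and matrix multiplication multiplies them; a real scheme would, of course, also hide the diagonal:

    import numpy as np

    # Deliberately insecure toy, mechanics only: the message m rides on the
    # diagonal, and the off-diagonal entry stands in for randomness/noise.
    def encrypt(m, rng):
        return np.array([[m, rng.integers(1, 100)],
                         [0, m]], dtype=np.int64)

    def decrypt(C):
        return int(C[0, 0])  # read the message back off the diagonal

    rng = np.random.default_rng(0)
    a, b = encrypt(6, rng), encrypt(7, rng)
    print(decrypt(a + b))  # 13 -> adding matrices adds the messages
    print(decrypt(a @ b))  # 42 -> multiplying matrices multiplies them

Note that the off-diagonal “noise” term compounds with each matrix multiplication, loosely echoing the noise growth described earlier.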

The researchers used mathematical proofs to show that their theoretical encryption scheme provides guaranteed security when operations are limited to this class of bounded polynomial functions.

Now that they have developed this theoretical approach, one next step will be making it practical for real-world applications. For that, they will need to make the encryption scheme fast enough to run certain types of computations on modern hardware.

“We haven’t spent 10 years trying to optimize this scheme yet, so we don’t know how efficient it could get,” Henzinger says.

In addition, the researchers hope to expand their technique to allow more complex operations, perhaps moving closer to developing a new approach for fully homomorphic encryption.

“The exciting thing for us is that, when we put these two simple things together, something different happened that we didn’t expect. It gives us hope. What else can we do now? If we add something else, maybe we can do something even more exciting,” Corrigan-Gibbs says.

This research was funded, in part, by Apple, Capital One, Facebook, Google, Mozilla, NASDAQ, MIT’s FinTech@CSAIL Initiative, the National Science Foundation (NSF), and a Simons Investigator Award.

David Schmittlein, influential dean who brought MIT Sloan into its own, dies at 69

Tue, 03/18/2025 - 8:00pm

David Schmittlein, an MIT professor of marketing and the MIT Sloan School of Management’s longest-serving dean and a visionary and transformational leader, died March 13, following a long illness. He was 69.

Schmittlein, the John C Head III Dean from 2007 to 2024, guided MIT Sloan through a financial crisis, a global pandemic, and numerous school-wide milestones. During those 17 years, Schmittlein led initiatives introducing several new degree programs, redesigning the academic program portfolio while maintaining the MBA as the flagship degree, and diversifying executive offerings. Under his guidance, the school enhanced alumni engagement, increased philanthropic support, expanded the faculty, oversaw numerous campus capital projects, and opened several international programs. He also championed a centennial celebration of Course 15 — MIT’s designation for management — and led a branding and marketing effort that cemented MIT Sloan’s reputation as a place for smart, open, grounded, and inventive leaders.

In all, he brought MIT Sloan’s value to managers, organizations, and the world into clear focus, positioning and preparing the school to lead in a new era of management education.

“Dave transformed the MIT Sloan School of Management from a niche player to a top five business school and, in the process, drew us closer to the Institute in ways that all of the faculty, staff, and students welcome and support,” says MIT professor of finance Andrew W. Lo. “He greatly expanded our visibility internationally [and] also expanded our footprint from a research and educational and outreach perspective. Really, it gave us the opportunity to define ourselves in ways that we weren’t doing prior to his joining.”

In a letter to the MIT community, President Sally Kornbluth wrote, “Dave helped build MIT Sloan’s reputation and impact around the globe, worked with faculty to create first-rate new management education programs, and substantially improved current students’ educational opportunities.”

Kornbluth, who was appointed MIT president in 2023, noted that she didn’t overlap with Schmittlein for very long before he stepped down in February 2024 due to his illness. But during that year, his “wise, funny, judicious counsel left a lasting impression,” Kornbluth wrote. “I knew I could always call on him as a sounding board and thought partner, and I did.”

Professor Georgia Perakis, who was appointed the John C Head III Dean (Interim) when Schmittlein left last year, says, “Dave was not only an incredible leader for MIT Sloan, but also a mentor, teacher, and friend. Under his leadership, he took MIT Sloan to new heights. I will always be grateful for his guidance and support during my time as interim dean. I know the legacy of his contributions to MIT and MIT Sloan will always stay with us.”

Before coming to MIT Sloan, Schmittlein was a professor of marketing and deputy dean at the Wharton School of the University of Pennsylvania, where he spent 27 years. Schmittlein, who grew up in Northampton, Massachusetts, viewed his appointment as the eighth dean of MIT Sloan as a homecoming in 2007.

From modest roots, and the oldest of six siblings, Schmittlein graduated from Brown University, where he earned a BA in mathematics, and Columbia University, where he was awarded both an MPhil in business and a PhD in marketing.

“Growing up in Massachusetts, MIT was always an icon for me,” Schmittlein later wrote.

“MIT picks an outsider to lead Sloan School”

As The Boston Globe headline announcing his arrival made clear, Schmittlein’s appointment as dean was unusual. He was the first to come from outside MIT since the school’s founding dean, E. Pennell Brooks, was appointed. But, in 2007, Institute leadership determined that there was a need for a fresh perspective at MIT Sloan.

“While most of Dean Schmittlein’s MIT predecessors had risen through the MIT faculty ranks, I directed the search committee to search broadly to identify a leader who could amplify the MIT Sloan School’s impact and extend its reach,” says President Emerita Susan Hockfield, who led MIT from 2004 to 2012. “David Schmittlein emerged with his unusual combination of cerebral and collaborative talents, along with his academic experience at the highest level.”

By the time Schmittlein arrived, the MIT Sloan School, which had its origins in 1914 as an undergraduate major called Engineering Administration, was at an exciting crossroads. Schmittlein’s predecessor, Richard Schmalensee, who had served as dean for nearly a decade, had secured donor funding for the construction of a new central building and established a concise mission statement that would guide the school in the coming decades. MIT’s management school was at a point of reflection and growth.

“I acknowledged head-on that I was coming from a very different school — not to change MIT, but to help it be the best version of its distinctive self,” Schmittlein wrote recently.

Schmittlein quickly identified several critical tasks. In 2007, the school had a group of 96 tenure-line faculty members, but they often left for peer schools, and the small faculty size meant that one person’s exit affected an entire department. There was no real mechanism for highlighting MIT Sloan expert faculty insights. The flagship MBA program was successful, but had challenges with selectivity and scale. And the comparatively small class size meant that the alumni community was challenged in networking, particularly in finance.

Financial crisis and MFin degree

Schmittlein collaborated with the school’s finance faculty to launch the Master of Finance degree program in 2008. Nobel laureate Robert C. Merton, who had begun his career at MIT Sloan but had decamped to Harvard University, returned to the school in 2010 to be involved in the one-year program. Today, the MFin program — known for its selectivity and rigor — offers a range of quantitative courses and features an 18-month option in addition to the original one-year curriculum.

Schmittlein’s arrival at MIT coincided with the global financial crisis of 2007–09. “The entire Institute was reeling from the meltdown,” Lo remembers. “We had to respond … and one of the most impressive things Dave did was to acknowledge the problems with the financial crisis and the financial system. But instead of de-emphasizing finance, he encouraged the finance group to do research on the crisis and to come up with a better version of finance that acknowledged these potential dangers.”

In turn, program enrollment increased, and “a number of our students ultimately went off to regulatory positions, as well as to industry, with a new knowledge of how to deal with financial crises more systematically,” Lo says.

Expansion of executive and other degree programs

By 2010, the long-standing full-time MIT Sloan Fellows MBA program was attracting mid-career leaders and managers from around the world to MIT Sloan. That year, Schmittlein shepherded the launch of the 20-month part-time MIT Executive MBA program. This program opened up more opportunities for U.S.-based executives to earn a degree without having to leave their jobs for a full-time program.

Next, MIT Sloan launched the Master of Science in Management Studies program, which allowed graduates and current students from several international partner schools, including Fudan University and Tsinghua University in China, to earn a master’s degree from MIT in nine months.

Rounding out the portfolio of academic programs introduced during Schmittlein’s tenure is the MIT Sloan Master of Business Analytics program, launched in 2016. The program, which bridged MIT Sloan’s classes with MIT’s offerings in computer science, became one of the most competitive master’s degree programs at the Institute.

One distinction for MIT Sloan was “its integration with the university within which it lives,” Schmittlein said in a 2008 interview. “We are different from other schools in that regard. Most other leading schools of management wall off their teaching programs and their research programs from the rest of the university. We simply don’t do that.”

“MIT Sloan in 2025 is very much ‘the house that Dave built,’” says Professor Ezra W. Zuckerman Sivan.

“This is nothing short of astonishing, given that Dave came to Sloan from another business school with a distinct mission and culture … What’s more, Sloan was hardly broken — it had several strong deans leading up to Dave’s arrival, a sterling reputation, and very proud traditions,” Zuckerman Sivan says.

Zuckerman Sivan, who served as MIT Sloan’s deputy dean and then as an associate dean for teaching and learning from 2015 to 2021, says it was a tremendous privilege to work for Schmittlein, and he notes that Schmittlein often saw potential in others before they saw it in themselves, including him.

“Personally, I hadn’t given a thought to becoming a dean … when Dave popped the question to me. I’m so glad he did, though, because I learned so much from the experience, not least from being able to consult with Dave and see how he thought about different managerial challenges,” Zuckerman Sivan says.

Faculty, capital projects, and international ties

Schmittlein invested in faculty compensation, and by 2012 the MIT Sloan faculty count had grown to 112.

“Dave recognized early on that growth was essential for Sloan to retain and recruit the very best faculty,” Zuckerman Sivan says. “And every move he made, especially with regard to the degree programs, was done in close and deliberate collaboration with faculty leaders. This was absolutely key. He got senior faculty at Sloan on board with the moves that he had recognized were essential for the school, such that now the moves seem obvious and organic.”

Schmittlein also oversaw several capital projects, some of which were already underway when he joined MIT Sloan. When Building E62 opened in 2010, for the first time in history all of MIT Sloan’s faculty members were housed under one roof. The LEED Gold-certified building also included six new classrooms and an executive education suite. Following that, the landmark historic buildings E60 and E52 were renovated and refreshed.

President Emerita Hockfield says that Schmittlein advanced the school in many dimensions. One area that resonates with her was his agility in building and maintaining relationships with international partners and donors. During Schmittlein’s tenure, the MIT Sloan Latin America Office opened in Santiago, Chile, in 2013, and the Asia School of Business was launched in Kuala Lumpur, Malaysia, in 2015. Schmittlein also helped to lay the groundwork for the launch of the MIT Sloan Office for Southeast Asian Nations, which opened in October 2024 in Bangkok.

The international collaborations increased the school’s visibility throughout the world. Hockfield notes that those international relationships benefited MIT Sloan students.

“For any leader today — being able to foster international relationships has to be a critical part of anyone’s toolkit,” she says. “And [for MIT Sloan students] to see that up close and personal, they can understand how they can make that happen as business leaders.”

Indeed, some MIT Sloan students were introduced firsthand to global business leaders under the guidance of both Hockfield and Schmittlein, who, for the past several years, co-taught an elective course, Corporations at the Crossroads, that featured guest speakers discussing management, strategy, and leadership.

“It was inspiring and just a lot of fun to teach that course with him … Dave possessed the wonderful combination of a brilliant intellect and a profound kindness. While he generously shared both, he more eagerly shared his kindness than his brilliance,” Hockfield says.

Ideas Made to Matter

During Schmittlein’s tenure, MIT Sloan launched a brand identity project with new messaging and the tagline “Ideas Made to Matter,” accompanied by a new website and logo. In the early 2000s, at Wharton, he had championed the online business journal Knowledge at Wharton, which went on to be a standout thought leadership publication. At MIT Sloan, he likewise championed the launch of Ideas Made to Matter, a publication bringing practical insights from the school’s faculty to global business leaders.

Hockfield recalls how Schmittlein deftly brought marketing insights to MIT Sloan. “He really understood organizational communications … and he was brilliant [at getting the MIT Sloan story out] with just the right tone,” she says.

Legacy: Principled, innovative leaders who improve the world

Lo says that Schmittlein embodied the example of a principled leader. “He was not only an amazing leader, but he was an amazing human being. He inspired all of us, and will continue to inspire all of us for years to come,” he says.

“Dave gave the Sloan School and MIT a great gift,” Lo continues. “We are now perfectly positioned to reach the next inflection point of changing the role of management education, not only at MIT but around the world.”

Hockfield says, “One of the things I deeply admired about Dave is that his personal ambitions were always secondary or tertiary to his ambitions for the school, the faculty, and the students. And that’s just a wonderful thing to behold. It brings out the best in people … I’m just so grateful that MIT had the benefit of his brilliance and curiosity for the time that we did. It’s a huge loss.”

“We are heartbroken,” MIT Provost Cynthia Barnhart says. “For nearly 17 years, the MIT community relied on and benefited from Dave Schmittlein’s inspiring vision, skillful leadership, and kind and collaborative nature. He worked tirelessly to advance MIT Sloan’s mission of developing principled, innovative leaders, all while strengthening the school’s ties to the rest of campus and building partnerships across the country and globe. He will be deeply missed by his friends and colleagues at MIT.”

Schmittlein continually searched for ways to invent and innovate. He often quoted Alfred P. Sloan, the original benefactor of MIT Sloan, who said in 1964, “I hope we all recognize that the Alfred P. Sloan School of Management is not finished. It never will be finished. It is only on its way. Nothing is finished in a world that is moving so rapidly forward …”

Schmittlein is survived by his wife of nearly 33 years, Barbara Bickart, and their children, Brigitte Schmittlein and Gabriel Schmittlein, as well as his siblings, in-laws, several nieces and nephews, and a host of lifelong friends and colleagues.

MIT Sloan is developing plans for a future celebration of Schmittlein’s life, with details for the community to come. To learn more about his life and contributions, read his obituary online.
