Feed aggregator

Shaping the future through systems thinking

MIT Latest News - Tue, 05/27/2025 - 3:20pm

Long before she stepped into a lab, Ananda Santos Figueiredo was stargazing in Brazil, captivated by the cosmos and feeding her curiosity about science through pop culture, books, and the internet. She was drawn to astrophysics for its blend of visual wonder and mathematics.

Even as a child, Santos sensed her aspirations reaching beyond the boundaries of her hometown. “I’ve always been drawn to STEM,” she says. “I had this persistent feeling that I was meant to go somewhere else to learn more, explore, and do more.”

Her parents saw their daughter’s ambitions as an opportunity to create a better future. The summer before her sophomore year of high school, her family moved from Brazil to Florida. She recalls that moment as “a big leap of faith in something bigger and we had no idea how it would turn out.” She was certain of one thing: She wanted an education that was both technically rigorous and deeply expansive, one that would allow her to pursue all her passions.

At MIT, she found exactly what she was seeking in a community and curriculum that matched her curiosity and ambition. “I’ve always associated MIT with something new and exciting that was grasping towards the very best we can achieve as humans,” Santos says, emphasizing the use of technology and science to significantly impact society. “It’s a place where people aren’t afraid to dream big and work hard to make it a reality.”

As a first-generation college student, she carried the weight of financial stress and the uncertainty that comes with being the first in her family to navigate college in the U.S. But she found a sense of belonging in the MIT community. “Being a first-generation student helped me grow,” she says. “It inspired me to seek out opportunities and help support others too.”

She channeled that energy into student government roles for the undergraduate residence halls. Through Dormitory Council (DormCon) and her dormitory, Simmons Hall, her voice could help shape life on campus. She began serving as reservations chair for her dormitory but ended up becoming president of the dormitory before being elected dining chair and vice president for DormCon. She’s worked to improve dining hall operations and has planned major community events like Simmons Hall’s 20th anniversary and DormCon’s inaugural Field Day.

Now, a senior about to earn her bachelor’s degree, Santos says MIT’s motto, “mens et manus” — “mind and hand” — has deeply resonated with her from the start. “Learning here goes far beyond the classroom,” she says. “I’ve been surrounded by people who are passionate and purposeful. That energy is infectious. It’s changed how I see myself and what I believe is possible.”

Charting her own course

Initially a physics major, Santos’ academic path took a turn after a transformative internship with the World Bank’s data science lab between her sophomore and junior years. There, she used her coding skills to study the impacts of heat waves in the Philippines. The experience opened her eyes to the role technology and data can play in improving lives and broadened her view of what a STEM career could look like.

“I realized I didn’t want to just study the universe — I wanted to change it,” she says. “I wanted to join systems thinking with my interest in the humanities, to build a better world for people and communities.”

When MIT launched a new major in climate system science and engineering (Course 1-12) in 2023, Santos was the first student to declare it. The interdisciplinary structure of the program, blending climate science, engineering, energy systems, and policy, gave her a framework to connect her technical skills to real-world sustainability challenges.

She tailored her coursework to align with her passions and career goals, applying her physics background (now her minor) to understand problems in climate, energy, and sustainable systems. “One of the most powerful things about the major is the breadth,” she says. “Even classes that aren’t my primary focus have expanded how I think.”

Hands-on fieldwork has been a cornerstone of her learning. During MIT’s Independent Activities Period (IAP), she studied climate impacts in Hawai’i through Course 1.091 (Traveling Research Environmental Experiences, or TREX). This year, she studied the design of sustainable polymer systems in Course 1.096/10.496 (Design of Sustainable Polymer Systems) under MISTI’s Global Classroom program. That IAP class brought her to the middle of the Amazon Rainforest to see what the future of plastic production could look like using products from the Amazon. “That experience was incredibly eye-opening,” she explains. “It helped me build a bridge between my own background and the kind of problems that I want to solve in the future.”

Santos also found enjoyment beyond labs and lectures. A member of the MIT Shakespeare Ensemble since her first year, she took to the stage in her final spring production of “Henry V,” performing as both the Chorus and Kate. “The ensemble’s collaborative spirit and the way it brings centuries-old texts to life has been transformative,” she adds.

Her passion for the arts also intersected with her interest in the MIT Lecture Series Committee. She helped host a special screening of the film “Sing Sing,” in collaboration with MIT’s Educational Justice Institute (TEJI). That connection led her to enroll in a TEJI course, illustrating the surprising and meaningful ways that different parts of MIT’s ecosystem overlap. “It’s one of the beautiful things about MIT,” she says. “You stumble into experiences that deeply change you.”

Throughout her time at MIT, the community of passionate, sustainability-focused individuals has been a major source of inspiration. She’s been actively involved with the MIT Office of Sustainability’s decarbonization initiatives and participated in the Climate and Sustainability Scholars Program.

Santos acknowledges that working in sustainability can sometimes feel overwhelming. “Tackling the challenges of sustainability can be discouraging,” she says. “The urgency to create meaningful change in a short period of time can be intimidating. But being surrounded by people who are actively working on it is so much better than not working on it at all.”

Looking ahead, she plans to pursue graduate studies in technology and policy, with aspirations to shape sustainable development, whether through academia, international organizations, or diplomacy.

“The most fulfilling moments I’ve had at MIT are when I’m working on hard problems while also reflecting on who I want to be, what kind of future I want to help create, and how we can be better and kinder to each other,” she says. “That’s what excites me — solving real problems that matter.”

New fuel cell could enable electric aviation

MIT Latest News - Tue, 05/27/2025 - 11:00am

Batteries are nearing their limits in terms of how much energy they can store for a given weight. That’s a serious obstacle for energy innovation and the search for new ways to power airplanes, trains, and ships. Now, researchers at MIT and elsewhere have come up with a solution that could help electrify these transportation systems.

Instead of a battery, the new concept is a kind of fuel cell — which is similar to a battery but can be quickly refueled rather than recharged. In this case, the fuel is liquid sodium metal, an inexpensive and widely available commodity. The other side of the cell is just ordinary air, which serves as a source of oxygen atoms. In between, a layer of solid ceramic material serves as the electrolyte, allowing sodium ions to pass freely through, and a porous air-facing electrode helps the sodium to chemically react with oxygen and produce electricity.

In a series of experiments with a prototype device, the researchers demonstrated that this cell could carry more than three times as much energy per unit of weight as the lithium-ion batteries used in virtually all electric vehicles today. Their findings are being published today in the journal Joule, in a paper by MIT doctoral students Karen Sugano, Sunil Mair, and Saahir Ganti-Agrawal; professor of materials science and engineering Yet-Ming Chiang; and five others.

“We expect people to think that this is a totally crazy idea,” says Chiang, who is the Kyocera Professor of Ceramics. “If they didn’t, I’d be a bit disappointed because if people don’t think something is totally crazy at first, it probably isn’t going to be that revolutionary.”

And this technology does appear to have the potential to be quite revolutionary, he suggests. In particular, for aviation, where weight is especially crucial, such an improvement in energy density could be the breakthrough that finally makes electrically powered flight practical at significant scale.

“The threshold that you really need for realistic electric aviation is about 1,000 watt-hours per kilogram,” Chiang says. Today’s electric vehicle lithium-ion batteries top out at about 300 watt-hours per kilogram — nowhere near what’s needed. Even at 1,000 watt-hours per kilogram, he says, that wouldn’t be enough to enable transcontinental or trans-Atlantic flights.

That’s still beyond reach for any known battery chemistry, but Chiang says that getting to 1,000 watt-hours per kilogram would be an enabling technology for regional electric aviation, which accounts for about 80 percent of domestic flights and 30 percent of the emissions from aviation.
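The weight stakes behind these thresholds are easy to see with back-of-envelope arithmetic. The sketch below uses the energy densities quoted in the article (300 Wh/kg for today’s lithium-ion cells versus the roughly 1,000 Wh/kg aviation threshold); the 5,000 kWh flight-energy budget is a made-up illustrative number, not a figure from the article or the paper:

```python
def pack_mass_kg(energy_kwh, specific_energy_wh_per_kg):
    """Mass of an energy store that holds energy_kwh kilowatt-hours."""
    return energy_kwh * 1000.0 / specific_energy_wh_per_kg

flight_energy_kwh = 5000  # hypothetical regional-flight energy budget
for wh_per_kg in (300, 1000, 1500):
    mass = pack_mass_kg(flight_energy_kwh, wh_per_kg)
    print(f"{wh_per_kg:>5} Wh/kg -> {mass:>8,.0f} kg of energy storage")
```

At 300 Wh/kg the hypothetical pack weighs more than three times what it would at 1,000 Wh/kg, which is why energy density, rather than total energy, is the gating figure for flight.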

The technology could be an enabler for other sectors as well, including marine and rail transportation. “They all require very high energy density, and they all require low cost,” he says. “And that’s what attracted us to sodium metal.”

A great deal of research has gone into developing lithium-air or sodium-air batteries over the last three decades, but it has been hard to make them fully rechargeable. “People have been aware of the energy density you could get with metal-air batteries for a very long time, and it’s been hugely attractive, but it’s just never been realized in practice,” Chiang says.

By using the same basic electrochemical concept, only making it a fuel cell instead of a battery, the researchers were able to get the advantages of the high energy density in a practical form. Unlike a battery, whose materials are assembled once and sealed in a container, with a fuel cell the energy-carrying materials go in and out.

The team produced two different versions of a lab-scale prototype of the system. In one, called an H cell, two vertical glass tubes are connected by a tube across the middle, which contains a solid ceramic electrolyte material and a porous air electrode. Liquid sodium metal fills the tube on one side, and air flows through the other, providing the oxygen for the electrochemical reaction at the center, which ends up gradually consuming the sodium fuel. The other prototype uses a horizontal design, with a tray of the electrolyte material holding the liquid sodium fuel. The porous air electrode, which facilitates the reaction, is affixed to the bottom of the tray. 

Tests using an air stream with a carefully controlled humidity level produced an energy density of more than 1,500 watt-hours per kilogram for an individual “stack,” which would translate to over 1,000 watt-hours per kilogram at the full system level, Chiang says.

The researchers envision that to use this system in an aircraft, fuel packs containing stacks of cells, like racks of food trays in a cafeteria, would be inserted into the fuel cells; the sodium metal inside these packs gets chemically transformed as it provides the power. A stream of its chemical byproduct is given off, and in the case of aircraft this would be emitted out the back, not unlike the exhaust from a jet engine.

But there’s a very big difference: There would be no carbon dioxide emissions. Instead, the emissions, consisting of sodium oxide, would actually soak up carbon dioxide from the atmosphere. The sodium oxide would quickly combine with moisture in the air to make sodium hydroxide — a material commonly used as a drain cleaner — which readily combines with carbon dioxide to form a solid material, sodium carbonate, and in turn sodium bicarbonate, otherwise known as baking soda.
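Spelled out, that cascade corresponds to these textbook reactions (a sketch assuming the fully oxidized discharge product Na₂O; the paper may report other intermediates, such as sodium superoxide or peroxide):

```latex
\begin{align*}
\mathrm{Na_2O} + \mathrm{H_2O} &\longrightarrow 2\,\mathrm{NaOH} \\
2\,\mathrm{NaOH} + \mathrm{CO_2} &\longrightarrow \mathrm{Na_2CO_3} + \mathrm{H_2O} \\
\mathrm{Na_2CO_3} + \mathrm{CO_2} + \mathrm{H_2O} &\longrightarrow 2\,\mathrm{NaHCO_3}
\end{align*}
```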

“There’s this natural cascade of reactions that happens when you start with sodium metal,” Chiang says. “It’s all spontaneous. We don’t have to do anything to make it happen, we just have to fly the airplane.”

As an added benefit, if the final product, the sodium bicarbonate, ends up in the ocean, it could help to de-acidify the water, countering another of the damaging effects of greenhouse gases.

Using sodium hydroxide to capture carbon dioxide has been proposed as a way of mitigating carbon emissions, but on its own, it’s not an economic solution because the compound is too expensive. “But here, it’s a byproduct,” Chiang explains, so it’s essentially free, producing environmental benefits at no cost.

Importantly, the new fuel cell is inherently safer than many other batteries, he says. Sodium metal is extremely reactive and must be well-protected. As with lithium batteries, sodium can spontaneously ignite if exposed to moisture. “Whenever you have a very high energy density battery, safety is always a concern, because if there’s a rupture of the membrane that separates the two reactants, you can have a runaway reaction,” Chiang says. But in this fuel cell, one side is just air, “which is dilute and limited. So you don’t have two concentrated reactants right next to each other. If you’re pushing for really, really high energy density, you’d rather have a fuel cell than a battery for safety reasons.”

While the device so far exists only as a small, single-cell prototype, Chiang says the system should be quite straightforward to scale up to practical sizes for commercialization. Members of the research team have already formed a company, Propel Aero, to develop the technology. The company is currently housed in MIT’s startup incubator, The Engine.

Producing enough sodium metal to enable widespread, full-scale global implementation of this technology should be practical, since the material has been produced at large scale before. When leaded gasoline was the norm, before it was phased out, sodium metal was used to make the tetraethyl lead used as an additive, and it was being produced in the U.S. at a capacity of 200,000 tons a year. “It reminds us that sodium metal was once produced at large scale and safely handled and distributed around the U.S.,” Chiang says.

What’s more, sodium primarily originates from sodium chloride, or salt, so it is abundant, widely distributed around the world, and easily extracted, unlike lithium and other materials used in today’s EV batteries.

The system they envisage would use a refillable cartridge, which would be filled with liquid sodium metal and sealed. When it’s depleted, it would be returned to a refilling station and loaded with fresh sodium. Sodium melts at 98 degrees Celsius, just below the boiling point of water, so it is easy to heat to the melting point to refuel the cartridges.

Initially, the plan is to produce a brick-sized fuel cell that can deliver about 1,000 watt-hours of energy, enough to power a large drone, in order to prove the concept in a practical form that could be used for agriculture, for example. The team hopes to have such a demonstration ready within the next year.

Sugano, who conducted much of the experimental work as part of her doctoral thesis and will now work at the startup, says that a key insight was the importance of moisture in the process. As she tested the device with pure oxygen, and then with air, she found that the amount of humidity in the air was crucial to making the electrochemical reaction efficient. The humid air resulted in the sodium producing its discharge products in liquid rather than solid form, making it much easier for these to be removed by the flow of air through the system. “The key was that we can form this liquid discharge product and remove it easily, as opposed to the solid discharge that would form in dry conditions,” she says.

Ganti-Agrawal notes that the team drew from a variety of different engineering subfields. For example, there has been much research on high-temperature sodium, but none with a system with controlled humidity. “We’re pulling from fuel cell research in terms of designing our electrode, we’re pulling from older high-temperature battery research as well as some nascent sodium-air battery research, and kind of mushing it together,” which led to “the big bump in performance” the team has achieved, he says.

The research team also included Alden Friesen, an MIT summer intern who attends Desert Mountain High School in Scottsdale, Arizona; Kailash Raman and William Woodford of Form Energy in Somerville, Massachusetts; Shashank Sripad of And Battery Aero in California; and Venkatasubramanian Viswanathan of the University of Michigan. The work was supported by ARPA-E, Breakthrough Energy Ventures, and the National Science Foundation, and used facilities at MIT.nano.

Overlooked cells might explain the human brain’s huge storage capacity

MIT Latest News - Tue, 05/27/2025 - 10:00am

The human brain contains about 86 billion neurons. These cells fire electrical signals that help the brain store memories and send information and commands throughout the brain and the nervous system.

The brain also contains billions of astrocytes — star-shaped cells with many long extensions that allow them to interact with millions of neurons. Although they have long been thought to be mainly supportive cells, recent studies have suggested that astrocytes may play a role in memory storage and other cognitive functions.

MIT researchers have now put forth a new hypothesis for how astrocytes might contribute to memory storage. The architecture suggested by their model would help to explain the brain’s massive storage capacity, which is much greater than would be expected using neurons alone.

“Originally, astrocytes were believed to just clean up around neurons, but there’s no particular reason that evolution did not realize that, because each astrocyte can contact hundreds of thousands of synapses, they could also be used for computation,” says Jean-Jacques Slotine, an MIT professor of mechanical engineering and of brain and cognitive sciences, and an author of the new study.

Dmitry Krotov, a research staff member at the MIT-IBM Watson AI Lab and IBM Research, is the senior author of the open-access paper, which appeared May 23 in the Proceedings of the National Academy of Sciences. Leo Kozachkov PhD ’22 is the paper’s lead author.

Memory capacity

Astrocytes have a variety of support functions in the brain: They clean up debris, provide nutrients to neurons, and help to ensure an adequate blood supply.

Astrocytes also send out many thin tentacles, known as processes, which can each wrap around a single synapse — the junction where two neurons interact — to create a tripartite (three-part) synapse.

Within the past couple of years, neuroscientists have shown that if the connections between astrocytes and neurons in the hippocampus are disrupted, memory storage and retrieval are impaired.

Unlike neurons, astrocytes can’t fire action potentials, the electrical impulses that carry information throughout the brain. However, they can use calcium signaling to communicate with other astrocytes. Over the past few decades, as the resolution of calcium imaging has improved, researchers have found that calcium signaling also allows astrocytes to coordinate their activity with neurons in the synapses that they associate with.

These studies suggest that astrocytes can detect neural activity, which leads them to alter their own calcium levels. Those changes may trigger astrocytes to release gliotransmitters — signaling molecules similar to neurotransmitters — into the synapse.

“There’s a closed circle between neuron signaling and astrocyte-to-neuron signaling,” Kozachkov says. “The thing that is unknown is precisely what kind of computations the astrocytes can do with the information that they’re sensing from neurons.”

The MIT team set out to model what those connections might be doing and how they might contribute to memory storage. Their model is based on Hopfield networks — a type of neural network that can store and recall patterns.

Hopfield networks, originally developed by John Hopfield and Shun-Ichi Amari in the 1970s and 1980s, are often used to model the brain, but it has been shown that these networks can’t store enough information to account for the vast memory capacity of the human brain. A newer, modified version of a Hopfield network, known as dense associative memory, can store much more information through a higher order of couplings between more than two neurons.

However, it is unclear how the brain could implement these many-neuron couplings at a hypothetical synapse, since conventional synapses only connect two neurons: a presynaptic cell and a postsynaptic cell. This is where astrocytes come into play.

“If you have a network of neurons, which couple in pairs, there’s only a very small amount of information that you can encode in those networks,” Krotov says. “In order to build dense associative memories, you need to couple more than two neurons. Because a single astrocyte can connect to many neurons, and many synapses, it is tempting to hypothesize that there might exist an information transfer between synapses mediated by this biological cell. That was the biggest inspiration for us to look into astrocytes and led us to start thinking about how to build dense associative memories in biology.”

The neuron-astrocyte associative memory model that the researchers developed in their new paper can store significantly more information than a traditional Hopfield network — more than enough to account for the brain’s memory capacity.
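The capacity gap that motivates the model can be seen in a toy simulation. The sketch below is not the authors’ neuron-astrocyte model; it is a minimal dense associative memory with polynomial interactions in the style of Krotov and Hopfield, where the illustrative `power` parameter stands in for the higher-order couplings described above:

```python
import numpy as np

rng = np.random.default_rng(0)

def recall(patterns, cue, power=3, sweeps=5):
    """Asynchronously flip each unit to the sign that lowers the
    dense-associative-memory energy E = -sum_mu (xi_mu . sigma)**power."""
    state = cue.copy()
    for _ in range(sweeps):
        for i in range(state.size):
            plus, minus = state.copy(), state.copy()
            plus[i], minus[i] = 1, -1
            if np.sum((patterns @ plus) ** power) >= np.sum((patterns @ minus) ** power):
                state[i] = 1
            else:
                state[i] = -1
    return state

N, K = 64, 20                               # 64 units, 20 stored patterns
patterns = rng.choice([-1, 1], size=(K, N))
cue = patterns[0].copy()
cue[:10] *= -1                              # corrupt 10 of the 64 bits
restored = recall(patterns, cue)
print(np.array_equal(restored, patterns[0]))
```

Setting `power=2` reduces the energy to the classical pairwise Hopfield form, whose capacity of roughly 0.14N (about nine patterns for these 64 units) would be overwhelmed by the 20 stored patterns; with `power=3` the corrupted cue converges back to the stored pattern.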

Intricate connections

The extensive biological connections between neurons and astrocytes offer support for the idea that this type of model might explain how the brain’s memory storage systems work, the researchers say. They hypothesize that within astrocytes, memories are encoded by gradual changes in the patterns of calcium flow. This information is conveyed to neurons by gliotransmitters released at synapses that astrocyte processes connect to.

“By careful coordination of these two things — the spatial temporal pattern of calcium in the cell and then the signaling back to the neurons — you can get exactly the dynamics you need for this massively increased memory capacity,” Kozachkov says.

One of the key features of the new model is that it treats astrocytes as collections of processes, rather than a single entity. Each of those processes can be considered one computational unit. Because of the high information storage capabilities of dense associative memories, the ratio of the amount of information stored to the number of computational units is very high and grows with the size of the network. This makes the system not only high capacity, but also energy efficient.

“By conceptualizing tripartite synaptic domains — where astrocytes interact dynamically with pre- and postsynaptic neurons — as the brain’s fundamental computational units, the authors argue that each unit can store as many memory patterns as there are neurons in the network. This leads to the striking implication that, in principle, a neuron-astrocyte network could store an arbitrarily large number of patterns, limited only by its size,” says Maurizio De Pitta, an assistant professor of physiology at the Krembil Research Institute at the University of Toronto, who was not involved in the study.

To test whether this model might accurately represent how the brain stores memory, researchers could try to develop ways to precisely manipulate the connections between astrocytes’ processes, then observe how those manipulations affect memory function.

“We hope that one of the consequences of this work could be that experimentalists would consider this idea seriously and perform some experiments testing this hypothesis,” Krotov says.

In addition to offering insight into how the brain may store memory, this model could also provide guidance for researchers working on artificial intelligence. By varying the connectivity of the process-to-process network, researchers could generate a huge range of models that could be explored for different purposes, for instance, creating a continuum between dense associative memories and attention mechanisms in large language models.

“While neuroscience initially inspired key ideas in AI, the last 50 years of neuroscience research have had little influence on the field, and many modern AI algorithms have drifted away from neural analogies,” Slotine says. “In this sense, this work may be one of the first contributions to AI informed by recent neuroscience research.” 

The proud history and promising future of MIT’s work on manufacturing

MIT Latest News - Tue, 05/27/2025 - 10:00am

MIT’s Initiative for New Manufacturing, announced today by President Sally A. Kornbluth, is the latest installment in a grand tradition: Since its founding, MIT has worked overtime to expand U.S. manufacturing, creating jobs and economic growth.

Indeed, one of the strongest through lines in MIT history is its commitment to U.S. manufacturing, which the Institute has pursued in economic good times and lean times, during wartime and in peacetime, and across scores of industries. MIT was founded in 1861 partly to improve U.S. industrial output, and has long devised special programs to bolster it — including multiple projects in recent decades aimed at renewing U.S. manufacturing.

“We want to deliberately design high-quality human-centered manufacturing jobs that bring new life to communities across the country,” Kornbluth wrote in a letter to the Institute community this morning, announcing the Initiative for New Manufacturing. She added: “I’m convinced that there is no more important work we can do to meet the moment and serve the nation now.”

“Embedded in MIT’s core ethos”

On one level, manufacturing is in MIT’s essential DNA. The Institute’s research and education have advanced industries from construction and transportation to defense, electronics, biosciences, chemical engineering, and more. MIT contributions to management and logistics have also helped manufacturing firms thrive.

As Kornbluth noted in today’s letter, “Frankly, it’s not too much to say that the Institute was founded in 1861 to make manufacturing better.”

The historical record shows this, too. “There is no branch of practical industry, whether in the arts of construction, manufactures or agriculture, which is not capable of being better practiced, and even of being improved in its processes,” wrote MIT’s first president, William Barton Rogers, in a proposal for a new technical university, before MIT opened its doors.

“Manufacturing is embedded in MIT's core ethos,” says Christopher Love, a chemical engineering professor and one of the leads of the Initiative for New Manufacturing.

Beyond its everyday work, MIT has created many special projects to bolster manufacturing. In 1919, under the Institute’s third president, Richard Maclaurin, MIT developed the “Tech Plan,” engaging over 200 corporate sponsors, including AT&T and General Electric, to improve their businesses; period photos show MIT students examining a General Electric factory. (Similarly, today’s Initiative for New Manufacturing contains a “Factory Observatory” among its many facets, enabling Institute students to visit manufacturers.)

“Made in America”

For a few decades after World War II, the U.S. had an especially large global lead in manufacturing. The sector also accounted for roughly a quarter of U.S. GDP for much of the 1950s, compared to about 12 percent in recent years. To be sure, other U.S. industries naturally grew; additionally, global manufacturing increased. But the U.S. still had around 20 million manufacturing jobs in 1979, compared to about 12.8 million today. The 1980s saw steep job losses in manufacturing, and many believed the U.S. was losing its edge in key industries, including automaking and consumer electronics.

In response, MIT formed a task force on the subject, the MIT Commission on Industrial Productivity — a group whose work produced a bestselling book.

“Made in America: Regaining the Productive Edge,” co-authored by Michael Dertouzos, Richard Lester, and Robert Solow, sold over 300,000 copies after its publication in 1989. The book closely examined U.S. manufacturing practices across eight core industries and found overly short growth horizons for firms, suboptimal technology transfer, a neglect of human resources, and more.

Solow was an apt co-author: The MIT economist produced breakthrough research in the 1950s and 1960s, based on U.S. economic data, showing that technical advances of multiple kinds were responsible for most economic growth — to a much greater extent than, say, population growth or capital expansion. “Total factor productivity,” as Solow called it, included technological innovation, education, and skill-related changes.

Solow’s work won him a Nobel Prize in 1987 and illuminated how important technology and education are to economic expansion: Growth is not largely about making more of the same stuff, but creating new things.

The 21st Century: PIE, The Engine, and INM

This century, manufacturing has had periods of growth, but heavy job losses in the first decade of the 2000s. That led to a flurry of new MIT manufacturing projects and research.

For one, an Institute task force on Production in the Innovation Economy (PIE), based on two years of empirical research, found considerable potential for U.S. advanced manufacturing, but also that the country needed to improve its capacity at turning innovations into deliverable products. These findings were detailed in the book “Making in America,” written by Institute Professor Suzanne Berger, a political scientist who has long studied the industrial economy.

MIT also participated in a government initiative, the Advanced Manufacturing Partnership, to help create high-tech economic hubs in parts of the U.S. that had suffered from de-industrialization, an effort that included developing new education initiatives for industrial workers.

And in 2016, MIT announced a creative effort to spur manufacturing directly, in the form of The Engine, a startup accelerator, innovation hub, and venture fund located adjacent to campus in Cambridge. The Engine seeks to boost promising “tough tech” startups that need time to gain traction, and has invested in dozens of companies.

Additionally, MIT’s Work of the Future task force, a multi-year project issuing a final report in 2020, uncovered manufacturing insights while not being solely focused on them. The task force found that automation will not wipe away colossal numbers of jobs — but that a key issue for the future is how technology can help workers to spur productivity, while not replacing them.

MIT continues to feature a variety of long-term programs and centers focused on manufacturing. The Initiative for New Manufacturing is an outgrowth of the Manufacturing@MIT working group; MIT’s Leaders for Global Operations (LGO) program offers a joint Engineering-MBA degree with a strong focus on manufacturing; the Department of Mechanical Engineering offers an advanced manufacturing concentration; and the Industrial Liaison Program develops corporate partnerships with MIT.

All told, as Kornbluth wrote in today’s letter, “Manufacturing has been a throughline in MIT’s research and education … and it’s been an essential part of our service to the nation.” 

MIT announces the Initiative for New Manufacturing

MIT Latest News - Tue, 05/27/2025 - 10:00am

MIT today launched its Initiative for New Manufacturing (INM), an Institute-wide effort to reinfuse U.S. industrial production with leading-edge technologies, bolster crucial U.S. economic sectors, and ignite job creation.

The initiative will encompass advanced research, innovative education programs, and partnerships with companies across many sectors, in a bid to help transform manufacturing and elevate its impact.

“We want to work with firms big and small, in cities, small towns and everywhere in between, to help them adopt new approaches for increased productivity,” MIT President Sally A. Kornbluth wrote in a letter to the Institute community this morning. “We want to deliberately design high-quality, human-centered manufacturing jobs that bring new life to communities across the country.”

Kornbluth added: “Helping America build a future of new manufacturing is a perfect job for MIT — and I’m convinced that there is no more important work we can do to meet the moment and serve the nation now.”

The Initiative for New Manufacturing also announced its first six founding industry consortium members: Amgen, Flextronics International USA, GE Vernova, PTC, Sanofi, and Siemens. Participants in the INM Industry Consortium will support seed projects proposed by MIT researchers, initially in the area of artificial intelligence for manufacturing.

INM joins the ranks of MIT’s other presidential initiatives — including The Climate Project at MIT; MITHIC, which supports the human-centered disciplines; MIT HEALS, centered on the life sciences and health; and MGAIC, the MIT Generative AI Impact Consortium.

“There is tremendous opportunity to bring together a vibrant community working across every scale — from nanotechnology to large-scale manufacturing — and across a wide range of applications including semiconductors, medical devices, automotive, energy systems, and biotechnology,” says Anantha Chandrakasan, MIT’s chief innovation and strategy officer and dean of engineering, who is part of the initiative’s leadership team. “MIT is uniquely positioned to harness the transformative power of digital tools and AI to shape the future of manufacturing. I’m truly excited about what we can build together and the synergies this creates with other cross-cutting initiatives across the Institute.”

The initiative is just the latest MIT-centered effort in recent decades aiming to expand American manufacturing. A faculty research group wrote the 1989 bestseller “Made in America: Regaining the Productive Edge,” advocating for a renewal of manufacturing; another MIT project, called Production in the Innovation Economy, called for expanded manufacturing in the early 2010s. In 2016, MIT also founded The Engine, a venture fund investing in hardware-based “tough tech” startups, including many with the potential to become substantial manufacturing firms.

As developed, the MIT Initiative for New Manufacturing is based around four major themes:

  • Reimagining manufacturing technologies and systems: realizing breakthrough technologies and system-level approaches to advance energy production, health care, computing, transportation, consumer products, and more;
  • Elevating the productivity and experience of manufacturing: developing and deploying new digitally driven methods and tools to amplify productivity and improve the human experience of manufacturing;
  • Scaling new manufacturing: accelerating the scaling of manufacturing companies and transforming supply chains to maximize efficiency and resilience, fostering product innovation and business growth; and
  • Transforming the manufacturing base: driving the deployment of a sustainable global manufacturing ecosystem that provides compelling opportunities to workers, with major efforts focused on the U.S.

The initiative has mapped out many concrete activities and programs, which will include an Institute-wide research program on emerging technologies and other major topics; workforce and education programs; and industry engagement and participation. INM also aims to establish new labs for developing manufacturing tools and techniques; a “factory observatory” program that immerses students in manufacturing through visits to production sites; and key “pillars” focusing on areas from semiconductors and biomanufacturing to defense and aviation.

The workforce and education element of INM will include TechAMP, an MIT-created program that works with community colleges to bridge the gap between technicians and engineers; AI-driven teaching tools; professional education; and an effort to expand manufacturing education on campus in collaboration with MIT departments and degree programs.

INM’s leadership team has three faculty co-directors: John Hart, the Class of 1922 Professor and head of the Department of Mechanical Engineering; Suzanne Berger, Institute Professor at MIT and a political scientist who has conducted influential empirical studies of manufacturing; and Chris Love, the Raymond A. and Helen E. St. Laurent Professor of Chemical Engineering. The initiative’s executive director is Julie Diop.

The initiative is in the process of forming a faculty steering committee with representation from across the Institute, as well as an external advisory board. INM stems partly from the work of the Manufacturing@MIT working group, formed in 2022 to assess many of these issues.

The launch of the new initiative was previewed at a daylong MIT symposium on May 7, titled “A Vision for New Manufacturing.” The event, held before a capacity audience in MIT’s Wong Auditorium, featured over 30 speakers from a wide range of manufacturing sectors.

“The rationale for growing and transforming U.S. manufacturing has never been more urgent than it is today,” Berger said at the event. “What we are trying to build at MIT now is not just another research project. … Together, with people in this room and outside this room, we’re trying to change what’s happening in our country.”

“We need to think about the importance of manufacturing again, because it is what brings product ideas to people,” Love told MIT News. “For instance, in biotechnology, new life-saving medicines can’t reach patients without manufacturing. There is a real urgency about this issue for both economic prosperity and creating jobs. We have seen the impact for our country when we have lost our lead in manufacturing in some sectors. Biotechnology, where the U.S. has been the global leader for more than 40 years, offers the potential to promote new robust economies here, but we need to advance our capabilities in biomanufacturing to maintain our advantage in this area.”

Hart adds: “While manufacturing feels very timely today, it is of enduring importance. Manufactured products enable our daily lives and manufacturing is critical to advancing the frontiers of technology and society. Our efforts leading up to launch of the initiative revealed great excitement about manufacturing across MIT, especially from students. Working with industry — from small to large companies, and from young startups to industrial giants — will be instrumental to creating impact and realizing the vision for new manufacturing.”

In her letter to the MIT community today, Kornbluth stressed that the initiative’s goal is to drive transformation by making manufacturing more productive, resilient, and sustainable.

“We want to reimagine manufacturing technologies and systems to advance fields like energy production, health care, computing, transportation, consumer products, and more,” she wrote. “And we want to reach well beyond the shop floor to tackle challenges like how to make supply chains more resilient, and how to inform public policy to foster a broad, healthy manufacturing ecosystem that can drive decades of innovation and growth.”

Chinese-Owned VPNs

Schneier on Security - Tue, 05/27/2025 - 7:07am

One of my biggest worries about VPNs is the amount of trust users need to place in them, and how opaque most of them are about who owns them and what sorts of data they retain.

A new study found that many commercial VPNs are (often surreptitiously) owned by Chinese companies.

It would be hard for U.S. users to avoid the Chinese VPNs. The ownership of many appeared deliberately opaque, with several concealing their structure behind layers of offshore shell companies. TTP was able to determine the Chinese ownership of the 20 VPN apps being offered to Apple’s U.S. users by piecing together corporate documents from around the world. None of those apps clearly disclosed their Chinese ownership...

Is climate change a threat? It depends, says Elon Musk’s AI chatbot.

ClimateWire News - Tue, 05/27/2025 - 6:20am
The latest version of Grok is promoting fringe climate viewpoints in a way it hasn’t done before, observers say.

Clean energy industry enters ‘nightmare scenario’

ClimateWire News - Tue, 05/27/2025 - 6:18am
The Republican megabill would slow efforts to green the energy system as climate change accelerates.

Lawmakers form Heat Caucus: ‘We've had too many deaths’

ClimateWire News - Tue, 05/27/2025 - 6:17am
The House's first caucus to address extreme heat is being launched by a Democrat from the Southwest and a Republican from the Northeast.

Trump’s attacks on state climate laws could surface in court this week

ClimateWire News - Tue, 05/27/2025 - 6:16am
Lawyers for Charleston, South Carolina, and the oil and gas industry will duel over the details of a climate case in a two-day hearing.

Energy companies fuel environmental conflicts in poor nations — study

ClimateWire News - Tue, 05/27/2025 - 6:14am
Oil giants like Exxon are often connected to social disputes over land and other resources in developing countries.

Insect-based pet food, the latest byproduct of EU bureaucracy

ClimateWire News - Tue, 05/27/2025 - 6:12am
Insect producers say EU rules are choking their industry and driving it into financial ruin — with the environment paying the price.

Firefighter helps helicopters get water faster during urban fires

ClimateWire News - Tue, 05/27/2025 - 6:10am
The Heli-Hydrant is a relatively small, open tank that can be rapidly filled with water, preventing helicopters from flying to sometimes distant lakes or ponds.

Europe’s dry spring raises fears for wheat and barley harvests

ClimateWire News - Tue, 05/27/2025 - 6:09am
If the dryness persists, it would be a second consecutive season of weather-related setbacks for farmers.

Peru court rules in favor of Kichwa territorial rights in the Amazon

ClimateWire News - Tue, 05/27/2025 - 6:08am
The communities say the state denied their ancestral presence for decades, creating protected areas without consultation or consent.

Maintaining crop yields limits mitigation potential of crop-land natural climate solutions

Nature Climate Change - Mon, 05/26/2025 - 12:00am

Nature Climate Change, Published online: 26 May 2025; doi:10.1038/s41558-025-02349-3

The adoption of natural climate solutions in crop-lands, such as cover crops, no tillage and residue retention, is widely assumed to provide both climate change mitigation and crop yield benefits. We find important spatially variable trade-offs between these outcomes and demonstrate that safeguarding crop yields will substantially lower the mitigation potential of natural climate solutions.

Targeted policies to break the deadlock on heating bans

Nature Climate Change - Mon, 05/26/2025 - 12:00am

Nature Climate Change, Published online: 26 May 2025; doi:10.1038/s41558-025-02343-9

As an important policy instrument for building sector decarbonization, bans on fossil fuel-based heating face fierce opposition with doubts over their economic viability. With a unified perspective that incorporates the views of proponents and opponents, we discuss the importance of targeted policies to break the deadlock.

Post-flood selective migration interacts with media sentiment and income effects

Nature Climate Change - Mon, 05/26/2025 - 12:00am

Nature Climate Change, Published online: 26 May 2025; doi:10.1038/s41558-025-02345-7

A gap remains in understanding flood-induced migration across sociodemographic groups. This study quantifies the flood-induced inflow/outflow selective migration by education, employment and age in the United States, and reveals how media sentiment and income effect aggravate selective migration.

Friday Squid Blogging: US Naval Ship Attacked by Squid in 1978

Schneier on Security - Fri, 05/23/2025 - 5:02pm

Interesting story:

USS Stein was underway when her anti-submarine sonar gear suddenly stopped working. On returning to port and putting the ship in a drydock, engineers observed many deep scratches in the sonar dome’s rubber “NOFOUL” coating. In some areas, the coating was described as being shredded, with rips up to four feet long. Large claws were left embedded at the bottom of most of the scratches.

As usual, you can also use this squid post to talk about the security stories in the news that I haven’t covered.

Why are some rocks on the moon highly magnetic? MIT scientists may have an answer

MIT Latest News - Fri, 05/23/2025 - 2:00pm

Where did the moon’s magnetism go? Scientists have puzzled over this question for decades, ever since orbiting spacecraft picked up signs of a high magnetic field in lunar surface rocks. The moon itself has no inherent magnetism today. 

Now, MIT scientists may have solved the mystery. They propose that a combination of an ancient, weak magnetic field and a large, plasma-generating impact may have temporarily created a strong magnetic field, concentrated on the far side of the moon.

In a study appearing today in the journal Science Advances, the researchers show through detailed simulations that an impact, such as from a large asteroid, could have generated a cloud of ionized particles that briefly enveloped the moon. This plasma would have streamed around the moon and concentrated at the opposite location from the initial impact. There, the plasma would have interacted with and momentarily amplified the moon’s weak magnetic field. Any rocks in the region could have recorded signs of the heightened magnetism before the field quickly died away.

This combination of events could explain the presence of highly magnetic rocks detected in a region near the south pole, on the moon’s far side. As it happens, one of the largest impact basins — the Imbrium basin — is located in the exact opposite spot on the near side of the moon. The researchers suspect that whatever made that impact likely released the cloud of plasma that kicked off the scenario in their simulations.

“There are large parts of lunar magnetism that are still unexplained,” says lead author Isaac Narrett, a graduate student in the MIT Department of Earth, Atmospheric and Planetary Sciences (EAPS). “But the majority of the strong magnetic fields that are measured by orbiting spacecraft can be explained by this process — especially on the far side of the moon.”

Narrett’s co-authors include Rona Oran and Benjamin Weiss at MIT, along with Katarina Miljkovic at Curtin University, Yuxi Chen and Gábor Tóth at the University of Michigan at Ann Arbor, and Elias Mansbach PhD ’24 at Cambridge University. Nuno Loureiro, professor of nuclear science and engineering at MIT, also contributed insights and advice.

Beyond the sun

Scientists have known for decades that the moon holds remnants of a strong magnetic field. Samples from the surface of the moon, returned by astronauts on NASA’s Apollo missions of the 1960s and 70s, as well as global measurements of the moon taken remotely by orbiting spacecraft, show signs of remnant magnetism in surface rocks, especially on the far side of the moon.

The typical explanation for surface magnetism is a global magnetic field, generated by an internal “dynamo,” or a core of molten, churning material. The Earth today generates a magnetic field through a dynamo process, and it’s thought that the moon once may have done the same, though its much smaller core would have produced a much weaker magnetic field that may not explain the highly magnetized rocks observed, particularly on the moon’s far side.

An alternative hypothesis that scientists have tested from time to time involves a giant impact that generated plasma, which in turn amplified any weak magnetic field. In 2020, Oran and Weiss tested this hypothesis with simulations of a giant impact on the moon, in combination with the solar-generated magnetic field, which is weak as it stretches out to the Earth and moon.

In simulations, they tested whether an impact to the moon could amplify such a solar field enough to explain the highly magnetic measurements of surface rocks. It turned out that the amplification fell short, and their results seemed to rule out impact-generated plasma as playing a role in the moon’s missing magnetism.

A spike and a jitter

But in their new study, the researchers took a different tack. Instead of accounting for the sun’s magnetic field, they assumed that the moon once hosted a dynamo that produced a magnetic field of its own, albeit a weak one. Given the size of its core, they estimated that such a field would have been about 1 microtesla, or 50 times weaker than the Earth’s field today.

From this starting point, the researchers simulated a large impact to the moon’s surface, similar to what would have created the Imbrium basin, on the moon’s near side. Using impact simulations from Katarina Miljkovic, the team then simulated the cloud of plasma that such an impact would have generated as the force of the impact vaporized the surface material. They adapted a second code, developed by collaborators at the University of Michigan, to simulate how the resulting plasma would flow and interact with the moon’s weak magnetic field.

These simulations showed that as a plasma cloud arose from the impact, some of it would have expanded into space, while the rest would stream around the moon and concentrate on the opposite side. There, the plasma would have compressed and briefly amplified the moon’s weak magnetic field. This entire process, from the moment the magnetic field was amplified to the time that it decays back to baseline, would have been incredibly fast — somewhere around 40 minutes, Narrett says.

Would this brief window have been enough for surrounding rocks to record the momentary magnetic spike? The researchers say, yes, with some help from another, impact-related effect.

They found that an Imbrium-scale impact would have sent a pressure wave through the moon, similar to a seismic shock. These waves would have converged on the other side, where the shock would have “jittered” the surrounding rocks, briefly unsettling the rocks’ electrons — the subatomic particles that naturally orient their spins to any external magnetic field. The researchers suspect the rocks were shocked just as the impact’s plasma amplified the moon’s magnetic field. As the rocks’ electrons settled back, they assumed a new orientation, in line with the momentary high magnetic field.

“It’s as if you throw a 52-card deck in the air, in a magnetic field, and each card has a compass needle,” Weiss says. “When the cards settle back to the ground, they do so in a new orientation. That’s essentially the magnetization process.”

The researchers say this combination of a dynamo plus a large impact, coupled with the impact’s shockwave, is enough to explain the moon’s highly magnetized surface rocks — particularly on the far side. One way to know for sure is to directly sample the rocks for signs of shock, and high magnetism. This could be a possibility, as the rocks lie on the far side, near the lunar south pole, where missions such as NASA’s Artemis program plan to explore.

“For several decades, there’s been sort of a conundrum over the moon’s magnetism — is it from impacts or is it from a dynamo?” Oran says. “And here we’re saying, it’s a little bit of both. And it’s a testable hypothesis, which is nice.”

The team’s simulations were carried out using the MIT SuperCloud. This research was supported, in part, by NASA. 