MIT Latest News
Amid rain, midterms, and the World Series, the Edgerton Center and MIT Admissions recently co-sponsored the screening of “Science Fair,” a National Geographic documentary film that follows nine high school students on their journey to compete in the 2017 Intel International Science and Engineering Fair (ISEF).
Hailed as an “ode to the teenage science geeks on whom our future depends,” the film culminates at the Intel ISEF in Los Angeles, where 1,700 teenage finalists from 78 countries competed for first prize.
Following the screening, a panel of five MIT students and ISEF alumni — first years Madison Sneve, Syamantak Payra, and Shinjini Ghosh and sophomores Afeefah Khazi-Syed and Ruiwen “Doris” Fu — along with ISEF judge, past competitor, and Harvard Professor Scott Kominers, discussed the far-reaching impact science fairs have had on their lives.
In a discussion moderated by Senior Assistant Director of Admissions Chris Peterson, the panelists agreed that participating in science fairs helped them find their community, dive into creative scientific research, and, for some, seriously consider applying to MIT.
In India, where Ghosh grew up, science fairs are not nearly as popular as they are in the U.S. The recognition she received at the 2017 Intel ISEF for developing a language identification kit propelled Ghosh to apply to MIT and made the Institute seem like a real possibility for her.
Khazi-Syed did her first mandatory science fair in 5th grade. “Starting from that age I liked the idea of experiments and testing things and getting your hands dirty,” she said.
From grade 1 on, Payra has conducted experiments and has “been doing science fairs every single year from 1st grade to 12th,” he said.
Participating in science fairs was mandatory for Sneve, who attended duPont Manual High School in Louisville, Kentucky, one of the high schools featured in the film that sent four participants to ISEF.
“Before doing research I knew I wanted to go into a science career but I had no idea what that meant in the practical sense. Without actually doing research, it’s hard to tell what that’s like, the ups and downs and if you’ll enjoy it or not. Being able to do it at high school told me how much I liked the process and how much I want to do it in the future,” Sneve remarked.
Getting to the Intel ISEF was admittedly stressful for Sneve. “There was a lot of cutthroat competition, but being able to present to professors and my peers made me feel like I was a real scientist,” she said.
For Fu, who had arrived in the U.S. from China in 9th grade, science fairs forced her to quickly find her voice.
“Doing science fair really forced me to speak and communicate,” said Fu, who appears in the film as part of the Jericho High School contingent with a science teacher of Herculean force, Serena McCalla. “I was very shy, didn’t have a lot of friends, didn’t speak in class. I had to talk to a lot of people, [and] present posters to judges in competitions.”
Khazi-Syed, who was raised by orthodox immigrant parents, identified with Kashfia, the Muslim girl portrayed in the film. Kashfia recruited the football coach as her faculty mentor in a school where science is not nearly as valued as sports.
“I never expected to go to ISEF,” said Khazi-Syed. “It was an introduction to this whole world of science and engineering and it showed me what was out there. Before then, coming to MIT was never on the table. Figuring out what all was out there was the biggest takeaway.”
Another takeaway, she said, “was the friends I had, the people I got to talk to.”
Payra saw the science fair not just as a place to compete but as a place to pursue a project that he was passionate about.
“Some of my projects in my junior and senior years took maybe 800 to 900 hours; the amount of failure and work that goes into that at every single step is really a ginormous amount,” Payra said.
Kominers, who co-chairs the Grand Award judges for mathematics and serves with the Society for Science and the Public, the nonprofit organization that runs ISEF, commented on the growing sophistication of the research.
“We are always blown away by the students, but the projects get better every year, they’ve completely transformed,” he said. “We [the judges] see this graduate level stuff and we go 'Wow I’m not sure I could have done that in graduate school.'”
"The beauty of research is the chance for students to pursue questions that have not been answered before," says Ian A. Waitz, vice chancellor for undergraduate and graduate education. “Science fairs, at their best, provide a means for future scholars to connect, learn from one another, and gain the kinds of skills and thought processes that will serve them at places like MIT or just about anywhere they end up.”
Today the Boston Lyric Opera presents the world premiere of “Schoenberg in Hollywood,” a new opera by Tod Machover, the Muriel R. Cooper Professor of Music and Media and director of the MIT Media Lab's Opera of the Future group. Performances will run through Nov. 18.
“Schoenberg in Hollywood” is inspired by the life of Austrian composer Arnold Schoenberg after he fled Hitler’s Europe in the 1930s. After moving first to Boston and then to Los Angeles, Schoenberg sought connection with his new culture through music. He forged a friendship with famous comedian Harpo Marx, who introduced him to MGM’s Irving Thalberg, who in turn offered him the opportunity to compose a score for the film “The Good Earth.”
Schoenberg ultimately turned down the commission, rejecting the lure of more money and greater fame in favor of his artistic integrity (and after proposing highly unrealistic artistic and financial terms). In doing so, Schoenberg chose a path of fidelity to his heritage and his musical identity — a decision that pitted change against tradition, art against entertainment, and personal struggle against public action.
Machover’s opera is bookended by the Thalberg meeting, after which the fictional Schoenberg goes off to make a film about his own life. This imagined creation follows the narrative of Schoenberg’s historical journey up to a point, then diverges in a wild fantasy to imagine a different path had Schoenberg been able to reconcile opposing forces. Drawing on inspirations ranging from Jewish liturgical music to Bach and a World War I soundscape to contemporary 20th century music, Machover illustrates Schoenberg’s personal evolution through a synthesis of shifting influences.
“I immersed myself in Schoenberg’s world through his extensive — and incredible — writings, his music, his paintings, through visiting his amazing archives in Vienna, and by speaking with many people who knew him,” Machover explains. “But I grew up with Schoenberg’s music, so have been thinking about this for a very long time. It is part of me.”
Machover also drew on his own experience as a composer in a rapidly changing world to inform his interpretation of Schoenberg’s musical and personal journey.
“The work explores one man's journey to move millions to social and political action while remaining deeply thoughtful and thoroughly ethical,” Machover says. “The underlying artistic, activist, and ethical questions raised in this opera are ones that we ask every day at the Media Lab.”
The opera is also uniquely informed by Machover’s dual roles as artist and technologist. It blends reality and fantasy, combining live singers and actors with diverse media, and acoustic sound with complex electronics spread throughout the theater, while incorporating physical stage effects that modify perspective and perception in unusual ways.
“The Media Lab is the only environment I know where the forms and technologies of this opera could have been imagined and developed,” Machover says.
A polymath and inventor, Schoenberg never earned a degree from any academic or musical institution, but became the top composition professor at the renowned Berlin Conservatory of Music (before being expelled immediately upon Hitler’s rise to power). His depth of knowledge informed but never limited his own musical explorations. His invention of 12-tone technique, which Schoenberg described as “a method of composing with 12 tones which are related only with one another," changed the face of Western music in the 20th century and beyond.
“He invented not only music but all kinds of unusual things, like a new notation system for tennis games (designed to annotate his son’s expert playing), contraptions to draw his own customized music manuscript paper, a curriculum to train movie composers in a purely sonic art, a painting technique to allow him to depict his inner mental state rather than outside physical features in a series of self-portraits,” says Machover. “As an intellect and creator, Schoenberg would have fit right into the Media Lab.”
In celebration not only of the opera’s premiere but also of the Media Lab’s informal adoption of Schoenberg, the lab is now hosting an exhibition on “Schoenberg in Hollywood” in the lobby gallery of Building E14. Videos and archival materials trace Schoenberg’s journey, including materials on loan from the Schoenberg Center in Vienna, most of which have never before been shown in the Boston area. The exhibition also serves as a companion to the opera, offering a listening station, a video trailer of one of the opera’s climactic moments, some of Machover’s own musical sketches, and an illustrated timeline juxtaposing events in Schoenberg’s life with scenes and sounds from Machover’s opera.
“The exhibition is a resonant companion to the opera, useful whether experienced before or after a performance,” explains Machover. “But is also meant to stand alone to introduce the art and life of this remarkable creator to the MIT community and beyond, and to tell at least a bit of the story about why this unusual new opera grew out of inspiration from Arnold Schoenberg ... and the MIT Media Lab itself.”
“Schoenberg in Hollywood” runs Nov. 14-18 at the Emerson Paramount Theater in Boston. The Media Lab’s exhibition is currently open to the public and will run through April 30.
The journey through graduate school is rarely straight and smooth. There are challenges and setbacks, students experience varying degrees of doubt and struggle, and many redefine their goals along the way. On this winding path, the guidance of a mentor can make all the difference to a student’s sanity and success. Professors Emilio Baglietto, Rebecca Saxe, and Matthew Shoulders were nominated by their graduate students as models of great mentorship, and are among the current slate of honorees for Committed to Caring (C2C).
Emilio Baglietto: Making the connection
Professor Emilio Baglietto’s “unparalleled enthusiasm” for teaching is contagious and has shaped his students’ academic development, his advisees say. “It was during his class that I felt I actually became a nuclear engineer,” one of them remarked.
Before coming to MIT, Baglietto worked in the nuclear engineering industry and cites the experience as highly influential on his teaching and research style. He earned his MS in Nuclear Engineering from the University of Pisa in 2002, and his PhD in Nuclear Engineering from the Tokyo Institute of Technology in 2004. Baglietto is now an Associate Professor of Nuclear Science and Engineering at MIT.
His advisees say he is especially dedicated to helping his students make connections with the broader field of nuclear engineering. This type of ‘informal advising’ (a precept that is one of the Mentoring Guideposts identified by the C2C program) helps students to feel well-grounded and sufficiently prepared for the future. One student recounts: “Baglietto consistently encourages everyone in our research group to network broadly, and has funded our entire group to attend some of the top international research conferences. He is truly committed to our professional growth and development.”
Baglietto has also prioritized fostering a friendly and inclusive work environment (another of the C2C Mentoring Guideposts). One of Baglietto’s students noted in their nomination letter that “he creates a collaborative environment in which it is understood that each person’s opinion must be treated with respect, and that each individual contributes value to the research group as a whole.”
One of the central practices Baglietto encourages in his lab — and a tenet he learned from his middle school math teacher — is “to enjoy competition.”
“Research is not that different from sport, so don’t be afraid of competing,” he urges. “Go out and challenge other groups with your ideas, make it fun … and never be afraid of challenging yourself and your ideas!”
Rebecca Saxe: Generously present
Professor Rebecca Saxe believes science is both a pleasure and a privilege. “We get to spend our time pursuing hard elusive ideas because of our own profound curiosity,” she muses. “And it is possible for that pursuit to be intrinsically motivating and satisfying.”
That approach has informed her excellent mentorship practices in MIT’s Brain and Cognitive Sciences Department, where Saxe is the John Jarve (1978) Professor of Cognitive Neuroscience and Associate Member of the McGovern Institute for Brain Research. She received her BA in Psychology and Philosophy from Oxford University and her PhD in Cognitive Science from MIT.
Saxe’s nominators write that through her research and engagement with the public, she is “an extremely generous and kind person who is invested in making the world a more generous and kind place.”
Giving with both her time and attention, Saxe fosters a friendly and inclusive work environment (one of the C2C Mentoring Guideposts). Most notably, she welcomes to her lab collaborators of diverse academic backgrounds: “For some, working in Rebecca's lab is an introduction to academic research,” one advisee said. “These opportunities are incredibly meaningful and result in a more diverse and rich lab environment.”
This attitude of inclusion does not stop with her lab or with academic colleagues. Saxe consistently finds opportunities to engage with the broader public: one advisee remarked that she “not only encourages lab members to organize and hold outreach events, but she herself gives talks and presentations to general audiences, such as TED talks, stage shows at the MIT Museum, and presentations associated with Cambridge Science Festival.” Her TED talk has been viewed nearly 3 million times and its transcription has been translated into 33 languages.
For Saxe, good mentorship is a crucial component of her own work. When asked how she balances all of her responsibilities, Saxe notes: “As a scientist, I do almost all of my real work in collaboration with students and post-docs. So, advising and mentoring is very compatible with my responsibility to do research.”
Matthew Shoulders: Thriving together
Shoulders encourages peer mentoring in his lab, emphasizing that “a key part of being a scientist is learning to mentor others effectively.” In order to implement this system of peer-support, Shoulders says, “I ask each student to develop their own mentoring skills and track record, and then I work with them to guide these efforts.”
In nomination letters submitted to C2C, students note that Shoulders “reminds us that being a good scientist involves not only stewarding the time and resources we have, but also caring for and supporting those we work with.”
Now the Whitehead Career Development Associate Professor in Chemistry, Shoulders earned his BS in chemistry from the Virginia Polytechnic Institute and State University and his PhD in organic chemistry from the University of Wisconsin-Madison. After a post-doctoral fellowship with the American Cancer Society at the Scripps Research Institute, Shoulders joined the MIT professoriate in 2012.
Among the good mentoring practices that Shoulders promotes in his lab is speaking openly and honestly about a healthy work-life balance (a Mentoring Guidepost). Shoulders says: “I encourage my co-workers to engage in ‘conscious prioritization,’ a process in which they should intentionally assess their priorities and time commitments to different aspects of their life, as well as the associated costs and benefits, several times a year.”
Shoulders provides his students with an open line of communication (another Mentoring Guidepost). “Early on, I learned that I could count on Matt to make himself available to chat about data, my latest crazy idea, or my existential crises about graduate school,” wrote one nominator. Shoulders often tells students to “just swing by his office,” and that open invitation remains even as his schedule becomes busier.
When asked for the best piece of advice he could give to an MIT student, Shoulders offered: “Don't waste time questioning yourself and whether you belong here - you do! Instead, find colleagues that support you and be courageous in pursuing your goals, whatever they may be. MIT is an awesome place, so spend your time here taking advantage of it and not worrying about success or failure.”
More on Committed to Caring (C2C)
The Committed to Caring (C2C) program is an initiative of the Office of Graduate Education and contributes to its mission of making graduate education at MIT “empowering, exciting, holistic, and transformative.”
C2C invites graduate students from across MIT’s campus to nominate professors whom they believe to be outstanding mentors. Selection criteria for the award include the scope and reach of an advisor’s impact on the experience of graduate students, excellence in scholarship, and demonstrated commitment to diversity and inclusion.
By recognizing the human element of graduate education, C2C seeks to encourage good advising and mentorship across MIT’s campus.
MIT Media Lab researchers have developed a wireless system that leverages the cheap RFID tags already on hundreds of billions of products to sense potential food contamination — with no hardware modifications needed. With the simple, scalable system, the researchers hope to bring food-safety detection to the general public.
Food safety incidents have made headlines around the globe for causing illness and death nearly every year for the past two decades. Back in 2008, for instance, 50,000 babies in China were hospitalized after eating infant formula adulterated with melamine, an organic compound used to make plastics, which is toxic in high concentrations. And this April, more than 100 people in Indonesia died from drinking alcohol contaminated, in part, with methanol, a toxic alcohol commonly used to dilute liquor for sale in black markets around the world.
The researchers’ system, called RFIQ, includes a reader that senses minute changes in wireless signals emitted from RFID tags when the signals interact with food. For this study they focused on baby formula and alcohol, but in the future, consumers might have their own reader and software to conduct food-safety sensing before buying virtually any product. Systems could also be implemented in supermarket back rooms or in smart fridges to continuously ping an RFID tag to automatically detect food spoilage, the researchers say.
The technology hinges on the fact that certain changes in the signals emitted from an RFID tag correspond to levels of certain contaminants within that product. A machine-learning model “learns” those correlations and, given a new material, can predict if the material is pure or tainted, and at what concentration. In experiments, the system detected baby formula laced with melamine with 96 percent accuracy, and alcohol diluted with methanol with 97 percent accuracy.
“In recent years, there have been so many hazards related to food and drinks we could have avoided if we all had tools to sense food quality and safety ourselves,” says Fadel Adib, an assistant professor at the Media Lab who is co-author on a paper describing the system, which is being presented at the ACM Workshop on Hot Topics in Networks. “We want to democratize food quality and safety, and bring it to the hands of everyone.”
The paper’s co-authors include: postdoc and first author Unsoo Ha, postdoc Yunfei Ma, visiting researcher Zexuan Zhong, and electrical engineering and computer science graduate student Tzu-Ming Hsu.
The power of “weak coupling”
Other sensors have also been developed for detecting chemicals or spoilage in food. But those are highly specialized systems, in which the sensor is coated with chemicals and tuned to detect specific contaminants. The Media Lab researchers instead aim for broader sensing. “We’ve moved this detection purely to the computation side, where you’re going to use the same very cheap sensor for products as varied as alcohol and baby formula,” Adib says.
RFID tags are stickers with tiny, ultra-high-frequency antennas. They come on food products and other items, and each costs around three to five cents. Traditionally, a wireless device called a reader pings the tag, which powers up and emits a unique signal containing information about the product it’s stuck to.
The researchers’ system leverages the fact that, when RFID tags power up, the small electromagnetic waves they emit travel into and are distorted by the molecules and ions of the contents in the container. This process is known as “weak coupling.” Essentially, if the material’s property changes, so do the signal properties.
A simple example of feature distortion is a container of air versus water. If a container is empty, the RFID tag will always respond at around 950 megahertz. If it’s filled with water, the water absorbs some of the signal’s energy, and the tag’s main response drops to around 720 megahertz. Feature distortions get far more fine-grained with different materials and different contaminants. “That kind of information can be used to classify materials … [and] show different characteristics between impure and pure materials,” Ha says.
In the researchers’ system, a reader emits a wireless signal that powers the RFID tag on a food container. Electromagnetic waves penetrate the material inside the container and return to the reader with distorted amplitude (strength of signal) and phase (angle).
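As a toy illustration (not the authors' actual pipeline), the amplitude and phase features described above can be computed from complex signal samples, assuming each sensing frequency's response is recorded as a single complex number; the values below are invented:

```python
import numpy as np

# Hypothetical complex responses, one per sensing frequency, as a reader
# might record them (values invented purely for illustration).
responses = np.array([0.8 + 0.2j, 0.5 - 0.4j, 0.3 + 0.6j])

amplitude = np.abs(responses)    # strength of the signal at each frequency
phase = np.angle(responses)      # angle of the signal (radians) at each frequency

# Stack both into one feature vector for a downstream learning model.
features = np.concatenate([amplitude, phase])
print(features.shape)  # (6,)
```

The same two quantities (magnitude and angle of a complex response) are standard signal-processing features, whatever the specific model that consumes them.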
When the reader extracts the signal features, it sends those data to a machine-learning model on a separate computer. In training, the researchers tell the model which feature changes correspond to pure or impure materials. For this study, they used pure alcohol and alcohol tainted with 25, 50, 75, and 100 percent methanol; baby formula was adulterated with varying percentages of melamine, from 0 to 30 percent.
“Then, the model will automatically learn which frequencies are most impacted by this type of impurity at this level of percentage,” Adib says. “Once we get a new sample, say, 20 percent methanol, the model extracts [the features] and weights them, and tells you, ‘I think with high accuracy that this is alcohol with 20 percent methanol.’”
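The paper's actual model is not described here, but the idea of learning which feature shifts correspond to which contamination level can be sketched with synthetic data and a simple nearest-centroid classifier; every number and name below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for training data: each sample is a vector of
# per-frequency feature shifts; labels are methanol percentages.
levels = [0, 25, 50, 75, 100]  # contamination levels used in training
true_centers = {lvl: rng.normal(lvl / 100.0, 0.05, size=8) for lvl in levels}

def make_samples(level, n=20, noise=0.02):
    """Draw noisy feature vectors around a level's (synthetic) true center."""
    return true_centers[level] + rng.normal(0.0, noise, size=(n, 8))

X = np.vstack([make_samples(lvl) for lvl in levels])
y = np.repeat(levels, 20)

# "Training": learn one centroid of feature shifts per contamination level.
centroids = {lvl: X[y == lvl].mean(axis=0) for lvl in levels}

def predict(sample):
    # Classify a new spectrum by its nearest learned centroid.
    return min(centroids, key=lambda lvl: np.linalg.norm(sample - centroids[lvl]))

print(predict(make_samples(50, n=1)[0]))
```

A real system would use a richer model and measured spectra, but the structure — features in, contamination level out — is the same.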
Broadening the frequencies
The system’s concept derives from a technique called radio frequency spectroscopy, which excites a material with electromagnetic waves over a wide frequency and measures the various interactions to determine the material’s makeup.
But there was one major challenge in adapting this technique for the system: RFID tags power up only within a very narrow bandwidth centered around 950 megahertz. Extracting signals in that limited bandwidth wouldn’t net any useful information.
The researchers built on a sensing technique they developed earlier, called two-frequency excitation, which sends two frequencies — one for activation and one for sensing — to measure the tag’s response across hundreds of frequencies. The reader sends a signal at around 950 megahertz to power the RFID tag. Once the tag activates, the reader sends another signal that sweeps a range of frequencies from around 400 to 800 megahertz and records the feature changes across all of them.
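The shape of that sweep can be sketched in Python; `measure_response` is a placeholder for hardware this sketch obviously doesn't model, and the 10-megahertz step size is an assumption, not a detail from the paper:

```python
# Sketch of a two-frequency sweep: the tag stays powered at the
# activation frequency while the reader steps a separate sensing frequency.
ACTIVATION_MHZ = 950
SENSING_MHZ = range(400, 801, 10)  # 400-800 MHz; step size is assumed

def measure_response(freq_mhz):
    """Placeholder for the reader's measurement at one sensing frequency;
    real hardware would return the tag's complex backscatter response."""
    return complex(1.0 / freq_mhz, 0.0)  # dummy value for illustration

# One sweep yields a spectrum: frequency -> (amplitude, phase) response.
spectrum = {f: measure_response(f) for f in SENSING_MHZ}
print(len(spectrum))  # 41 sensing frequencies
```

Each entry in `spectrum` would then be turned into the amplitude and phase features the classifier consumes.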
“Given this response, it’s almost as if we have transformed cheap RFIDs into tiny radio frequency spectroscopes,” Adib says.
Because the shape of the container and other environmental aspects can affect the signal, the researchers are currently working on ensuring the system can account for those variables. They are also seeking to expand the system’s capabilities to detect many different contaminants in many different materials.
“We want to generalize to any environment,” Adib says. “That requires us to be very robust, because you want to learn to extract the right signals and to eliminate the impact of the environment from what’s inside the material.”
An endless wait in a crowded room. The official's impassive expression while handling a client in need.
Exasperating and sometimes infuriating public service bureaucracies are things with which Bernardo Zacka '05, a newly appointed assistant professor of political science, is well acquainted.
"These are episodes where you feel powerless, where the authority you're dealing with doesn't appear to be a person," he says. "One's impression is of dealing with the rule of nobody. But even then, of course, you are still dealing with someone."
What is it like to be that person, on the other side of the counter? And what are the moral and political challenges that one encounters when performing such a role?
To find out, Zacka chose to immerse himself in the world of bureaucrats. For his doctoral research exploring the everyday moral lives of workers at an antipoverty agency, Zacka did not just observe and report. He joined as a participant, working as a receptionist at the agency over a period of eight months.
"This was an unusual step, a bit of a methodological oddity for a political theorist," Zacka admits.
His ethnographic approach to the subject, as well as an analysis that ranged across the fields of political philosophy, sociology, and anthropology, resulted in an award-winning thesis for Harvard University's Department of Government. From his doctoral work, he also fashioned a book: “When the State Meets the Street: Public Service and Moral Agency,” which was published in 2017 by Harvard University Press and received the 2018 Charles Taylor Book Award from the American Political Science Association.
Zacka's research sought to reveal how the state interacts with its citizens — not on a theoretical basis, but at the most prosaic, everyday level, where representatives of a state institution deal with clients whose needs are real and pressing.
"I wanted to understand not just how the organization functions, but how the employees went about being responsible, sensible moral agents, performing the roles they were entrusted with," he says.
To accomplish this, Zacka detailed encounters between his colleagues at a quasi-public agency in a northeastern U.S. city and clients seeking help with such services as housing, heating, food, and health care. He witnessed workers trying to meet the needs of clients in situations where government funds were short, or programs curtailed.
"These workers, who are often vilified, act in conditions where it is difficult to do the right thing, or to even figure out what that is, either because resources are limited, or because of conflicting demands placed on them," he says.
Front-line bureaucrats, Zacka continues, "often start out with good intentions — caring for clients, but caring so much they can't contend with the inevitable failures.” The effort of sustaining a public service ethos results either in burnout or in coping strategies that simplify the moral landscape, but that are troubling in their own right.
While some critics of public service bureaucracies zero in on waste and fixate on managerial solutions, Zacka points instead to the moral complexities faced by frontline workers, stuck in untenable situations with little institutional support.
Zacka's interest in the nitty-gritty ways a state interacts with its citizens dates back to Lebanon's civil war, which in the 1980s shattered civil life in that nation and led to displacement for his family and many others. While he does not remember the war itself, living in its aftermath "generated all sorts of puzzles," he says. One example: "What makes for a stable society, and how does one evaluate the comparative advantages of political systems?"
He read voraciously about history and politics and excelled at math and sciences. At a career orientation in high school, he learned about MIT and its research opportunities for undergraduates. Excited by this prospect, he applied and pursued a major in electrical engineering and computer science, with a special concentration in artificial intelligence.
But Zacka's perspective shifted after taking humanities classes such as 17.01 (Justice). "I came to think that maybe I was more curious about how power operates, how we justify our actions to one another, than continuing in AI," he says.
A two-year stint consulting with McKinsey & Company after graduation gave him an understanding of the workings of organizations.
"I learned how incentives are set, and rules decided on," he says. "I became interested in organizational theory, which proved a useful angle to bring to the study of the state."
Eager to return to his long-standing preoccupation with political institutions, he entered graduate school at Harvard. There he found that most of his readings about the rule of law and the structure of democracies operated at a level of abstraction quite distant from our ordinary experiences of political institutions.
"Interactions like crossing a checkpoint, or meeting a border agent were missing from the books I read, and I was puzzled by that," he says. "So I asked a simple question: What would happen if we paid closer attention to the phenomenology of everyday encounters with the state?"
Today, as the sole political theorist in the Department of Political Science, Zacka continues to explore this question, staking out a novel place for himself in the field with his bottom-up, interdisciplinary research methodology. His next book project, involving the architecture of welfare agencies, uses photographs, films, and novels to look at how physical environments mediate interactions with the state.
MIT undergraduates might get a taste of his approach. He is now teaching 17.01, the course that helped set him on his current path.
"I'm applying my own twist, broadening the materials students are exposed to, introducing different traditions of political thought," he says. "There aren't too many people at MIT who specialize in teaching moral and political theory, and it's a privilege to help shape these areas at the Institute."
A new GIS and Data Lab for the MIT community is now open in Rotch Library, bringing together the MIT Libraries’ programs in GIS (geographic information systems) and data management services in one location and expanding resources for statistical analysis, data visualization, text mining, image processing, and virtual and augmented reality.
A redesign of the first floor of the Rotch Library addition (Building 7A) this fall allowed the libraries to expand service space as well as create a vibrant community workspace. The GIS Lab, which had previously been located on the third floor of Rotch Library, has moved to the first floor and now shares space with the libraries’ data management services. The new space combines an area for teaching and learning, open study or collaboration space, and staff offices.
“Libraries are about the intersections of people, information, and technology and how those intersections generate new knowledge and lead people to discover it,” says Howard Silver, head of data and specialized services at MIT Libraries. “The new GIS and Data Lab exemplifies how the MIT Libraries actively participate in creative work and will help us continue to explore the best uses of our resources in this evolving area.”
Other new features of the lab include:
- expanded computing capabilities and more workstations;
- increased software offerings to support statistical analysis, data visualization, text mining, and image processing;
- open space for collaboration and study; and
- space dedicated to creating and testing virtual and augmented reality projects.
“The GIS Lab has given me the space and support to learn and improve my research in political science,” says PhD candidate Jesse Clark. “The lab has allowed me to be much more efficient with my work with their in-person help and office hours, and GIS librarians have taught me new skills and ways of looking at data. I am very excited to use the new space to lead GIS seminars as well as for my own research.”
Through Dec. 12, the lab is staffed for one-to-one GIS help Monday through Thursday, 1-5 p.m. and Friday 2-4 p.m. Data management help is available Tuesday and Thursday 2-4 p.m. and Wednesday 2-5 p.m.
New electron microscopy techniques can help solve corrosion problems that are worth millions of dollars to industrial companies, BP Amoco Chemical Company Senior Research Chemist Matthew Kulzick told the MIT Materials Research Laboratory (MRL) Materials Day Symposium last month.
“Materials science is critical. It’s really material in the financial sense,” Kulzick said. “Solutions demand timely and accurate information. If I’m going to solve a problem, I’ve got to know what’s actually going on, and to do that I need all of these different interrelated tools to be able to go in and find out what’s happening in systems that are important to us.”
New X-ray technologies and sample chambers, he said, are producing stunning images at 20-nanometer scale showing highly localized composition of materials. “The current evolution of tools is spectacular,” he told the audience at the Oct. 10 event.
Beginning in 2003, Kulzick built a new inorganic characterization capability for BP Amoco Chemical, MRL Associate Director Mark Beals said. Kulzick has been working with Nestor J. Zaluzec, a senior scientist at Argonne National Laboratory, as well as with the BP International Center for Advanced Materials, whose partners include the University of Manchester, Imperial College London, the University of Cambridge, and the University of Illinois at Urbana-Champaign.
He outlined advances in imaging technology such as the Pi Steradian Transmission X-ray Detection System developed at the U.S. Department of Energy’s Argonne National Laboratory, and advances in sample holder technology, developed collaboratively by BP and Protochips, that allow analysis of materials in gas- or liquid-filled chambers. Microscopic measurements using these holders, or cells, which can include micro-electro-mechanical systems, are called in situ techniques.
“A number of years ago we worked with Protochips, and we modified that holder technology to allow the X-rays coming out of that system to get to the detector,” Kulzick explained.
Images of a palladium and copper-based automotive catalyst from four different generations of energy-dispersive X-ray technology illustrated the evolution from images lacking in detail to a nanoscale compositional image acquired in just 2.5 seconds that shows the location of palladium in the chemical structure.
“So it’s really transformative in understanding what’s happening chemically at the nanoscale,” Kulzick said.
Placing a closed cell filled with hydrogen gas to simulate reduction of the catalyst inside a transmission electron microscope produced images that showed palladium particles remained unaffected while copper particles either migrated toward palladium particles or clustered together with other copper particles.
“We can actually observe the changes that are happening in that localized area under reduction, and this is extremely important if we really want to understand what’s happening,” Kulzick said. “All of that diversity is occurring in what amounts to roughly a square micron of area on the surface of the material.”
“Just imagine what I could do with this kind of technology with regard to understanding how to activate a catalyst, how to regenerate a catalyst,” he said.
Techniques developed by Professor M. Grace Burke at the University of Manchester in the UK allow observation of chemical changes in a piece of metal over a period of hours, such as a manganese sulfide inclusion dissolving out of a small piece of stainless steel soaking in water, Kulzick said. “This proved a point for her with regard to corrosion mechanisms that are relevant in the nuclear industry, where they worry about what’s initiating crack formation. She has argued for years that attack of the manganese sulfide by water was one of the underlying mechanisms,” he said.
A significant advance for analyzing organic materials is direct electron capture cameras, Kulzick said.
“One of the problems with bombarding things with electrons is beam damage, so you want to use as little as you can with the right energies,” he added. “The direct electron capture cameras allowed us to reduce that dose.”
For example, Qian Chen, assistant professor of materials science and engineering at the University of Illinois at Urbana-Champaign, has used this enhanced sensitivity and lower-dose radiation to do a series of images at differing tilts to generate a three-dimensional image of a polymer membrane. Computational image analysis becomes important with these 3D structural images. “Without the ability to digitize that material like we’ve done, we would never be able to understand this diversity of structure and make it more rational,” Kulzick said.
Further analysis of the polymer membrane — soaked in a solution of zinc and lead — with Analytical Electron Microscope (AEM) techniques developed by Zaluzec at Argonne National Laboratory revealed that different ions enter into the polymer membrane at different locations. Kulzick said the next step is to understand how ions interact with the membrane structure and how that impacts permeation in the systems. Chen also analyzed the polymer membrane in water inside a graphene cell, Kulzick said, and that work showed swelling of the membrane.
“We hope to put all these pieces together and form a really detailed understanding of how a system like this functions,” he said.
Professor Emerita Catherine Vakar Chvany, a renowned Slavic linguist and literature scholar who played a pivotal role in advancing the study of Russian language and literature in MIT’s Foreign Languages and Literatures Section (now Global Studies and Languages), died on Oct. 19 in Watertown, Massachusetts. She was 91.
Chvany served on the MIT faculty for 26 years before her retirement in 1993.
Global Studies and Languages head Emma Teng noted that MIT’s thriving Russian studies curriculum today is a legacy of Chvany’s foundational work in the department. And Maria Khotimsky, senior lecturer in Russian, said, “Several generations of Slavists are grateful for Professor Chvany’s inspiring mentorship, while her works in Slavic poetics and linguistics are renowned in the U.S. and internationally.”
A prolific and influential scholar
A prolific scholar, Chvany wrote "On the Syntax of Be-Sentences in Russian" (Slavica Publishers, 1975); and co-edited four volumes: "New Studies in Russian Language and Literature" (Slavica, 1987); "Morphosyntax in Slavic" (Slavica, 1980); "Slavic Transformational Syntax" (University of Michigan, 1974); and "Studies in Poetics: Commemorative Volume: Krystyna Pomorska" (Slavica Publishers, 1995).
In 1996, linguists Olga Yokoyama and Emily Klenin published an edited collection of her work, "Selected Essays of Catherine V. Chvany" (Slavica).
In her articles, Chvany took up a range of issues in linguistics, including not only variations on the verb “to be” but also hierarchies of situations in syntax of agents and subjects; definiteness in Bulgarian, English, and Russian; other issues of lexical storage and transitivity; hierarchies in Russian cases; and issues of markedness, including an important overview, “The Evolution of the Concept of Markedness from the Prague Circle to Generative Grammar.”
In literature she took up language issues in the classic "Tale of Igor's Campaign," Teffi’s poems, Nikolai Leskov’s short stories, and a novella by Aleksandr Solzhenitsyn.
From Paris to Cambridge
“Catherine Chvany was always so present that it is hard to think of her as gone,” said MIT Literature Professor Ruth Perry. “She had strong opinions and wasn't afraid to speak out about them.”
Chvany was born on April 2, 1927, in Paris, France, to émigré Russian parents. During World War II, she and her younger sister Anna were sent first to the Pyrenees and then to the United States with assistance from a courageous young Unitarian minister’s wife, Martha Sharp.
Fluent in Russian and French, Chvany quickly mastered English. She graduated from the Girls’ Latin School in Boston in 1946 and attended Radcliffe College from 1946 to 1948. She left school to marry Lawrence Chvany and raise three children, Deborah, Barbara, and Michael.
In 1961-63, she returned to school and completed her undergraduate degree in linguistics at Harvard University. She began her career as an instructor of Russian language at Wellesley College in 1966 and received her PhD in Slavic languages and literatures from Harvard in 1970.
She joined the faculty at MIT in 1967 and became an assistant professor in 1971, an associate professor in 1974, and a full professor in 1983.
Warmth, generosity, and friendship
Historian Philip Khoury, who was dean of the School of Humanities, Arts and Social Sciences during the latter years of Chvany’s time at MIT, remembered her warmly as “a wonderful colleague who loved engaging with me on language learning and how the MIT Russian language studies program worked.”
Elizabeth Wood, a professor of Russian history, recalled the warm welcome that Chvany gave her when she came to MIT in 1990: “She always loved to stop and talk at the Tuesday faculty lunches, sharing stories of her life and her love of Slavic languages.”
Chvany’s influence was broad and longstanding, in part as a result of her professional affiliations. Chvany served on the advisory or editorial boards of "Slavic and East European Journal," "Russian Language Journal," "Journal of Slavic Linguistics," "Peirce Seminar Papers," "Essays in Poetics" (United Kingdom), and "Supostavitelno ezikoznanie" (Bulgaria).
Emily Klenin, an emerita professor of Slavic languages and literature at the University of California at Los Angeles, noted that Chvany had a practice of expressing gratitude to those whom she mentored. She connected that practice to Chvany’s experience of being aided during WWII. “Her warm and open attitude toward life was reflected in her continuing interest and friendship for the young people she mentored, even when, as most eventually did, they went on to lives involving completely different academic careers or even no academic career at all,” Klenin said.
Memorial reception at MIT on November 18
Chvany is survived by her children, Deborah Gyapong and her husband Tony of Ottawa, Canada; Barbara Chvany and her husband Ken Silbert of Orinda, California; and Michael Chvany and his wife Sally of Arlington, Massachusetts; her foster-brother, William Atkinson of Cambridge, Massachusetts; six grandchildren; and nine great-grandchildren.
A memorial reception will be held on Sunday, Nov. 18, from 1:30 to 4:00 p.m. in the Samberg Conference Center, 7th floor. Donations in Chvany’s name may be made to the Unitarian Universalist Association. Visit Friends of the UUA for online donations. Please RSVP to Michael Chvany, Mike@BridgeStreetProductions.com, if you plan to attend the memorial.
The fossil fuels that provide much of the world’s energy originate in a type of rock known as kerogen, and the potential for recovering these fuels depends crucially on the size and connectedness of the rocks’ internal pore spaces.
Now, for the first time, a team of researchers at MIT and elsewhere has captured three-dimensional images of kerogen’s internal structure, with a level of detail more than 50 times greater than has been previously achieved. These images should allow more accurate predictions of how much oil or gas can be recovered from any given formation. This wouldn’t change the capability for recovering these fuels, but it could, for example, lead to better estimates of the recoverable reserves of natural gas, which is seen as an important transition fuel as the world tries to curb the use of coal and oil.
The findings are reported this week in the Proceedings of the National Academy of Sciences, in a paper by MIT Senior Research Scientist Roland Pellenq, MIT Professor Franz-Josef Ulm, and others at MIT, CNRS and Aix-Marseille Université (AMU) in France, and Shell Technology Center in Houston.
The team, which published results two years ago on an investigation of kerogen pore structure based on computer simulations, used a relatively new method called electron tomography to produce the new 3-D images, which have a resolution of less than 1 nanometer, or billionth of a meter. Previous attempts to study kerogen structure had never imaged the material at resolutions finer than 50 nanometers, Pellenq says.
Fossil fuels, as their name suggests, form when organic matter such as dead plants gets buried and mixed with fine-grained silt. As these materials get buried deeper, over millions of years the mix gets cooked into a mineral matrix interspersed with a mix of carbon-based molecules. Over time, with more heat and pressure, the nature of that complex structure changes.
The process, a slow pyrolysis, involves “cooking oxygen and hydrogen, and at the end, you get a piece of charcoal,” Pellenq explains. “But in between, you get this whole gradation of molecules,” many of them useful fuels, lubricants, and chemical feedstocks.
The new results show for the first time a dramatic difference in the nanostructure of kerogen depending on its age. Relatively immature kerogen (whose actual age depends on the combination of temperatures and pressures it has been subjected to) tends to have much larger pores but almost no connections among those pores, making it much harder to extract the fuel. Mature kerogen, by contrast, tends to have much tinier pores, but these are well-connected in a network that allows the gas or oil to flow easily, making much more of it recoverable, Pellenq explains.
The study also reveals that the typical pore sizes in these formations are so small that normal hydrodynamic equations used to calculate the way fluids move through porous materials won’t work. At this scale the material is in such close contact with the pore walls that interactions with the wall dominate its behavior. The research team thus had to develop new ways of calculating the flow behavior.
“There’s no fluid dynamics equation that works in these subnanoscale pores,” he says. “No continuum physics works at that scale.”
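One way to see why continuum equations break down at this scale is the Knudsen number, the ratio of a molecule's mean free path to the pore size: continuum fluid dynamics assumes this ratio is very small, but in subnanometer pores it becomes large. A minimal sketch with illustrative numbers (the specific values are assumptions, not taken from the paper):

```python
# Knudsen number Kn = mean free path / characteristic pore size.
# Continuum descriptions such as Navier-Stokes are reliable only when
# Kn << 1; once Kn approaches or exceeds 1, molecule-wall interactions
# dominate and molecular-scale models are needed.

def knudsen(mean_free_path_nm: float, pore_size_nm: float) -> float:
    return mean_free_path_nm / pore_size_nm

# Hypothetical values: a gas mean free path of ~5 nm under confinement,
# compared across pore scales.
kn_conventional = knudsen(5.0, 1000.0)  # micron-scale pore: Kn ~ 0.005
kn_kerogen = knudsen(5.0, 0.8)          # subnanometer pore: Kn ~ 6
```

The contrast between the two values is the point: in the conventional pore the continuum assumption holds comfortably, while in the kerogen-scale pore it fails outright.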
To get these detailed images of the structure, the team used electron tomography, in which a small sample of the material is rotated within the microscope as a beam of electrons probes the structure to provide cross-sections at one angle after another. These are then combined to produce a full 3-D reconstruction of the pore structure. While scientists had been using the technique for a few years, they hadn’t applied it to kerogen structures until now. The imaging was carried out at the CINaM lab of CNRS and AMU, in France (in the group of Daniel Ferry), as part of a long-term collaboration with MultiScale Materials Science for Energy and Environment, the MIT/CNRS/AMU joint lab located at MIT.
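The combining step can be illustrated with a toy back-projection example (made-up numbers, only two viewing angles, and 2-D rather than 3-D; real electron tomography combines many tilt angles and uses filtered reconstruction algorithms):

```python
# Toy tomographic reconstruction of a 2x2 "sample": projections
# (line sums) are taken at two angles, then smeared back over the grid.
grid = [[3, 1],
        [0, 2]]

row_proj = [sum(row) for row in grid]                       # 0-degree view
col_proj = [sum(row[j] for row in grid) for j in range(2)]  # 90-degree view

# Unfiltered back-projection: each pixel receives the average of the
# projections that pass through it. The result is a blurred version of
# the original, which is why real reconstructions need many angles and
# filtering to recover fine structure.
recon = [[(row_proj[i] + col_proj[j]) / 2 for j in range(2)]
         for i in range(2)]
```

With only two views the reconstruction cannot distinguish pixels within a row or column; each additional tilt angle adds constraints that sharpen the recovered structure.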
“With this new nanoscale tomography, we can see where the hydrocarbon molecules are actually sitting inside the rock,” Pellenq says. Once they obtained the images, the researchers were able to use them, together with molecular models of the structure, to improve the fidelity of their simulations and calculations of flow rates and mechanical properties. This could shed light on how production rates decline in oil and gas wells, and perhaps on how to slow that decline.
So far, the team has studied samples from three different kerogen locations and found a strong correlation between the maturity of the formation and its pore size distribution and pore void connectivity. The researchers now hope to expand the study to many more sites and to derive a robust formula for predicting pore structure based on a given site’s maturity.
The work was supported by Royal Dutch Shell and Schlumberger through the MIT X-Shale Hub, and Total through the MIT/CNRS FASTER-Shale project.
A new approach to controlling magnetism in a microchip could open the doors to memory, computing, and sensing devices that consume drastically less power than existing versions. The approach could also overcome some of the inherent physical limitations that have been slowing progress in this area until now.
Researchers at MIT and at Brookhaven National Laboratory have demonstrated that they can control the magnetic properties of a thin-film material simply by applying a small voltage. Changes in magnetic orientation made in this way remain in their new state without the need for any ongoing power, unlike today’s standard memory chips, the team has found.
The new finding is being reported today in the journal Nature Materials, in a paper by Geoffrey Beach, a professor of materials science and engineering and co-director of the MIT Materials Research Laboratory; graduate student Aik Jun Tan; and eight others at MIT and Brookhaven.
As silicon microchips draw closer to fundamental physical limits that could cap their ability to continue increasing their capabilities while decreasing their power consumption, researchers have been exploring a variety of new technologies that might get around these limits. One of the promising alternatives is an approach called spintronics, which makes use of a property of electrons called spin, instead of their electrical charge.
Because spintronic devices can retain their magnetic properties without the need for constant power, which silicon memory chips require, they need far less power to operate. They also generate far less heat — another major limiting factor for today’s devices.
But spintronic technology suffers from its own limitations. One of the biggest missing ingredients has been a way to easily and rapidly control the magnetic properties of a material electrically, by applying a voltage. Many research groups around the world have been pursuing that challenge.
Previous attempts have relied on electron accumulation at the interface between a metallic magnet and an insulator, using a device structure similar to a capacitor. The electrical charge can change the magnetic properties of the material, but only by a very small amount, making it impractical for use in real devices. There have also been attempts at using ions instead of electrons to change magnetic properties. For instance, oxygen ions have been used to oxidize a thin layer of magnetic material, causing extremely large changes in magnetic properties. However, the insertion and removal of oxygen ions causes the material to swell and shrink, causing mechanical damage that limits the process to just a few repetitions — rendering it essentially useless for computational devices.
The new finding demonstrates a way around that, by using hydrogen ions instead of the much larger oxygen ions used in previous attempts. Since the hydrogen ions can zip in and out very easily, the new system is much faster and provides other significant advantages, the researchers say.
Because the hydrogen ions are so much smaller, they can enter and exit from the crystalline structure of the spintronic device, changing its magnetic orientation each time, without damaging the material. In fact, the team has now demonstrated that the process produces no degradation of the material after more than 2,000 cycles. And, unlike oxygen ions, hydrogen can easily pass through metal layers, which allows the team to control properties of layers deep in a device that couldn’t be controlled in any other way.
“When you pump hydrogen toward the magnet, the magnetization rotates,” Tan says. “You can actually toggle the direction of the magnetization by 90 degrees by applying a voltage — and it’s fully reversible.” Since the orientation of the poles of the magnet is what is used to store information, this means it is possible to easily write and erase data “bits” in spintronic devices using this effect.
Beach, whose lab discovered the original process for controlling magnetism through oxygen ions several years ago, says that initial finding unleashed widespread research on a new area dubbed “magnetic ionics,” and now this newest finding has “turned on its end this whole field.”
“This is really a significant breakthrough,” says Chris Leighton, the Distinguished McKnight University Professor in the Department of Chemical Engineering and Materials Science at the University of Minnesota, who was not involved in this work. “There is currently a great deal of interest worldwide in controlling magnetic materials simply by applying electrical voltages. It’s not only interesting from the fundamental side, but it’s also a potential game-changer for applications, where magnetic materials are used to store and process digital information.”
Leighton says, “Using hydrogen insertion to control magnetism is not new, but being able to do that in a voltage-driven way, in a solid-state device, with good impact on the magnetic properties — that is pretty significant!” He adds, “this is something new, with the potential to open up additional new areas of research. … At the end of the day, controlling any type of materials function by literally flipping a switch is pretty exciting. Being able to do that quickly enough, over enough cycles, in a general way, would be a fantastic advance for science and engineering.”
Essentially, Beach explains, he and his team are “trying to make a magnetic analog of a transistor,” which can be turned on and off repeatedly without degrading its physical properties.
Just add water
The discovery came about, in part, through serendipity. While experimenting with layered magnetic materials in search of ways of changing their magnetic behavior, Tan found that the results of his experiments varied greatly from day to day for reasons that were not apparent. Eventually, by examining all the conditions during the different tests, he realized that the key difference was the humidity in the air: The experiment worked better on humid days compared to dry ones. The reason, he eventually realized, was that water molecules from the air were being split up into oxygen and hydrogen on the charged surface of the material, and while the oxygen escaped to the air, the hydrogen became ionized and was penetrating into the magnetic device — and changing its magnetism.
The device the team has produced consists of a sandwich of several thin layers, including a layer of cobalt where the magnetic changes take place, sandwiched between layers of a metal such as palladium or platinum, and with an overlay of gadolinium oxide, and then a gold layer to connect to the driving electrical voltage.
The magnetism gets switched with just a brief application of voltage and then stays put. Reversing it requires no power at all, just short-circuiting the device to connect its two sides electrically, whereas a conventional memory chip requires constant power to maintain its state. “Since you’re just applying a pulse, the power consumption can go way down,” Beach says.
The new devices, with their low power consumption and high switching speed, could eventually be especially useful for applications such as mobile computing, Beach says, but the work is still at an early stage and will require further development.
“I can see lab-based prototypes within a few years or less,” he says. Making a full working memory cell is “quite complex” and might take longer, he says.
The work was supported by the National Science Foundation through the Materials Research Science and Engineering Center (MRSEC) Program.
In October, the Institute announced the creation of the MIT Stephen A. Schwarzman College of Computing, an ambitious new enterprise that will allow students to better tailor their educational interests to their goals. But the ideas driving this exciting new effort may carry a distant echo — especially among alumni who were at MIT during the 1980s — from the time leadership launched another computing enterprise that dramatically changed how undergraduates and graduate students learned.
Project Athena was a campus-wide effort to make the tools of computing available to every discipline at the Institute and provide students with systematic access to computers. A new project that featured computer workstations and educational programming, Athena was a milestone in the history of distributed systems and inspired programs like Kerberos. It also revolutionized educational computing for the Institute and beyond, and created the computing environment that many students and faculty still work in today.
“Before we had [Athena], our students complained about the lack of computing in such a technology-centered institution,” says Joel Moses, an Institute Professor at MIT and one of the initial leaders of Project Athena. “Athena turned MIT into one of the most computer-rich institutions in the country.”
“The founders of Project Athena believed that computation should be used broadly by a lot of people for a lot of reasons,” says Daniela Rus, the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science and director of MIT’s Computer Science and Artificial Intelligence Laboratory.
“They set out to create an education environment to empower MIT students to do just that. Since then, the MIT faculty and students have left their fingerprints all over the biggest accomplishments in the field of computing from systems to theory to artificial intelligence,” she says.
In 1983, the year Project Athena began, it was still possible for students to receive a science or engineering degree from MIT without ever having touched a computer. That was despite digital computers having been on campus since 1947, when the Navy commissioned Whirlwind I, one of the world’s first real-time computers. (It was powered by vacuum tubes.) But at the time, computers were nearly all provided by research funds which restricted their use.
Pre-Athena, MIT students who needed to use computers could work on computing systems such as CTSS. These systems did have some drawbacks, though. For one thing, students often had to wait in line at all hours of the day to do their work. In 1969, the Institute moved from CTSS to MULTICS, which was supported primarily by research funds with limited access for educational purposes. It included a timeshare aspect which meant that if students went over their allotted time, they weren’t allowed to run any more programs until the timeshare refreshed.
“(Before Athena), there was no internet access or email, no way to share files, and no standard anything. There was no @mit.edu address,” says Earll Murman, the director of the latter half of the eight-year project. “Athena changed all of that.”
Even when personal computers first started to appear on campus in the mid-to-late 1980s, they were still too expensive for many students. For about $1,000 a consumer could buy a computer with a 5 MB hard drive — which today is about enough space to store an MP3 of Bonnie Tyler's “Total Eclipse of the Heart,” the song that topped the charts in Athena’s first year.
The leadership at MIT knew that as a technology-centered school, MIT needed to incorporate more computing into their education. So, in May 1983, under the care of a committee of faculty from the Department of Electrical Engineering and Computer Science — including then-EECS head Moses; Michael Dertouzos, the director of the Laboratory for Computer Science (now CSAIL); and Dean of the School of Engineering Gerald Wilson — the largest educational project ever undertaken at MIT was launched at the eventual cost of around $100 million. The project was largely paid for with funding from the Digital Equipment Corporation (DEC) and IBM.
The leaders of the project named it “Project Athena” after the ancient Greek goddess of wisdom, war, and the crafts. Unlike her namesake, however, Athena did not spring fully formed and outfitted with programs from the head of her creators. When the project started, it was ambitious and a little vague. Goals spanned from creating a cohesive network of compatible computers, to establishing a knowledge base for future decisions in educational computing, to helping students share information and learn more fully across all disciplines.
To supply the system with some clarity and direction, the committee went to the faculty and asked them to develop their own software for use in their classes and for students to work on. Projects — there were 125 in total — ranged from aerospace engineering simulations to language-learning applications to biology modeling tools.
Athena took off.
“I felt that we would know Athena was successful if we were surprised by some of the applications,” Moses says. “It turned out that our surprises were largely in the humanities.”
One such surprise was the Athena Writing Project, spearheaded by MIT professors James Paradis and Ed Barrett, which aimed to create an online classroom system for teaching scientific writing. The system allowed students to edit and annotate papers, present classwork, and turn in assignments.
Of course, in order for students to be able to use all the educational programming, there had to be enough terminals for them to access the system. That’s where Professor Jerome Saltzer came in. While much of the leadership of the project was focused on overseeing the faculty proposals and research, Saltzer stepped in as the technical director of the project in 1983 and led the effort to bring the physical workstations, made by IBM, to all students.
Luckily for Saltzer and MIT, Project Athena was on the cutting edge of distributed systems computing from its inception. The Institute found a range of partners in industry, such as IBM and DEC, that were willing to provide MIT with funding, technology, and hardware.
Project Athena formally ended in 1991. By then the project (and computing in general) had become much more pervasive and commonplace in MIT students’ lives. There were hundreds of Athena workstations located in clusters around the campus, and students were using them to measure blood flow, design airplane wings, practice political science debates, digitally revise their humanities papers, and do hundreds of other things.
Athena’s wisdom today
It has now been 27 years since Project Athena ended, but the Athena computing environment is still a part of everyday life at MIT. There are Athena clusters located around campus, with many workstations hooked up to printers and available to students 24 hours a day, seven days a week (although there are fewer workstations than there once were, and they are typically used for more specialized applications).
Though Project Athena’s main goals were educational, it had long-lasting effects on a range of technologies, products, and services that the MIT community touches every day, often without even knowing it. Athena’s impact can be seen in the integration of third-party software like Matlab into education. Its use of overlapping windows — students could be watching videos in one window, chatting with friends and classmates in another, and working on homework in a third — was the start of the X Window system, which is now common on Unix displays. Athena also led to the development of the Kerberos authentication system (named, in keeping with the Greek mythology motif, after the three-headed dog which guards the Underworld) which is now used at institutions around the world.
For Drew Houston ’05, Athena was a source of inspiration.
“With Athena, you could sit down at any of the (hundreds) of workstations on campus and your whole environment followed you around — not only your files,” he says. “When I graduated, not only did I not have that anymore, but it felt like for most people they didn't have anything like that, so I certainly saw a big opportunity to deliver that kind of experience to a much larger audience.”
The result was Dropbox, which Houston and his co-founder launched in 2008, allowing users to access their files from any synced device. “When we recruited engineers, part of our pitch was we were trying to build Athena for the rest of the world,” Houston says.
As MIT moves forward with the new college, Vice Chancellor Ian Waitz sees a parallel between the college and Project Athena. Like the new college, Athena was a way to change the structure of MIT’s education and provide a platform for students to create and problem-solve.
“One of the things that we do here is try to provide resources for people to use, and they may even use them in ways that we don't imagine,” Waitz says. “That’s a pretty broad analogy to a lot of the stuff that we do here at MIT — we bring bright people together and give them the tools and problems to solve, and they’ll go off and do it.”
“Computers have made our daily lives easier in a million ways that people don’t even notice, from online shopping to digital cameras, from antilock brakes to electronic health records, and everything in between,” adds Rus.
“Computing helps us with all the little things and it is also vital to the really big things like traveling to the stars, sequencing the human genome, making our food, medicines, and lives safer,” she says. “Through the MIT Schwarzman College of Computing we will create the education and research environment to make computing a stronger tool and find new ways to apply it.”
School of Engineering faculty are embracing the new MIT Stephen A. Schwarzman College of Computing as a bold response to the rapid evolution of computing that is altering and, in many cases, fundamentally transforming their disciplines.
Inspired by student interest in computing, MIT President L. Rafael Reif launched an assessment process more than a year ago that involved widespread engagement with key stakeholders across the MIT community. Discussions were led by President Reif, Provost Martin A. Schmidt, and Dean of the School of Engineering Anantha P. Chandrakasan with Faculty Chair Susan Silbey playing a key role.
“The creation of the college is MIT’s first major academic structural change since 1950,” says Chandrakasan, the Vannevar Bush Professor of Electrical Engineering and Computer Science. “After consulting with faculty from across engineering and throughout MIT, the need to do something timely and deeply impactful was abundantly clear. Mr. Schwarzman’s inspired and amazingly generous support was instrumental to our ability to move forward.”
The school’s eight department heads and two institute directors recently spoke of the exciting possibilities ahead as the college, which represents a $1 billion commitment, gets underway. There will be a new building, a new dean, and 50 new faculty positions located within the college and jointly with other departments across MIT.
School leadership says the college meets a significant need partly because it directly aligns with recent activities and changes in some of their own practices. For example, many departments have adapted their hiring and recruitment practices to include a heavier emphasis on selecting faculty who can work at a high level in computation along with another specialized field, says Chandrakasan. “In some ways the change has arrived,” he says. “The college is our way of building a powerful framework and environment for research and collaborations that involve computing and that are occurring across disciplines. The college remains a young idea and its vibrancy and success will depend on thoughtful input from people across MIT, which I look forward to hearing.”
At the forefront
The eye of the storm has undoubtedly been the Department of Electrical Engineering and Computer Science (EECS). Its faculty conduct research to advance core computing topics while also fielding a flood of requests to build bridges and connect their work with other disciplines. In the last two years alone, EECS faculty have established new joint academic programs with economics and urban science and planning.
The creation of the college will provide vital support and accelerate all kinds of computing-related research and learning that is happening across the Institute, says Asu Ozdaglar, School of Engineering Distinguished Professor of Engineering and EECS department head. “With the launch of the college, we hope that MIT’s leading position in research and the education of future leaders in computing will continue and grow.”
Markus Buehler, head of the Department of Civil and Environmental Engineering and the McAfee Professor of Engineering, agrees. “We have been at the forefront of this transformation of our discipline,” he says. The increased role of computing has affected all five of CEE’s strategic focus areas: ecological systems, resources, structures and design, urban systems, and global systems. As a result, the department is now planning a potential new major between CEE and computer science, and the college will help in that effort, says Buehler. “The creation of the college will serve as a key enabler,” he says.
The MIT Institute for Data, Systems, and Society is also deeply aligned with the college, says Munther Dahleh, director of IDSS and the William A. Coolidge Professor of Electrical Engineering and Computer Science. IDSS works with all five schools to promote cross-cutting education and research to advance data science and information and decision systems in order to address societal challenges in a systematic and rigorous manner. IDSS plays a “bridge” role that will prove useful to the college, Dahleh says. It has launched cross-disciplinary academic programs, hired joint faculty in three schools, and enabled collaborations across all five schools.
“The new college will provide a structure for expanding these activities,” he says. “And it will create new opportunities to connect with a larger community in sciences, social science, and urban planning and architecture.”
Steeped in computing
The timing is right for the college, say the faculty. “We are excited by the growth opportunities in computing because the nuclear science and engineering disciplines are so steeped in the development and application of numerical tools,” says Dennis Whyte, the Hitachi America Professor of Engineering and head of the Department of Nuclear Science and Engineering.
The Department of Aeronautics and Astronautics has a significant number of faculty working in information engineering for aerospace systems, particularly autonomous systems, says Daniel Hastings, the Cecil and Ida Green Education Professor at MIT and incoming head of the department.
“The college will allow us to expand our research and teaching into all the ways that computing technologies are changing the aerospace enterprise,” says Hastings. Those ways include deep learning to recognize patterns for maintenance in the operation of multiple aircraft, artificial intelligence for traffic control of fleets of uninhabited flying vehicles, and intelligent robotic systems in space to service low-Earth orbit satellites, among others.
Increasingly, the tools of machine learning and artificial intelligence are being fruitfully applied to materials design problems, says Christopher A. Schuh, the Danae and Vasilis Salapatas Professor of Metallurgy and head of the Department of Materials Science and Engineering (DMSE). “Our department sees computational thinking as a critical skill set for any budding materials scientist,” he says, adding that a large fraction of DMSE faculty focus on computational materials science or use computational methods in designing new materials.
“We are excited to see MIT focusing on computing broadly, and we look forward to a deep materials-centric engagement with the college,” he says.
Paula Hammond, the David H. Koch Professor in Engineering and head of the Department of Chemical Engineering, would like to see the college provide new opportunities and pathways for chemical engineering to grow. One-third of faculty in her department work with computation as their primary research method, she says.
Hammond looks forward especially to the arrival of new faculty. “I see these new positions as a chance to hire faculty members who are rooted in the molecular and systems-oriented thinking that defines our field, while doing research in new and important areas, including global problems in environment, energy, health, and water.” She says such interdisciplinary faculty would be instrumental in building a new computational major in chemical engineering (10-ENG) that is currently in development.
Douglas Lauffenburger, the Ford Professor of Bioengineering and head of MIT’s Department of Biological Engineering, expresses a similar hope. “The creation of the college is a bold step, and I'm hopeful that some of these additional faculty positions will enable a strengthening of computational biology on campus.”
Training the next generation
Faculty also spoke of how the college will enable MIT students to play leadership roles in the future of computing — and other engineering fields. “It will strengthen our ability to train the next generation of mechanical engineers and better prepare students to join the workforce by exposing them to computation and AI throughout their education,” says Evelyn N. Wang, the Gail E. Kendall Professor and head of the Department of Mechanical Engineering.
An increasing number of research fields within mechanical engineering rely on computing technologies — from smarter autonomous machines to more accurate extreme-event prediction and 3-D printing. “The college will help students and researchers working in these fields advance their groundbreaking research even further,” adds Wang.
Elazer Edelman, the director of the Institute for Medical Engineering and Science, says the potential is vast. “From access to critical data sets to insights derived from machine and deep learning, the college will enable all of us to better interact as a community to address important problems and to train the next batch of young stars at the interface of science, engineering, computing and medicine,” he says. Edelman is the Edward J. Poitras Professor of Medical Engineering and Science at MIT.
“We at IMES are particularly excited to work with the college in interacting as a global community of scholars from this incredibly exciting and imaginative platform,” he says.
The new MIT Stephen A. Schwarzman College of Computing will incorporate the modern tools of computing into disciplines across the Institute. “The college will equip students to be as fluent in computing and AI [artificial intelligence] as they are in their own disciplines — and ready to use these digital tools wisely and humanely to help make a better world,” says MIT President Rafael Reif.
As often happens, it appears MIT students are already there. We recently spoke with six undergraduate students who are participating in the Advanced Undergraduate Research Opportunities Program (SuperUROP), and found them already thinking deeply about how new computational technologies can be put to use in fields outside of computer science. These students are working on a huge range of problems that share a common theme: Solving them will provide tangible benefits to society.
Haripriya Mehta is working to augment human creativity by using machine learning algorithms to provide potential storylines and helpful drawings for blocked artists. Upon arrival at MIT, Mehta knew she wanted to focus on assistive technology to help people. She was originally interested in prosthetics but soon realized there are more ways than one to assist people.
“I’ve always been a raconteur of sorts, whether it's writing or dancing or playing the piano, and the idea of creative blocks has always interested me,” says Mehta, a third-year student in electrical engineering and computer science. “I want to explore how we can use deep learning to assist artists when we are sort of lost. It would be almost as if you're having a conversation with another artist but instead of an artist, it's a neural net.”
Mehta described widespread applications of such a machine learning model: storyboarding for artists; a creative task for the elderly to stave off early-onset Alzheimer’s; and an early childhood education tool to help children form sentences, create stories, and draw.
Senior Christabel Jemutai Sitienei is seeking to drive financial inclusion in East Africa through artificial intelligence. Growing up, she witnessed the mobile money industry spread across Kenya and fuel economic growth. More than 75 percent of adults in Kenya were able to open a bank account because of it. Now Sitienei wants to help Kenyans gain access to additional financial services and heightened business acumen.
“Born and bred in Kenya, and with my exposure to AI, I’m in a unique and privileged position to understand the problem,” she says. “I would like to design an app that informs decision making and saves money. It would change how people are building infrastructure and deploying resources.”
Sitienei came to MIT intent on studying mechanical or systems engineering. All that changed in her sophomore year when she developed a mobile app to help her parents in Kenya run their farm.
“When I started using my computing knowledge to solve my own problems, I just knew this was for me,” says Sitienei, who adopted electrical engineering and computer science (EECS) as a major. “The application I built for my parents has been so valuable to them even until now. I learned that I really like solving problems that I can relate to,” she says.
Gabe Margolis is developing machine learning methods for fast and accurate prediction of seafloor feature maps based on sparse data collected by autonomous underwater vehicles.
A third-year student in aeronautics and astronautics, Margolis knew from the outset he wanted to focus on cognitive robotics at MIT. He soon realized that EECS was not his only option. “It seems like the major that best represents my interests would be computer science — but there's actually a big application of artificial intelligence and autonomous systems in aerospace too,” he says.
“I realized the artificial intelligence aspects of aerospace engineering are really about exploring the unknown and that is something I think is really cool,” says Margolis.
Mathematics major Andy Wei is tackling machine learning and security. Wei, a fourth-year student, is combining his math and computer science skills to address things like data poisoning, which occurs when attackers inject a small amount of adversarial training data to compromise a neural network.
“If we're deploying neural networks, we better have a good understanding of how they are vulnerable to adversarial examples,” he says, describing the risks posed by inputs that are misclassified by the network but indistinguishable from natural data to the human eye.
“If people can somehow toy with the system and make some tweaks and the machine fails, that’s an important security issue to understand. I’m really excited to tackle the problem.”
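Wei’s actual research is not described in code here, but the data-poisoning idea he outlines can be sketched with a deliberately simple toy model. The hypothetical example below (all data and numbers are invented for illustration) trains a one-dimensional nearest-centroid classifier and shows how injecting a small number of mislabeled far-away training points drags one class centroid across the decision boundary:

```python
import random

random.seed(0)

# Hypothetical training data: class 0 clustered near 0.0, class 1 near 1.0.
train = [(random.gauss(0.0, 0.2), 0) for _ in range(200)] + \
        [(random.gauss(1.0, 0.2), 1) for _ in range(200)]

def fit(data):
    # Nearest-centroid "model": just the mean feature value of each class.
    return {c: sum(x for x, label in data if label == c) /
               sum(1 for _, label in data if label == c)
            for c in (0, 1)}

def accuracy(means, data):
    # Predict the class whose centroid is nearest; measure training accuracy.
    hits = sum(1 for x, label in data
               if min(means, key=lambda c: abs(x - means[c])) == label)
    return hits / len(data)

clean_acc = accuracy(fit(train), train)   # high: the clusters barely overlap

# Poisoning attack: inject 30 far-away points mislabeled as class 0
# (7.5% of the training set), dragging the class-0 centroid past class 1's.
poison = [(8.0, 0)] * 30
poisoned_acc = accuracy(fit(train + poison), train)

print(f"clean accuracy: {clean_acc:.2f}, poisoned accuracy: {poisoned_acc:.2f}")
```

A nearest-centroid model is used here only because its failure mode is easy to see; the attacks Wei studies target neural networks, where poisoned inputs can be crafted to look indistinguishable from natural data.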
And Mattie Wasiak is applying data analytics to health care. She is leveraging clinical data sets to optimize oxygen delivery to newborns. “I am excited to continue pursuing health care applications,” says Wasiak, a third-year student in electrical engineering and computer science.
“Since freshman year, I've been really interested in machine learning. I’ve been trying to determine what field exactly I want to apply it to,” she says.
Wasiak explored marketing and political science before landing on health care last semester. “I just felt like health care really resonated with me because you can see how a machine learning model that you produce can be used in the field and have an impact on people.”
For more information — and lots of other remarkable examples — visit the SuperUROP website.
Dana G. Mead PhD ’67, a prominent business leader, military officer, former White House official, and professor who served as chair of the MIT Corporation from 2003 until 2010, died on Oct. 31 in Boston.
Mead was a forward-looking leader at MIT who helped oversee a period of significant advancement, as the Institute expanded its research interests and took landmark steps to diversify the campus community, while remaining at the leading edge of engineering, science, and innovation.
During Mead’s tenure as the Corporation’s ninth chair, MIT broadened its research portfolio to include increased investment in the life sciences, and launched new centers such as the MIT Energy Initiative (MITEI). The Institute also grew its international research programs and global engagement, and, under Mead’s supervision, hired its first female president, Susan Hockfield, who was also the first life scientist to hold the position.
"When Dana Mead chaired the MIT Corporation, I was provost, so I had the immense privilege of learning from his wonderful leadership style and observing his intense commitment to sustaining MIT’s excellence, especially through bringing fresh perspectives to the Visiting Committees,” says MIT President L. Rafael Reif. “He understood as well as anyone that the Institute is a system — and that the quality of our Visiting Committees drives the quality of the whole enterprise. MIT continues to reap the benefits of his insight and thoughtful service.”
Hockfield also recalls Mead’s impact at the Institute: “I had the very great fortune to have Dana Mead at my side, as chair of the MIT Corporation, when I embarked on my service as MIT’s president,” she says. “Dana advised and encouraged me, generously sharing the prodigious wisdom he had gained over the course of a lifetime of service and leadership. He quickly became my trusted advisor. Dana deftly but unambiguously established lines of governance, strengthening the roles of both the Corporation and Institute leadership, to MIT’s great benefit.”
Additionally, Hockfield says, Mead’s personal qualities were an integral part of his leadership style.
“But even while Dana instructed us, he also amused us,” Hockfield says. “When a discussion had gone on too long, he often observed, ‘Everything has been said, but not everyone has had a chance to say it.’ With his wisdom and warmth, and his discipline and depth of curiosity, Dana Mead devotedly served MIT.”
As one product of Mead’s focus on diversity, the number of women serving as Corporation members grew by around 50 percent during his tenure, while representation by foreign members also increased by nearly 50 percent.
In 2009, Mead announced he would step aside as chairperson, in keeping with the Corporation’s by-laws, which require that members do not serve past age 75.
“I will miss working in this very vibrant and dynamic environment — the students, faculty, administrators, alumni and the like,” he said at the time. Mead then became a Corporation life member, emeritus.
Dana George Mead was born in Cresco, Iowa. His long and diverse career in leadership roles included phases in the military, government, private sector, and academia. Mead graduated from the United States Military Academy in 1957 with a BS in engineering, and then served as an Army officer in many roles for over two decades, including assignments as a troop leader in Germany, and a combatant and strategist in Vietnam.
In 1967, Mead completed his PhD studies in MIT’s Department of Political Science, having been selected for a fellowship as an officer. His dissertation, “United States peacetime strategic planning, 1920-1941: the color plans to the victory program,” examined the ways the U.S. military planned for the “next war” in the years following World War I, until the start of World War II.
After completing his academic work, Mead transitioned to government positions, including a fellowship in the White House Fellows program. He wrote military reports, and later served as the White House’s deputy director of the Domestic Council during the Nixon administration. He left Washington in 1974 to take a professorship back at the United States Military Academy.
“Other than his wife of 60 years, Nancy, and his family, our dad was most passionate about his varied and longstanding relationships with three great American institutions: MIT, West Point, and the White House Fellows Program,” says Dana Mead Jr., one of Mead’s sons.
Mead then moved into the private sector, forging a highly successful career in industrial management. Notably, for most of the 1990s, he was chair and CEO of Tenneco Inc., the conglomerate with businesses in oil and gas transmission, shipbuilding, auto parts, packaging, chemicals, and more. Mead oversaw the expansion of Tenneco’s operations across the globe and a concurrent rise in productivity and profitability at the company.
Mead recounted and analyzed many of these experiences at Tenneco in his 2000 book, “High Standards, Hard Choices,” which frankly examined his time in the executive suite and offered a look at his pragmatic style.
“There is no rocket science in quality management,” Mead wrote in the book. “Getting it right the first time, satisfying customers, reducing variations in process, and continually improving products is just common sense in business.”
At the same time, he wrote, “you will discover a lot of talent buried in the organization at all levels,” and allowing talent a chance to thrive is important. Such people, he adds, “not only know how to work the valves and switches in the middle of the night, but they probably have a lot of ideas on how to do it better. You have to put your ear to the ground and listen carefully to identify the people who are looked to and respected by their peers.”
As an extension of his corporate leadership roles, Mead served terms as chair of the National Association of Manufacturers from 1995 to 1997, and of the Business Roundtable from 1998 to 1999. He also served on the boards of Pfizer, Zurich, Textron, and Cummins.
Additionally, Mead was inducted into the American Academy of Arts and Sciences in 2009, and was a member of the Council on Foreign Relations. He served on the boards of the Pardee RAND Graduate School and the School of Public and Environmental Affairs at Indiana University. He also served on the National Board of Governors for the Boys and Girls Clubs of America, a long-time philanthropic interest.
Mead is survived by his wife, Nancy, as well as his two sons, Dana Jr. and Mark; his daughters-in-law, D’Arcy and Susie; his brother, Michael, and sister-in-law, Anna; and seven grandsons.
Funeral services will be held at the Old Cadet Chapel at the West Point Cemetery, at the U.S. Military Academy in West Point, New York, at 10 a.m. on Nov. 20. Donations in his name can be made to the Boys and Girls Club of America.
To honor Mead’s memory, the flags on MIT’s Killian Court will fly at half-staff Nov. 13-16.
One hundred years ago on Nov. 11, 1918, the Allied Powers and Germany signed an armistice bringing to an end World War I. That bloody conflict decimated Europe and destroyed three major empires (Austrian, Russian, and Ottoman). Its aftershocks still echo in our own times.
As this day of remembrance approaches — commemorated throughout Europe as Armistice Day, and in the U.S. as Veterans Day — it is a reminder of Machiavelli’s tenet that “whoever wishes to foresee the future must consult the past.” Stephen Van Evera, the Ford International Professor of Political Science and an expert on the causes of war, revisits the Great War and discusses key insights for today, a full century after its bitter end.
Q: Who caused the war? Do historians agree or not? Where does the debate stand?
A: My answer is: The Germans caused the war. They wanted a general European war in 1914 and deliberately brought it about. Their deed was the crime of the century. But others disagree. A hundred years later scholars still dispute which state was most responsible. Views have evolved a lot, but there is no consensus.
From 1919 to 1945, most German historians blamed Russia, or Britain, or France, while deeming Germany largely innocent. Historians outside Germany generally viewed the war as an accident, for which all the European powers deserved blame. Few put primary responsibility on Germany.
Then in 1961 and 1969 German historian Fritz Fischer published books that put greatest blame on Germany. His books stirred one of the most intense historical debates we've ever seen. The firestorm was covered in the German popular press, debated at public forums attended by thousands, and discussed in the German parliament, as though the soul of Germany was at stake — which in a way it was.
Fischer and most Fischer followers argued that Germany instigated the 1914 July crisis in order to ignite a local Balkan war that would improve Germany’s power position in Europe. German leaders did not want a general European war, but they deliberately risked such a war, and lost control of events. Some Fischerites went further, arguing that Germany instigated the 1914 July crisis in order to cause a general European war, which they wanted for “preventive” reasons — they hoped to cut Russian power down to size before Russia’s military power outgrew German power — and to position Germany to seize a wider empire in Europe and Africa. Both Fischer variants assign Germany prime responsibility.
Within Germany the Fischer view holds sway today. Germans broadly take responsibility for the war. But several recent works by non-Germans reject the Fischer view, assigning Germany less responsibility than Fischer while blaming others. So the Fischer school's views predominate in Germany but elsewhere the debate continues.
Q: Why is it important for scholars to assign responsibility for World War I, or for other wars?
A: When responsibility for past war is left unassigned, chauvinist mythmakers on one or both sides will over-blame the other for causing the war while whitewashing their own responsibility. Both sides will then be angered when the other refuses to admit responsibility and apologize for violence they believe the other caused, and be further angered that the other has the gall to blame them for this violence. They may also infer that the other may resort to violence again, as its non-apology shows that it sees nothing wrong with its past violence.
The German government infused German society with self-whitewashing, other-maligning myths of this kind about World War I origins during the interwar years. These myths played a key role in fueling Hitler’s rise to power in Germany in 1933. They were devised and spread by the Kriegsschuldreferat (War Guilt Office), a secret unit in the German foreign ministry. The Kriegsschuldreferat sponsored twisted accounts of the war’s origins by nationalist German historians, underwrote mass propaganda on the war’s origins, selectively edited document collections, and worked to corrupt historical understanding abroad by exporting this propaganda to Britain, France, and the U.S. This innocence propaganda persuaded the German public that Germany had little or no responsibility for causing the war. Germans were taught instead that Britain instigated the war; then outrageously blamed Germany for the war in the Versailles treaty’s War Guilt clause; and then forced Germany to pay reparations for a war Britain itself began.
An enraging narrative for Germans who believed it. And many Germans did. Hitler’s rise to power was fueled in part by the wave of German public fear and fury that this false narrative fostered. Hitler told Germans that Germany’s neighbors had attacked Germany in 1914 without reason, and then falsely denied their crime while falsely blaming Germany. States so malicious could well attack Germany again. Germany therefore had to recover its power and strike its neighbors before they struck Germany.
After 1945 international politics in Western Europe was miraculously transformed. War became unthinkable in a region where rivers of blood had flowed for centuries. This political transformation stemmed in important part from a transformation in the teaching of international history in European schools and universities. The international history of Europe was commonized. Europeans everywhere now learned largely the same history instead of imbibing their own national myths. An important cause of war, chauvinist nationalist mythmaking, was erased. Greatest credit for this achievement goes to truth-telling German historians — including the Fischer school — and schoolteachers who documented German responsibility for World War I, World War II, and the Holocaust and explained it to the German people. By enabling a rough consensus among former belligerents on who was responsible for past violence, these historians and schoolteachers played a large role in healing the wounds of the world wars and making another round of war impossible.
Nationalist/chauvinist historical mythmaking declined worldwide after World War II but it never disappeared. It still infects many places. If, like the Germans, the people of these still-infected places faced their past truthfully they would downsize their sense of victimhood to better fit the facts. Their sense of grievance and entitlement would diminish accordingly. They would be quicker to see the justice in others' claims and to grant what others deserve. Peace with their neighbors would be easier to reach and sustain. War would be easier to avoid.
Q: What consequences — past and present — arose from the impact of the Great War?
A: Like a boulder that triggers a landslide as it tumbles downhill, World War I unleashed forces that later caused even greater violence.
Without World War I there would have been no Hitler, as he rose to power on trumped-up grievances that stemmed from World War I. Hence without World War I, there would have been no World War II. There also would have been no Holocaust, as the Holocaust was a particular project of the Nazi elite that other German elites would not have pursued had they ruled instead of Hitler.
Without World War I there would have been no Russian revolution; hence no Leninism or Stalinism; hence no vast massacres by Stalin — approximately 30 million murdered — and no Cold War between the Soviet Union and the West from 1947 to 1989; hence no peripheral wars in Korea, Indochina, Afghanistan, Angola, Nicaragua, El Salvador, and Cambodia, killing millions.
The moral of the story is: War can be self-feeding, self-perpetuating, and self-expanding. It has fire-like properties that cause it to continue once it begins. It is hard to extinguish because, like fire, it sustains itself by generating its own heat. In this case the “heat” is mutual fear and mutual hatred born of wartime violence, and war-generated combat political ideologies, like Bolshevism, Nazism, and extremist Sunni jihadism, that see human affairs as a Darwinistic struggle that compels groups to destroy others or be destroyed themselves.
The 2018 MIT Materials Research Laboratory (MRL) Materials Day Symposium, highlighting advances in materials science and engineering, took place in Kresge Auditorium on Oct. 10.
Among the latest advances shared: Powerful new combinations of X-rays, electrical probes, and analytical computing that are yielding insights into problems as diverse as fatigue in steel and stability in solar cells.
“Fatigue in steel is a major issue; you don’t see any changes in the shape of your material, and suddenly it fails,” MIT Assistant Professor C. Cem Taşan said at the event. “We are putting a lot of effort in maintenance and safety, yet still we have devastating accidents,” he said, recalling the airline incident in April 2018 when a jet engine turbine blade broke apart and shrapnel from the engine broke a plane window, fatally injuring a passenger.
“The airline company basically said that component passed all the maintenance requirements. So it was checked, and they couldn’t see any kind of fatigue cracks in it,” Taşan, the Thomas B. King Career Development Professor of Metallurgy, explained. Taşan is developing new steel and other metal alloys that are safer, stronger and lighter than those currently available.
Failure in metals is a complex mix of cracks and other changes in the microstructure caused by temperature, bending, stretching, compression, and other forces; most metals can withstand only so much of this before a cascade of subtle changes ultimately results in failure.
Design for repair
Taşan outlined progress on a vanadium-based alloy that changes back to its original state when stress is taken away, and a new type of steel that can be transformed back to its original state when heat is applied. Stress tests to measure fatigue in Taşan’s new steel showed improvement over other steels.
Underlying these findings are new nanoscale experimental techniques that Taşan employs to identify the multiple causes of failure in metal alloys. Taşan combines energy-dispersive X-ray spectroscopy and scanning electron and transmission electron microscopes to capture data on tension, bending, compression or nanoindentation of materials. These types of microscopic measurements are known as in situ techniques.
Another technique studies how a metal alloy absorbs hydrogen and how that absorption affects the metal. For example, Taşan played movies showing how plastic strain is accommodated by the two phases in a high-entropy alloy.
“These techniques allow us to see how the failure process is taking place, and we use these techniques to understand the mechanism of these failure modes and potentially repair mechanisms. Finally, we use this understanding to design new alloys that utilize these mechanisms,” Taşan said. “You are trying to design a mechanism that can be used by the material over and over and over again to deal with the same type of crack that it is facing.”
Taşan’s investigations revealed three different types of crack closure mechanisms in steel: plasticity, phase transformation and crack-surface roughness. “If I want to activate all of these crack closure mechanisms, what I need to do is design a microstructure that is metastable, nanolaminated, and multi-phase at the same time,” he said. He said the new steel alloy successfully combines all three characteristics.
Materials Research Laboratory Director Carl V. Thompson noted that how a material is made determines its structure and its properties. These properties include mechanical, electrical, optical, magnetic and many other properties. Materials science and engineering encompasses an entire cycle from designing methods for making materials through analyzing their structure and properties, to evaluating how they perform. “Ultimately most people go through this process to make materials that perform in either a new way or in a better way for systems like automobiles, your cell phone, or medical equipment,” Thompson said.
Engineering perovskite solar cells
Silvija Gradečak, professor in the Department of Materials Science and Engineering, addressed the promise and the problems of perovskite solar cells. Hybrid organic-inorganic perovskites, such as methyl ammonium lead iodide, are a class of materials that are named after their crystal structure. “They are potentially lightweight, flexible and inexpensive as photovoltaic devices,” Gradečak said.
However, perovskite solar devices tend to be unstable when exposed to water, oxygen, UV irradiation, and voltage biasing. Because many of these changes are dynamic and happen at the nanoscale, structural information about these materials must be complemented with information from electrical currents. “By using the electron beam, we can mimic the condition of the electron current within the device,” she said.
Gradečak uses a technique called cathodoluminescence to probe these perovskite materials. “Our cathodoluminescence setup is unique because it enables so-called hyperspectral imaging. It means that the full optical signal is detected in each point of the complementary structural image. As the beam interacts with the sample, we are detecting light, and we do this as the electron beam moves across the sample. That is specifically important for samples that are unstable as they are irradiated with the electron beam,” she says.
This technique revealed that perovskite material examined under an electron microscope while applying a voltage to the sample for one minute resulted in a dramatic current increase in the material. “That also corresponds to the I/V [current/voltage] measurements outside of the scanning electron microscope that we performed,” she said. When the voltage bias is removed, the sample relaxes back to its initial state.
“What we think is really happening is that by biasing, there are ions that are moving and they agglomerate at the edges of the sample or at the grain boundaries, and after you remove the bias, they will relax back,” Gradečak said.
Work in Gradečak’s group by Olivia Hentz PhD ’18 combined photoluminescence data with Monte Carlo simulations to extract mobility of the defects that are moving. “More interesting, and how we can apply this method, is to understand how the material’s properties are influenced by synthesis. If you synthesize the material and you change, for example, the grain size, we can think about whether these ions that are moving will have different mobilities inside of the grain versus along the grain boundaries,” Gradečak said.
Hentz found that the mobility at the grain boundaries is 1,500 times faster than in the bulk. “The ions do move in the material, they move under the biasing conditions and that mobility is very different inside of the grain and along the grain boundaries,” Gradečak said. “By engineering the material and engineering the grain size, one can influence by how much the material will be influenced during the device operation. And this result correlates with the fact that single crystalline perovskite materials are significantly more stable than polycrystalline ones.”
Transformative new tools
In the keynote address, BP Amoco Chemical Company Senior Research Chemist Matthew Kulzick detailed new X-ray technologies and sample chambers that are yielding insights into fighting metal corrosion, improving catalytic reactions and more. “The current evolution of tools is spectacular,” he said, noting the stunning images at 20-nanometer scale showing highly localized composition of materials.
MIT Nuclear Reactor Lab Director David E. Moncton discussed advances in X-ray tubes, noting that current versions of small scale X-ray tubes are about 100 times better than those of 100 years ago. X-ray source brilliance is increasing at two times Moore’s Law, which predicted the exponential growth of transistors in silicon chips, he noted.
Still, synchrotron sources such as the Advanced Photon Source, a national user facility at Argonne National Laboratory, offer beam brilliance that is 12 orders of magnitude higher than that of X-ray tubes. “Advanced X-ray capability is the most important missing probe of matter at nano centers and materials research labs that are not located at synchrotron facilities,” he said.
Compact X-ray free-electron laser devices hold the promise of bringing synchrotron-like examination capabilities to campus research labs, Moncton said. Moncton, who was the founding director of the Advanced Photon Source, is collaborating with Associate Professor William S. Graves at Arizona State University, which is home to the world’s first compact X-ray free-electron laser (CXFEL).
“The emittance is very similar to a synchrotron source,” Moncton said. “If you built a compact X-ray FEL on this compact source platform, it would outperform today’s synchrotron facilities by a number of orders of magnitude.”
X-ray phase contrast imaging has also advanced microscopy, Moncton said, displaying an image showing air bubbles in the lungs of a fruit fly. Pump-probe techniques enable studies of biological proteins performing bio-chemical processes in real time.
“Having a local synchrotron-like source would be revolutionary,” Moncton said.
Less damaging microscope
MIT professor of electrical engineering Karl Berggren described his efforts to develop a new type of electron microscope based on the quantum character of electrons to improve microscopy. One of the goals is to reduce radiation damage to biological samples from imaging them.
With support from the Gordon and Betty Moore Foundation, Berggren is collaborating on this research with professor of physics Mark Kasevich at Stanford University, professor of physics Peter Hommelhoff at the Friedrich Alexander University, Erlangen-Nürnberg, in Germany, and professor of physics Pieter Kruit at the Technical University of Delft in the Netherlands. “What we’d like to do is basically try to take advantage of the counter-intuitive quantum properties of electrons,” Berggren said.
In one approach, he employs a series of electron beam splitters and mirrors to improve the performance of scanning electron microscopes. “What we’re doing now is essentially making a test bed by which we can develop all the electron optics to try to put together a machine,” Berggren said. Along the way, his group has developed a microscope that lets you image the top and bottom of a sample at the same time.
“We know that electrons at high voltage will pass through many samples while interacting with just a small phase shift,” he said. “In fact, we want to work in that limit for imaging biomolecules.” The right combination of beam splitters could reduce electron-induced damage to the sample by 100 times, he said.
Frances M. Ross, formerly of the Research Division at the IBM T. J. Watson Research Center and a new arrival at the Department of Materials Science and Engineering this academic year, described her observations of nanowire growth in an electron microscope. This vapor-liquid-solid process was first described in 1964, but the atomic-level details of how the nanowires grow could not be observed until recent improvements in electron microscopy technique.
Showing a movie of a silicon nanowire growing from a gold-silicon catalyst droplet, Ross said, “To grow these silicon nanowires, we just put gold on silicon and heat it up. The gold and silicon automatically form droplets, in the same way that water forms droplets on a sheet of glass.” When additional silicon is then supplied, the droplets act as a catalyst and a silicon nanowire grows from each droplet. “Nanowire growth illustrates the fact that we can get a self-assembly process that is intrinsically very simple to form a structure that can be quite complex,” Ross explained. “You can see features like the atomic level structure of the nanowire and catalyst, the effect of temperature and gas environment, and even the dynamics of the growth interface and how the catalyst really works.” The silicon nanowire grows in little jumps despite a steady flow of source material, she noted, providing detailed information on the pathways by which the atoms assemble into the nanowire.
Adding nickel to this process resulted in a nickel disilicide particle embedded in the silicon nanowire — a quantum dot. “You almost expect to see unexpected things because the movies capture every point along the way as the material evolves,” Ross said. “In situ microscopy is really the only way to get these types of detailed relations between the structure, the properties, and even the catalytic activity of individual nanoscale objects.”
“We’re in a very exciting time for electron microscopy, where advances in instrumentation are helping us understand materials growth at the atomic scale,” Ross said.
Uncovering crystal structure
James LeBeau, visiting professor of materials science and engineering, explained that scanning transmission electron microscopy provides direct imaging of atomic structure using an extremely small (less than 1×10⁻¹⁰ m) electron probe. LeBeau uses the scanning transmission electron microscope to develop and apply new ways to characterize atomic structure of materials to understand their properties. Further, he is applying machine learning to control the microscope, using an approach similar to that used to enable self-driving cars to recognize signs and lane lines.
Beyond imaging, “we can also acquire a full chemical spectrum at every single point in our dataset. This allows us to not only directly determine which atoms are in the material, but their bonding configuration as well,” LeBeau explained. He displayed an image showing lanthanum atoms sharing a sub-lattice with strontium and aluminum sharing a sub-lattice with tantalum. “These datasets become directly interpretable. You see the chemistry,” he said.
“We can even use this data to measure the atomic scale electric field,” LeBeau added, showing an image in which the color represents the electrostatic field vector and the intensity of the color represents its magnitude. LeBeau also was able to use these techniques to uncover the particular crystal structure of ferroelectric hafnium dioxide. The atomic scale insights are critical as hafnium dioxide is compatible with silicon processing technology, which will pave the way for new memory applications. “By combining different types of data, we can explain the origin of ferroelectricity in these films and really rule out alternative explanations,” he said.
Twenty graduate students and postdocs gave two-minute previews during the Materials Day Symposium, which was immediately followed by a poster session. In all, 60 presented research posters in La Sala de Puerto Rico. The winning presenters were graduate students Vera Schroeder, Rachel C. Kurchin, Gerald J. Wang and Philipp Simons, and postdoc Mikhail Y. Shalaginov.
Metal-air batteries are one of the lightest and most compact types of batteries available, but they can have a major limitation: When not in use, they degrade quickly, as corrosion eats away at their metal electrodes. Now, MIT researchers have found a way to substantially reduce that corrosion, making it possible for such batteries to have much longer shelf lives.
While typical rechargeable lithium-ion batteries only lose about 5 percent of their charge after a month of storage, they are too costly, bulky, or heavy for many applications. Primary (nonrechargeable) aluminum-air batteries are much less expensive and more compact and lightweight, but they can lose 80 percent of their charge a month.
The MIT design overcomes the problem of corrosion in aluminum-air batteries by introducing an oil barrier between the aluminum electrode and the electrolyte — the fluid between the two battery electrodes that eats away at the aluminum when the battery is on standby. The oil is rapidly pumped away and replaced with electrolyte as soon as the battery is used. As a result, the energy loss is cut to just 0.02 percent a month — more than a thousandfold improvement.
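The “more than a thousandfold” figure follows directly from the monthly loss rates quoted above. A minimal arithmetic check, using only the numbers reported in the article:

```python
# Illustrative check of the self-discharge figures quoted above:
# ~80 percent charge lost per month for a conventional aluminum-air
# battery versus 0.02 percent per month for the MIT design.

def improvement_factor(old_loss_pct, new_loss_pct):
    """Ratio of the two monthly energy-loss rates."""
    return old_loss_pct / new_loss_pct

conventional_al_air = 80.0   # percent charge lost per month
mit_design = 0.02            # percent charge lost per month

factor = improvement_factor(conventional_al_air, mit_design)
print(f"Improvement: {factor:.0f}x")  # 4000x — "more than a thousandfold"
```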
The findings are reported today in the journal Science by former MIT graduate student Brandon J. Hopkins ’18, W.M. Keck Professor of Energy Yang Shao-Horn, and professor of mechanical engineering Douglas P. Hart.
While several other methods have been used to extend the shelf life of metal-air batteries (which can use other metals such as sodium, lithium, magnesium, zinc, or iron), these methods can sacrifice performance, Hopkins says. Most of the other approaches involve replacing the electrolyte with a different, less corrosive chemical formulation, but these alternatives drastically reduce the battery power.
Other methods involve pumping the liquid electrolyte out during storage and back in before use. These methods still enable significant corrosion and can clog plumbing systems in the battery pack. Because aluminum is hydrophilic (water-attracting), even after electrolyte is drained out of the pack, the remaining electrolyte will cling to the aluminum electrode surfaces. “The batteries have complex structures, so there are many corners for electrolyte to get caught in,” which results in continued corrosion, Hopkins explains.
To demonstrate the ability of aluminum to repel oil underwater, the researchers plunged a sample of aluminum into a beaker containing a layer of oil floating on water. When the sample entered the water layer, all the oil that clung to the surface on the way down quickly fell away, demonstrating aluminum’s underwater oleophobicity.
A key to the new system is a thin membrane placed between the battery electrodes. When the battery is in use, both sides of the membrane are filled with a liquid electrolyte, but when the battery is put on standby, oil is pumped into the side closest to the aluminum electrode, which protects the aluminum surface from the electrolyte on the other side of the membrane.
The new battery system also takes advantage of a property of aluminum called “underwater oleophobicity” — that is, when aluminum is immersed in water, it repels oil from its surface. As a result, when the battery is reactivated and electrolyte is pumped back in, the electrolyte easily displaces the oil from the aluminum surface, which restores the power capabilities of the battery. Ironically, the MIT method of corrosion suppression exploits the same property of aluminum that promotes corrosion in conventional systems.
The result is an aluminum-air prototype with a much longer shelf life than that of conventional aluminum-air batteries. The researchers showed that when the battery was repeatedly used and then put on standby for one to two days, the MIT design lasted 24 days, while the conventional design lasted for only three. Even when oil and a pumping system are included in scaled-up primary aluminum-air battery packs, they are still five times lighter and twice as compact as rechargeable lithium-ion battery packs for electric vehicles, the researchers report.
Hart explains that aluminum, besides being very inexpensive, is one of the “highest chemical energy-density storage materials we know of” — that is, it is able to store and deliver more energy per pound than almost anything else, with only bromines, which are expensive and hazardous, being comparable. He says many experts think aluminum-air batteries may be the only viable replacement for lithium-ion batteries and for gasoline in cars.
Aluminum-air batteries have been used as range extenders for electric vehicles to supplement built-in rechargeable batteries, to add many extra miles of driving when the built-in battery runs out. They are also sometimes used as power sources in remote locations or for some underwater vehicles. But while such batteries can be stored for long periods as long as they are unused, as soon as they are turned on for the first time, they start to degrade rapidly.
Such applications could greatly benefit from this new system, Hart explains, because with the existing versions, “you can’t really shut it off. You can flush it and delay the process, but you can’t really shut it off.” However, if the new system were used, for example, as a range extender in a car, “you could use it and then pull into your driveway and park it for a month, and then come back and still expect it to have a usable battery. … I really think this is a game-changer in terms of the use of these batteries.”
With the greater shelf life that could be afforded by this new system, the use of aluminum-air batteries could “extend beyond current niche applications,” says Hopkins. The team has already filed for patents on the process.
“The technique introduced here is elegant in that it uses fundamental surface physics to estimate required oil and membrane properties, and the results demonstrate predicted performance,” says Robert Savinell, a professor of engineering at Case Western Reserve University in Ohio, who was not involved in this research. “This work may indeed mitigate the need for costly high-purity metals and alloys for primary metal-air batteries, and might reduce the complexities of electrolyte additives.”
Savinell adds, “The ability to efficiently extract usable energy from high energy-density aluminum-air batteries, especially under intermittent use conditions, will facilitate the development and improvement of technologies requiring very high energy-densities for extended operation.”
The research was supported by MIT Lincoln Laboratory.
To battle the summer heat, office and residential buildings tend to crank up the air conditioning, sending energy bills soaring. Indeed, it’s estimated that air conditioners use about 6 percent of all the electricity produced in the United States, at an annual cost of $29 billion — an expense that’s sure to grow as the global thermostat climbs.
Now MIT engineers have developed a heat-rejecting film that could be applied to a building’s windows to reflect up to 70 percent of the sun’s incoming heat. The film is able to remain highly transparent below 32 degrees Celsius, or 89 degrees Fahrenheit. Above this temperature, the researchers say, the film acts as an “autonomous system” to reject heat. They estimate that if every exterior-facing window in a building were covered in this film, the building's air conditioning and energy costs could drop by 10 percent.
The film is similar to transparent plastic wrap, and its heat-rejecting properties come from tiny microparticles embedded within it. These microparticles are made from a type of phase-changing material that shrinks when exposed to temperatures of 85 degrees Fahrenheit or higher. In their more compact configurations, the microparticles give the normally transparent film a more translucent or frosted look.
Applied to windows in the summer, the film could passively cool a building while still letting in a good amount of light. Nicholas Fang, a professor of mechanical engineering at MIT, says the material provides an affordable and energy-efficient alternative to existing smart window technologies.
“Smart windows on the market currently are either not very efficient in rejecting heat from the sun, or, like some electrochromic windows, they may need more power to drive them, so you would be paying to basically turn windows opaque,” Fang says. “We thought there might be room for new optical materials and coatings, to provide better smart window options.”
Fang and his colleagues, including researchers from the University of Hong Kong, have published their results in the journal Joule.
“A fishnet in water”
Just over a year ago, Fang began collaborating with researchers at the University of Hong Kong, who were keen on finding ways to reduce the energy usage of buildings in the city, particularly in the summer months, when the region grows notoriously hot and air-conditioning usage is at its peak.
“Meeting this challenge is critical for a metropolitan area like Hong Kong, where they are under a strict deadline for energy savings,” says Fang, referring to Hong Kong’s commitment to reduce its energy use by 40 percent by the year 2025.
After some quick calculations, Fang’s students found that a significant portion of a building’s heat comes through windows, in the form of sunlight.
“It turns out that for every square meter, about 500 watts of energy in the form of heat are brought in by sunlight through a window,” Fang says. “That’s equivalent to about five light bulbs.”
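The arithmetic behind Fang’s comparison can be sketched in a few lines; the 100 W bulb rating is an assumption for illustration, since the article does not specify the bulb wattage:

```python
# Sketch of the window heat-gain figure quoted above: ~500 W of solar
# heat per square meter of window, "equivalent to about five light bulbs."

SOLAR_HEAT_W_PER_M2 = 500   # figure from the article
BULB_W = 100                # assumed incandescent bulb rating (not in the article)

def heat_gain(window_area_m2):
    """Solar heat (W) entering through a window of the given area."""
    return SOLAR_HEAT_W_PER_M2 * window_area_m2

watts = heat_gain(1.0)
print(f"{watts:.0f} W per square meter, about {watts / BULB_W:.0f} light bulbs")
```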
Fang, whose group studies the light-scattering properties of exotic, phase-changing materials, wondered whether such optical materials could be fashioned for windows, to passively reflect a significant portion of a building’s incoming heat.
The researchers looked through the literature for “thermochromic” materials — temperature-sensitive materials that temporarily change phase, or color, in response to heat. They eventually landed on a material made from poly(N-isopropylacrylamide)-2-aminoethylmethacrylate hydrochloride microparticles. These microparticles resemble tiny, transparent, fiber-webbed spheres and are filled with water. At temperatures of 85 F or higher, the spheres essentially squeeze out all their water and shrink into tight bundles of fibers that reflect light in a different way, turning the material translucent.
“It’s like a fishnet in water,” Fang says. “Each of those fibers making the net, by themselves, reflects a certain amount of light. But because there’s a lot of water embedded in the fishnet, each fiber is harder to see. But once you squeeze the water out, the fibers become visible.”
In previous experiments, other groups had found that while the shrunken particles could reject light relatively well, they were less successful in shielding against heat. Fang and his colleagues realized that this limitation came down to the particle size: The particles used previously shrank to a diameter of about 100 nanometers — smaller than the wavelength of infrared light — making it easy for heat to pass right through.
Instead, Fang and his colleagues expanded the molecular chain of each microparticle, so that when it shrank in response to heat, the particle’s diameter was about 500 nanometers, which Fang says is “more compatible to the infrared spectrum of solar light.”
A comfort distinction
The researchers created a solution of the heat-shielding microparticles, which they applied between two sheets of 12-by-12-inch glass to create a film-coated window. They shone light from a solar simulator onto the window to mimic incoming sunlight, and found that the film turned frosty in response to the heat. When they measured the solar irradiance transmitted through the other side of the window, the researchers found the film was able to reject 70 percent of the heat produced by the lamp.
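The rejection fraction in such a measurement is simply one minus the transmitted share of the incident heat. A small sketch of that calculation — the incident and transmitted values below are hypothetical numbers chosen to be consistent with the reported 70 percent result, not measurements from the paper:

```python
# Sketch of the rejection figure quoted above. The fraction of incident
# solar heat blocked by the film is one minus the transmitted fraction.

def heat_rejection(incident_w, transmitted_w):
    """Fraction of incident heat rejected by the film."""
    return 1.0 - transmitted_w / incident_w

# Hypothetical values consistent with the reported 70 percent rejection:
incident = 500.0      # W/m^2 from the solar simulator (assumed)
transmitted = 150.0   # W/m^2 measured behind the film (assumed)

print(f"Rejected: {heat_rejection(incident, transmitted):.0%}")
```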
The team also lined a small calorimetric chamber with the heat-rejecting film and measured the temperature inside the chamber as they shone light from a solar simulator through the film. Without the film, the temperature inside rose to about 102 F — “about the temperature of a high fever,” Fang notes. With the film, the inner chamber stayed at a more tolerable 93 F.
“That’s a big difference,” Fang says. “You could make a big distinction in comfort.”
“Windows have been one major bottleneck of building efficiency,” says Xiaobo Yin, an associate professor of mechanical engineering at the University of Colorado at Boulder. “Smart windows that regulate solar energy intake can potentially be a game changer. One great advantage of this work is the materials used, which substantially improve applicability and manufacturability of smart windows.”
Going forward, the team plans to conduct more tests of the film to see whether tweaking its formula and applying it in other ways might improve its heat-shielding properties.
This research was funded, in part, by the HKUST-MIT Consortium.
Schizophrenia, a brain disorder that produces hallucinations, delusions, and cognitive impairments, usually strikes during adolescence or young adulthood. While some signs can suggest that a person is at high risk for developing the disorder, there is no way to definitively diagnose it until the first psychotic episode occurs.
MIT neuroscientists working with researchers at Beth Israel Deaconess Medical Center, Brigham and Women’s Hospital, and the Shanghai Mental Health Center have now identified a pattern of brain activity correlated with development of schizophrenia, which they say could be used as a marker to diagnose the disease earlier.
“You can consider this pattern to be a risk factor. If we use these types of brain measurements, then maybe we can predict a little bit better who will end up developing psychosis, and that may also help tailor interventions,” says Guusje Collin, a visiting scientist at MIT’s McGovern Institute for Brain Research and the lead author of the paper.
The study, which appears in the journal Molecular Psychiatry on Nov. 8, was performed at the Shanghai Mental Health Center. Susan Whitfield-Gabrieli, a visiting scientist at the McGovern Institute and a professor of psychology at Northeastern University, is one of the principal investigators for the study, along with Jijun Wang of the Shanghai Mental Health Center, William Stone of Beth Israel Deaconess Medical Center, the late Larry Seidman of Beth Israel Deaconess Medical Center, and Martha Shenton of Brigham and Women’s Hospital.
Before they experience a psychotic episode, characterized by sudden changes in behavior and a loss of touch with reality, patients can experience milder symptoms such as disordered thinking. This kind of thinking can lead to behaviors such as jumping from topic to topic at random, or giving answers unrelated to the original question. Previous studies have shown that about 25 percent of people who experience these early symptoms go on to develop schizophrenia.
The research team performed the study at the Shanghai Mental Health Center because the huge volume of patients who visit the hospital annually gave them a large enough sample of people at high risk of developing schizophrenia.
The researchers followed 158 people between the ages of 13 and 34 who were identified as high-risk because they had experienced early symptoms. The team also included 93 control subjects, who did not have any risk factors. At the beginning of the study, the researchers used functional magnetic resonance imaging (fMRI) to measure a type of brain activity involving “resting state networks.” Resting state networks consist of brain regions that preferentially connect with and communicate with each other when the brain is not performing any particular cognitive task.
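Resting-state connectivity of this kind is commonly quantified as the correlation between the fMRI time series of pairs of brain regions. The sketch below illustrates that general idea on synthetic data; it is a generic illustration, not the specific analysis pipeline used in the study:

```python
import numpy as np

# Synthetic time series for 4 "regions" over 200 time points. Two regions
# are constructed to covary, mimicking regions that belong to the same
# resting-state network.
rng = np.random.default_rng(0)
n_timepoints = 200
shared = rng.standard_normal(n_timepoints)

region_a = shared + 0.3 * rng.standard_normal(n_timepoints)  # networked
region_b = shared + 0.3 * rng.standard_normal(n_timepoints)  # networked
region_c = rng.standard_normal(n_timepoints)                 # independent
region_d = rng.standard_normal(n_timepoints)                 # independent

timeseries = np.vstack([region_a, region_b, region_c, region_d])

# Connectivity matrix: entry (i, j) is the Pearson correlation between
# the time series of regions i and j.
connectivity = np.corrcoef(timeseries)

# Regions a and b show much stronger connectivity than a and c.
print(connectivity[0, 1], connectivity[0, 2])
```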
“We were interested in looking at the intrinsic functional architecture of the brain to see if we could detect early aberrant brain connectivity or networks in individuals who are in the clinically high-risk phase of the disorder,” Whitfield-Gabrieli says.
One year after the initial scans, 23 of the high-risk patients had experienced a psychotic episode and were diagnosed with schizophrenia. In those patients’ scans, taken before their diagnosis, the researchers found a distinctive pattern of activity that was different from the healthy control subjects and the at-risk subjects who had not developed psychosis.
For example, in most people, a part of the brain known as the superior temporal gyrus, which is involved in auditory processing, is highly connected to brain regions involved in sensory perception and motor control. However, in patients who developed psychosis, the superior temporal gyrus became more connected to limbic regions, which are involved in processing emotions. This could help explain why patients with schizophrenia usually experience auditory hallucinations, the researchers say.
Meanwhile, the high-risk subjects who did not develop psychosis showed network connectivity nearly identical to that of the healthy subjects.
This type of distinctive brain activity could be useful as an early indicator of schizophrenia, especially since it is possible that it could be seen in even younger patients. The researchers are now performing similar studies with younger at-risk populations, including children with a family history of schizophrenia.
“That really gets at the heart of how we can translate this clinically, because we can get in earlier and earlier to identify aberrant networks in the hopes that we can do earlier interventions, and possibly even prevent psychiatric disorders,” Whitfield-Gabrieli says.
She and her colleagues are now testing early interventions that could help to combat the symptoms of schizophrenia, including cognitive behavioral therapy and neural feedback. The neural feedback approach involves training patients to use mindfulness meditation to reduce activity in the superior temporal gyrus, which tends to increase before and during auditory hallucinations.
The researchers also plan to continue following the patients in the current study, and they are now analyzing some additional data on the white matter connections in the brains of these patients, to see if those connections might yield additional differences that could also serve as early indicators of disease.
The research was funded by the National Institutes of Health and the Ministry of Science and Technology of China. Collin was supported by a Marie Curie Global Fellowship grant from the European Commission.
Imagine a herd of deer grazing in the forest. Suddenly, a twig snaps nearby, and they look up from the grass. The thought of food is forgotten, and the animals are primed to respond to any threat that might appear.
MIT neuroscientists have now discovered a circuit that they believe controls the diversion of attention away from everyday pursuits to focus on potential threats. They also found that dopamine is key to the process: It is released in the brain’s prefrontal cortex when danger is perceived, stimulating the prefrontal cortex to redirect its focus to a part of the brain that responds to threats.
“The prefrontal cortex has long been thought to be important for attention and higher cognitive functions — planning, prioritizing, decision-making. It’s as though dopamine is the signal that tells the router to switch over to sending information down the pathway for escape-related behavior,” says Kay Tye, an MIT associate professor of brain and cognitive sciences and a member of MIT’s Picower Institute for Learning and Memory.
When this circuit is off-balance, it could trigger anxious and paranoid behavior, possibly underlying some of the symptoms seen in schizophrenia, anxiety, and depression, Tye says.
Tye is the senior author of the study, which appears in the Nov. 7 issue of Nature. The lead authors are former graduate student Caitlin Vander Weele, postdoc Cody Siciliano, and research scientist Gillian Matthews.
One major role of the prefrontal cortex, which is the seat of conscious thought and other complex cognitive behavior, is to route information to different parts of the brain.
In this study, Tye identified two populations of neurons in the prefrontal cortex, based on the brain regions they communicate with. One set of neurons sends information to the nucleus accumbens, which is involved in motivation and reward, and the other group relays information to the periaqueductal gray (PAG), which is part of the brainstem. The PAG is involved in defensive behaviors such as freezing or running.
When we perceive a potentially dangerous event, a brain region called the ventral tegmental area (VTA) sends dopamine to the prefrontal cortex, and Tye and her colleagues wanted to determine how dopamine affects the two populations they had identified. To do so, they designed an experiment in which rats were trained to recognize two visual cues, one associated with sugar water and one with a mild electrical shock. Then, they explored what happened when both cues were presented at the same time.
They found that if they stimulated dopamine release at the same time that the cues were given, the rats were much more likely to freeze (their normal response to the shock cue) than to head for the port where they would receive the sugar water. If they stimulated dopamine when just one of the cues was given, the rats’ behavior was not affected, suggesting that dopamine’s role is to enhance the escape response when the animals receive conflicting information.
“The reward-associated neurons drop their spiking by a substantial amount, making it harder for you to pay attention to a reward,” Tye says.
Further experiments suggested that dopamine acts by adjusting the signal-to-noise ratio in neurons of the prefrontal cortex. “Noise” is random firing of neurons, while the “signal” is the meaningful input coming in, such as sensory information. When neurons that connect to the PAG receive dopamine at the same time as a threatening stimulus, their signal goes up and the noise decreases. The researchers aren’t sure how this happens, but they suspect that dopamine may activate other neurons that help to amplify the signals already coming into the PAG-connected neurons, and suppress the activity of neurons that project to the nucleus accumbens.
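The signal-to-noise idea can be made concrete with a toy calculation (purely illustrative; the firing rates and the `snr` helper below are hypothetical and not drawn from the study): dopamine is modeled as boosting a PAG-projecting neuron’s evoked response to a threat cue while quieting fluctuations in its background firing.

```python
import statistics

def snr(evoked, baseline):
    """Signal-to-noise ratio: mean evoked response divided by
    the standard deviation of baseline (background) firing."""
    return statistics.mean(evoked) / statistics.stdev(baseline)

# Hypothetical firing rates (spikes/s) for a PAG-projecting neuron.
baseline = [4.0, 5.0, 6.0, 5.0, 4.0]   # background "noise"
evoked = [10.0, 12.0, 11.0]            # response to a threat cue

# Toy model of dopamine's proposed effect: amplify the evoked
# signal and suppress variability in the background firing.
dopamine_evoked = [r * 1.5 for r in evoked]
dopamine_baseline = [4.8, 5.0, 5.2, 5.0, 4.9]  # less variable

print(snr(evoked, baseline))                    # lower SNR without dopamine
print(snr(dopamine_evoked, dopamine_baseline))  # higher SNR with dopamine
```

In this sketch the same mean shift in firing becomes far easier to distinguish once the background variability drops, which is the sense in which dopamine is proposed to sharpen the threat signal.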
Adapted for survival
This brain circuit could help give animals a better chance of surviving a threatening situation, Tye says. Any kind of danger sign, such as the snapping twig that startles a herd of deer, or a stranger roughly bumping into you on the sidewalk, can produce a surge of dopamine in the prefrontal cortex. This dopamine then promotes enhanced vigilance.
“You would be on the defensive,” Tye says. “There may be some times that you run when you don’t need to, but more often than not, it might make sense to turn your attention to a potential threat.”
Dysregulation of this dopamine-controlled switching may contribute to neuropsychiatric disorders such as schizophrenia, Tye says. Among other effects, too much dopamine could lead the brain to weigh negative inputs too highly. This could result in paranoia, often seen in schizophrenia patients, or anxiety.
Tye now hopes to determine more precisely how dopamine affects other neurotransmitters involved in the modulation of the signal-to-noise ratio. She also plans to further explore the role of this kind of modulation in anxiety and phobias.
The research was funded by the JPB Foundation, the Picower Institute Innovation Fund, the Picower Neurological Disorders Research Fund, the Junior Faculty Development Program, the Klingenstein Foundation, a NARSAD Young Investigator Award, the New York Stem Cell Foundation, the National Institutes of Health, the NIH Director’s New Innovator Award, and the NIH Pioneer Award.