Feed aggregator
A new way to bring personal items to mixed reality
Think of your most prized belongings. In an increasingly virtual world, wouldn’t it be great to save a copy of that precious item and all the memories it holds?
In mixed-reality settings, you can create a digital twin of a physical item, such as an old doll. But it’s hard to replicate interactive elements, like the way it moves or the sounds it makes — the sorts of unique interactive features that made the toy distinct in the first place.
Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) sought to change that, and they have a potential solution. Their “InteRecon” program enables users to recapture real-world objects in a mobile app, and then animate them in mixed-reality environments.
The prototype can recreate an item’s physical-world interactions, such as the head motions of your favorite bobblehead or a classic video playing on a digital version of your vintage TV, creating more lifelike and personal digital surroundings while preserving a memory.
InteRecon’s ability to reconstruct the interactive experience of different items could make it a useful tool for teachers explaining important concepts, like demonstrating how gravity pulls an object down. It could also add a new visual component to museum exhibits, such as animating a painting or bringing a historical mannequin to life (without the scares of characters from “Night at the Museum”). Eventually, InteRecon may be able to teach a doctor’s apprentice organ surgery or a cosmetic procedure by visualizing each motion needed to complete the task.
The exciting potential of InteRecon comes from its ability to add motions or interactive functions to many different objects, according to CSAIL visiting researcher Zisu Li, lead author of a paper introducing the tool.
“While taking a picture or video is a great way to preserve a memory, those digital copies are static,” says Li, who is also a PhD student at the Hong Kong University of Science and Technology. “We found that users wanted to reconstruct personal items while preserving their interactivity to enrich their memories. With the power of mixed reality, InteRecon can make these memories live longer in virtual settings as interactive digital items.”
Li and her colleagues will present InteRecon at the 2025 ACM CHI conference on Human Factors in Computing Systems.
Making a virtual world more realistic
To make digital interactivity possible, the team first developed an iPhone app. Using your camera, you scan the item all the way around three times to ensure it’s fully captured. The 3D model can then be imported into the InteRecon mixed-reality interface, where you can mark (“segment”) individual areas to select which parts of the model will be interactive (like a doll’s arms, head, torso, and legs). Alternatively, InteRecon can segment the model automatically.
The InteRecon interface can be accessed via a mixed-reality headset (such as the HoloLens 2 or Quest). Once your model is segmented, the interface lets you choose a programmable motion for each part of the item you want to animate.
Movement options are presented as motion demonstrations, allowing you to play around with them before deciding on one — say, a flopping motion that emulates how a bunny doll’s ears move. You can even pinch a specific part and explore different ways to animate it, like sliding, dangling, and pendulum-like turns.
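To make this workflow concrete, here is a minimal sketch, in Python, of how a segmented part and a pendulum-style motion might be represented. The article does not describe InteRecon’s actual implementation, so every name below (ModelPart, PendulumMotion, angle_at) is a hypothetical stand-in rather than the system’s real API.

```python
from dataclasses import dataclass
import math

@dataclass
class PendulumMotion:
    """A pendulum-like swing, one of the motion types described above (hypothetical)."""
    amplitude_deg: float  # maximum swing angle, in degrees
    period_s: float       # time for one full back-and-forth swing, in seconds

    def angle_at(self, t: float) -> float:
        """Swing angle at time t, modeled as a simple sinusoid."""
        return self.amplitude_deg * math.sin(2 * math.pi * t / self.period_s)

@dataclass
class ModelPart:
    """A user-segmented region of a scanned 3D model (hypothetical structure)."""
    name: str
    pivot: tuple                         # (x, y, z) point the part rotates around
    motion: PendulumMotion | None = None

# Segment a bunny doll's ear and attach a floppy, pendulum-like motion.
ear = ModelPart(name="left_ear", pivot=(0.0, 1.2, 0.0))
ear.motion = PendulumMotion(amplitude_deg=25.0, period_s=0.8)

# Each rendered frame, the engine would query the current swing angle:
for frame in range(4):
    t = frame / 60.0  # assuming 60 frames per second
    print(f"{ear.name} at t={t:.3f}s: {ear.motion.angle_at(t):+.2f} degrees")
```

In a real mixed-reality engine, the per-frame angle would drive the rotation of the segmented mesh around its pivot point.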
Your old iPod, digitized
The team showed that InteRecon can also recapture the interface of physical electronic devices, like a vintage TV. After making a digital copy of the item, you can customize the 3D model with different interfaces.
Users can play with example widgets from different interfaces before choosing one: a screen (either a TV display or a camera’s viewfinder), a rotating knob (for, say, adjusting the volume), an “on/off”-style button, and a slider (for changing settings on something like a DJ booth).
Li and colleagues presented an application that recreates the interactivity of a vintage TV by adding virtual widgets, such as an “on/off” button, a screen, and a channel switch, to a TV model and embedding old videos in it, bringing the model to life. You could also upload MP3 files and add a “play button” to a 3D model of an iPod to listen to your favorite songs in mixed reality.
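As a loose illustration of how such widgets might be wired together, the sketch below connects a hypothetical “on/off”-style button to a screen that plays an embedded video. The class names and the video path are invented for this example; they are not InteRecon’s actual interface.

```python
from dataclasses import dataclass

@dataclass
class Screen:
    """A screen widget that plays an embedded video when powered on (hypothetical)."""
    video_path: str
    powered: bool = False

    def render(self) -> str:
        return f"playing {self.video_path}" if self.powered else "(screen off)"

@dataclass
class PowerButton:
    """An 'on/off'-style button widget wired to a screen (hypothetical)."""
    target: Screen

    def press(self) -> None:
        # Toggle the screen each time the user pushes the virtual button.
        self.target.powered = not self.target.powered

# Recreate a vintage TV: embed an old video and wire up a power switch.
tv_screen = Screen(video_path="home_movies/summer_1994.mp4")  # invented path
tv_button = PowerButton(target=tv_screen)

tv_button.press()          # the user "presses" the button in mixed reality
print(tv_screen.render())  # -> playing home_movies/summer_1994.mp4
```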
The researchers believe InteRecon opens up intriguing new avenues in designing lifelike virtual environments. A user study confirmed that people from different fields share this enthusiasm, viewing it as easy to learn and diverse in its ability to express the richness of users’ memories.
“One thing I really appreciate is that the items that users remember are imperfect,” says Faraz Faruqi SM ’22, another author on the paper who is also a CSAIL affiliate and MIT PhD student in electrical engineering and computer science. “InteRecon brings those imperfections into mixed reality, accurately recreating what made a personal item like a teddy bear missing a few buttons so special.”
In a related study, users imagined how this technology could be applied to professional scenarios, from teaching medical students how to perform surgeries to helping travelers and researchers log their trips, and even assisting fashion designers in experimenting with materials.
Before InteRecon is used in more advanced settings, though, the team would like to upgrade their physical simulation engine to something more precise. This would enable applications such as helping a doctor’s apprentice to learn the pinpoint accuracy needed to do certain surgical maneuvers.
Li and Faruqi may also incorporate large language models and generative models that can recreate lost personal items as 3D models from language descriptions, as well as explain an interface’s features.
As for the researchers’ next steps, Li is working toward a more automatic and powerful pipeline that can make interactivity-preserved digital twins of larger physical environments in mixed reality for end users, such as a virtual office space. Faruqi is looking to build an approach that can physically recreate lost items via 3D printers.
“InteRecon represents an exciting new frontier in the field of mixed reality, going beyond mere visual replication to capture the unique interactivity of physical objects,” says Hanwang Zhang, an associate professor at Nanyang Technological University’s College of Computing and Data Science, who wasn’t involved in the research. “This technology has the potential to revolutionize education, health care, and cultural exhibitions by bringing a new level of immersion and personal connection to virtual environments.”
Li and Faruqi wrote the paper with Hong Kong University of Science and Technology (HKUST) master’s student Jiawei Li; HKUST PhD student Shumeng Zhang; HKUST associate professor Xiaojuan Ma and assistant professors Mingming Fan and Chen Liang; ETH Zurich PhD student Zeyu Xiong; and Stefanie Mueller, the TIBCO Career Development Associate Professor in the MIT departments of Electrical Engineering and Computer Science and Mechanical Engineering, and leader of the HCI Engineering Group. Their work was supported by the APEX Lab of The Hong Kong University of Science and Technology (Guangzhou) in collaboration with the HCI Engineering Group.
The human body, its movement, and music
Watching and listening to a pianist’s performance is an immersive and enjoyable experience. The pianist and the instrument, with a blend of skill, training, and presence, create a series of memorable moments for themselves and the audience. But is there a way to improve the performance and our understanding of how the performer and their instrument work together to create this magic, while also minimizing performance-related injuries?
Mi-Eun Kim, director of keyboard studies in MIT’s Music and Theater Arts Section, and Praneeth Namburi PhD ’16, a research scientist in MIT’s Institute for Medical Engineering and Science, are investigating how the body works when pianists play. Their joint project, The Biomechanics of Assimilating a New Piano Skill, aims to develop mechanistic insights that could transform how we understand and teach piano technique, reduce performance-related injuries, and bridge the gap between artistic expression and biomechanical efficiency.
Their project is among those recently selected for a SHASS+ Connectivity Fund grant through the MIT Human Insight Collaborative.
“The project emerged from a convergence of interests and personal experiences,” Namburi says. “Mi-Eun witnessed widespread injuries among fellow pianists and saw how these injuries could derail careers.”
Kim is a renowned pianist who has performed on stages throughout the United States, in Europe, and in Asia. She earned the Liszt-Garrison Competition’s Liszt Award and the Corpus Christi solo prize, among other honors. She teaches piano and chamber music through MIT Music’s Emerson/Harris Program and chamber music through MIT’s Chamber Music Society. She earned advanced degrees from the University of Michigan and holds a bachelor of arts degree in history from Columbia University.
Namburi’s work focuses on the biomechanics of efficient, expressive, and coordinated movement. He draws inspiration from artists and athletes in specialized movement disciplines, such as dancing and fencing, to investigate skilled movement. He earned a PhD in experimental neuroscience from MIT and a bachelor of engineering degree in electrical and electronic engineering from Singapore’s Nanyang Technological University.
Pursuing the project
Kim and Namburi arrived at their project by taking different roads into the arts. While Kim was completing her studies at the University of Michigan, Namburi was taking dance lessons as a hobby in Boston. He learned that both expressive and sustainable movements might share a common denominator. “A key insight was that elastic tissues play a crucial role in coordinated, expressive, and sustainable movements in dance — a principle that could extend beyond dancing,” he notes.
“We recognized that studying elastic tissues could shed light on reducing injury risk, as well as understanding musical expression and embodiment in the context of piano playing,” Kim says.
Kim and Namburi began collaborating on what would become their project in October 2023, though the groundwork was in place months before. “A visiting student working with me on a research project studying pianists in the MIT.nano Immersion Lab reached out to Mi-Eun in summer 2023,” Namburi recalls. A shared Instagram video showing their setup, with motion capture sensors and a pianist playing Chopin on a digital keyboard, sparked Kim’s interest. The Immersion Lab is an open-access, shared facility for MIT and beyond, dedicated to visualizing, understanding, and interacting with large, multidimensional data.
“I couldn’t make sense of all the sensors, but immediately noticed they were using a digital keyboard,” she says.
Kim wanted to elevate these studies’ quality by pairing the musicians with the proper equipment and instrument. While the digital pianos they’d previously used are portable and provide musical instrument digital interface (MIDI) data, they don’t offer the same experience as a real piano. “Pianists dream of playing on an ideal instrument — a 9-foot concert grand with perfectly regulated 24-inch keys that responds to every musical intention without resistance,” Kim says.
The researchers brought a Spirio grand piano to the Immersion Lab and observed that the instrument could both capture pianists’ hammer-strike velocities and reproduce them to play back a performance. Monitoring Kim’s performance on the concert grand, for example, both researchers noted marked differences in her playing style.
“Despite all the sensors, lighting, and observers, playing felt so natural that I forgot I was in a lab,” she says. “I could focus purely on the music, without worrying about adapting to a smaller keyboard or digital sound.”
This setup allowed them to observe pianists’ natural movements, which was exactly what Kim wanted to study.
During Independent Activities Period 2025, Kim and Namburi hosted a new course, Biomechanics of Piano Playing, in the Immersion Lab. Students and faculty from MIT, Harvard University, the University of Michigan, the University of Toronto, and the University of Hartford took part. Participants learned how to use motion capture, accelerometers, and ultrasound imaging to visualize signals from the body during piano playing.
Observations and outcomes
If the efficiency and perceived fluency of an expert pianist’s movements come from harnessing the body’s inherent elastic mechanisms, Kim and Namburi believe, it’s possible to redesign how piano playing is taught. Both want to reduce playing-related injuries and improve how musicians learn their craft.
“I want us to bridge the gap between artistic expression and biomechanical efficiency,” Namburi says.
Through their exploratory sessions at the Immersion Lab, Kim and Namburi found common ground, using sensor technologies, including ultrasound, to compare their observations and experiences across piano and dance.
Beyond these sessions, Kim saw potential for transforming piano pedagogy. “Traditional teaching relies heavily on subjective descriptions and metaphors passed down through generations,” she says. “While valuable, these approaches could be enhanced with objective, scientific understanding of the physical mechanisms behind skilled piano performance — evidence-driven piano pedagogy, if you will.”
Remembering Juanita Battle: “Everything about her was just happy”
MIT Health Student Health Plan Research and Resolution Specialist Juanita Battle passed away on Jan. 14. She was 70.
Battle was best known throughout the MIT community as one of the friendly faces and voices that students encountered whenever they had a question about their health insurance. For more than 17 years, Juanita was there to help students navigate the complexities of the U.S. health-care system.
“Juanita really cared about the students,” remembers Affiliate Health Plan Representative Lawanda Santiago. Whenever Battle was on a call with a student, Santiago says, you knew that call could take 20 minutes. “She would always go above and beyond.”
Sheila Sanchez, lead student health plan research and resolution specialist, agrees. “There was nothing she wouldn’t do to make sure that the student had a good experience when it came to some insurance question. She made sure that the student was always heard, always happy.”
“At the end of any conversation, she knew the student’s name, where they were from, what their mother’s name was, and even their favorite color,” says Sanchez.
“Juanita was the outward face of the MIT Student Health Insurance Plan,” adds David Tytell, MIT Health’s director of marketing and communications. “Whenever there was a call for volunteers to help promote student insurance, like Campus Preview Weekend, Juanita was always the first to raise her hand.” Her detailed, clear explanations of difficult insurance concepts were featured in multiple MIT Health videos.
“She also had a ‘crush’ on Tim the Beaver,” says Tytell. “She would instantly become a kid again whenever Tim entered the room, and she never missed an opportunity to take a selfie with him.”
Battle’s friends also recall her passion for dining out. “Juanita loved food! When we would go out to eat, Juanita would have the menu memorized before we even got there,” says Sanchez. “She had already done her research, read Yelp reviews, looked at pictures, figured out her top three favorite things, and even had recommendations for everybody else!”
“She especially loved tiramisu,” says Santiago.
Battle’s laugh was infectious. She was known for always looking at the bright side of things and had the uncanny ability to make a joke out of just about anything. Halloween was her favorite holiday, and she would always dress up and pose for pictures. “One of my last encounters with Juanita was last Halloween,” says Tytell. “I came back from a meeting to find a trick-or-treat bag filled with candy and a note from Juanita on my desk.”
“She didn’t let anything affect her attitude,” says Sanchez. “Everything about her was just happy.”
3Q: MIT’s Lonnie Petersen on the first medical X-ray taken in space
Many of us have gotten an X-ray at one time or another, either at the dentist’s or the doctor’s office. Now, astronauts orbiting Earth have shown it’s possible to take an X-ray in space. The effort will help future space explorers diagnose and monitor medical conditions, from fractures and sprains to signs of bone decalcification, while in orbit.
Last week, crew members aboard the Fram2 mission posted to social media the first-ever medical X-ray image taken in space. The image is a black-and-white scan of a hand with a ring, echoing the very first X-ray image, taken 130 years ago by the physicist Wilhelm Roentgen of his wife’s hand. The new X-ray was taken in microgravity, inside a four-person space capsule flying at orbital speeds of 17,500 miles per hour, about 200 miles above the Earth’s surface.
The in-flight body scan was part of the SpaceXray project, one of 22 science experiments that astronauts conducted during the Fram2 mission. Operated by SpaceX, Fram2 was the first human spaceflight mission to travel in a polar orbit, looping around the planet from pole to pole. Fram2 gets its mission name from the Norwegian ship “Fram,” which was the first to carry explorers to the Arctic and Antarctic regions in the late 19th century.
The body scans are a first demonstration that medical X-ray imaging can be done within the confines and conditions of space. Lonnie Petersen, a co-investigator on the SpaceXray project, is an associate professor in MIT’s Department of Aeronautics and Astronautics who studies space physiology and the effects of spaceflight on the human body. Petersen helped define and design the protocol for the SpaceXray project, in collaboration with institutional partners such as Stanford University and the Mayo Clinic, and the X-ray hardware companies KA Imaging and MinXray. Petersen talked with MIT News about how these first in-orbit X-ray images can help enable safe and healthy longer-term missions in space.
Q: What are the challenges in taking an X-ray in space, versus here on Earth?
A: There are several challenges regarding the hardware, methods, and subjects being X-rayed.
To get hardware certified for spaceflight, it should be miniaturized and as lightweight as possible. There are also increased safety requirements because all devices work in a confined space. The increased requirements drive technology development. I always say space is our best technology accelerator — this is also true for medical technology.
For this project we used a portable, specialized X-ray generator and detector developed by MinXray and KA Imaging for the battlefield and made it applicable for spaceflight.
In terms of methods, one of my concerns was that the increased background radiation might reduce the quality of the image so that it would fall below clinical standards. From the first images we have received from space, it seems that the quality is great. I am very excited to further analyze the full set of images.
We want the X-rays to travel straight through the body part of interest. This requires alignment of equipment and patient. As you can imagine, a floating subject will be harder to position. We will be quantifying any potential impact of this and using it for future updates.
We also do not have radiologists or technicians in space. The methods need to be simple and robust enough for a layperson to operate them.
And, finally, regarding subjects: Entry into space has a huge impact on the human body. Blood and fluid are no longer pulled down toward the feet by gravity; they are evenly distributed, producing regional changes in pressure and perfusion. The cardiovascular system and the brain are affected by this over time. Mechanical unloading of the body leads to muscle atrophy and bone decalcification, as well as a reduction in exercise capacity. This mission was only 3.5 days, so the crew will likely not have experienced many negative effects, but with an X-ray, we can now monitor bone health in space. We have never been able to do that before. We can also monitor potential fluid buildup in the lungs or check for diseases in the abdomen.
I’ll also take off my physician hat and put on my engineering hat: X-rays are a useful tool in nondestructive hardware tests in aviation (and other areas). This project increases our diagnostic capabilities in space, not just for patients, but also for hardware.
Q: How did the Fram2 crew do it?
A: The crew learned how to take X-rays in one afternoon, using a train-the-trainer model. The protocol was created in advance, and the crew took images of each other, checked the quality, and stored the images. We have only seen one image so far, but from that, we are very impressed with the quality of the images and with the crew’s skill and dedication to advancing science.
Q: What will you learn from these first images?
A: First and foremost, this was a technology demonstration: Can we even do this in space? We are looking forward to analyzing all the images, but from preliminary data it looks like we absolutely can. Now comes a detailed analysis to tease out every lesson we can, both with regard to current capabilities and to next steps. The team is, of course, very excited to carry this research forward and break even more ground.
Molecules that fight infection also act on the brain, inducing anxiety or sociability
Immune molecules called cytokines play important roles in the body’s defense against infection, helping to control inflammation and coordinating the responses of other immune cells. A growing body of evidence suggests that some of these molecules also influence the brain, leading to behavioral changes during illness.
Two new studies from MIT and Harvard Medical School, focused on a cytokine called IL-17, now add to that evidence. The researchers found that IL-17 acts on two distinct brain regions — the amygdala and the somatosensory cortex — to exert two divergent effects. In the amygdala, IL-17 can elicit feelings of anxiety, while in the cortex it promotes sociable behavior.
These findings suggest that the immune and nervous systems are tightly interconnected, says Gloria Choi, an associate professor of brain and cognitive sciences, a member of MIT’s Picower Institute for Learning and Memory, and one of the senior authors of the studies.
“If you’re sick, there’s so many more things that are happening to your internal states, your mood, and your behavioral states, and that’s not simply you being fatigued physically. It has something to do with the brain,” she says.
Jun Huh, an associate professor of immunology at Harvard Medical School, is also a senior author of both studies, which appear today in Cell. One of the papers was led by Picower Institute Research Scientist Byeongjun Lee and former Picower Institute research scientist Jeong-Tae Kwon, and the other was led by Harvard Medical School postdoc Yunjin Lee and Picower Institute postdoc Tomoe Ishikawa.
Behavioral effects
Choi and Huh became interested in IL-17 several years ago, when they found it was involved in a phenomenon known as the fever effect. Large-scale studies of autistic children have found that for many of them, their behavioral symptoms temporarily diminish when they have a fever.
In a 2019 study in mice, Choi and Huh showed that in some cases of infection, IL-17 is released and suppresses a small region of the brain’s cortex known as S1DZ. Overactivation of neurons in this region can lead to autism-like behavioral symptoms in mice, including repetitive behaviors and reduced sociability.
“This molecule became a link that connects immune system activation, manifested as a fever, to changes in brain function and changes in the animals’ behavior,” Choi says.
IL-17 comes in six different forms, and there are five different receptors that can bind to it. In their two new papers, the researchers set out to map which of these receptors are expressed in different parts of the brain. This mapping revealed that a pair of receptors known as IL-17RA and IL-17RB is found in the cortex, including in the S1DZ region that the researchers had previously identified. The receptors are located in a population of neurons that receive proprioceptive input and are involved in controlling behavior.
When a type of IL-17 known as IL-17E binds to these receptors, the neurons become less excitable, which leads to the behavioral effects seen in the 2019 study.
“IL-17E, which we’ve shown to be necessary for behavioral mitigation, actually does act almost exactly like a neuromodulator in that it will immediately reduce these neurons’ excitability,” Choi says. “So, there is an immune molecule that’s acting as a neuromodulator in the brain, and its main function is to regulate excitability of neurons.”
Choi hypothesizes that IL-17 may have originally evolved as a neuromodulator, and later on was appropriated by the immune system to play a role in promoting inflammation. That idea is consistent with previous work showing that in the worm C. elegans, IL-17 has no role in the immune system but instead acts on neurons. Among its effects in worms, IL-17 promotes aggregation, a form of social behavior. Additionally, in mammals, IL-17E is actually made by neurons in the cortex, including S1DZ.
“There’s a possibility that a couple of forms of IL-17 perhaps evolved first and foremost to act as a neuromodulator in the brain, and maybe later were hijacked by the immune system also to act as immune modulators,” Choi says.
Provoking anxiety
In the other Cell paper, the researchers explored another brain location where they found IL-17 receptors — the amygdala. This almond-shaped structure plays an important role in processing emotions, including fear and anxiety.
That study revealed that in a region known as the basolateral amygdala (BLA), the IL-17RA and IL-17RE receptors, which work as a pair, are expressed in a discrete population of neurons. When these receptors bind to IL-17A and IL-17C, the neurons become more excitable, leading to an increase in anxiety.
The researchers also found that, counterintuitively, treating animals with antibodies that block IL-17 receptors actually increases the amount of IL-17C circulating in the body. This finding may help to explain unexpected outcomes observed in a clinical trial of a drug targeting the IL-17RA receptor for psoriasis, particularly its potential adverse effects on mental health.
“We hypothesize that there’s a possibility that the IL-17 ligand that is upregulated in this patient cohort might act on the brain to induce suicide ideation, while in animals there is an anxiogenic phenotype,” Choi says.
During infections, this anxiety may be a beneficial response, keeping the sick individual away from others to whom the infection could spread, Choi hypothesizes.
“Other than its main function of fighting pathogens, one of the ways that the immune system works is to control the host behavior, to protect the host itself and also protect the community the host belongs to,” she says. “One of the ways the immune system is doing that is to use cytokines, secreted factors, to go to the brain as communication tools.”
The researchers found that the same BLA neurons that have receptors for IL-17 also have receptors for IL-10, a cytokine that suppresses inflammation. This molecule counteracts the excitability generated by IL-17, giving the body a way to shut off anxiety once it’s no longer useful.
Distinctive behaviors
Together, the two studies suggest that the immune system, and even a single family of cytokines, can exert a variety of effects in the brain.
“We have now different combinations of IL-17 receptors being expressed in different populations of neurons, in two different brain regions, that regulate very distinct behaviors. One is actually somewhat positive and enhances social behaviors, and another is somewhat negative and induces anxiogenic phenotypes,” Choi says.
Her lab is now working on additional mapping of IL-17 receptor locations, as well as the IL-17 molecules that bind to them, focusing on the S1DZ region. Eventually, a better understanding of these neuro-immune interactions may help researchers develop new treatments for neurological conditions such as autism or depression.
“The fact that these molecules are made by the immune system gives us a novel approach to influence brain function as means of therapeutics,” Choi says. “Instead of thinking about directly going for the brain, can we think about doing something to the immune system?”
The research was funded, in part, by Jeongho Kim and the Brain Impact Foundation Neuro-Immune Fund, the Simons Foundation Autism Research Initiative, the Simons Center for the Social Brain, the Marcus Foundation, the N of One: Autism Research Foundation, the Burroughs Wellcome Fund, the Picower Institute Innovation Fund, the MIT John W. Jarve Seed Fund for Science Innovation, Young Soo Perry and Karen Ha, and the National Institutes of Health.
DIRNSA Fired
In “Secrets and Lies” (2000), I wrote:
It is poor civic hygiene to install technologies that could someday facilitate a police state.
It’s something a bunch of us were saying at the time, in reference to the NSA’s vast surveillance capabilities.
I have been thinking of that quote a lot as I read news stories of President Trump firing the director of the National Security Agency, General Timothy Haugh.
A couple of weeks ago, I wrote:
We don’t know what pressure the Trump administration is using to make intelligence services fall into line, but it isn’t crazy to ...
NOAA halts upkeep of critical weather satellites
US-China LNG fight could scramble the energy transition
A fee on shipping emissions could be coming. Here are 5 things to watch.
Maryland delays penalties for noncompliance with clean car rules
Alaska youth ask court to stay ownership while they challenge LNG project
Florida’s Gulf Coast will soon get world’s largest artificial reef
UK launches $13M study on blocking sun’s heat
Brazil sees US tariffs damaging global climate efforts
Some EV drivers are doing it for the dogs
Silence among farmers
Nature Climate Change, Published online: 07 April 2025; doi:10.1038/s41558-025-02320-2
Humans fuel stronger cyclones
Nature Climate Change, Published online: 07 April 2025; doi:10.1038/s41558-025-02321-1
Attributing soybean production shocks
Nature Climate Change, Published online: 07 April 2025; doi:10.1038/s41558-025-02319-9
Regulation on conglomerates
Nature Climate Change, Published online: 07 April 2025; doi:10.1038/s41558-025-02322-0
Data under duress
Nature Climate Change, Published online: 07 April 2025; doi:10.1038/s41558-025-02323-z
Climate change and climate action are socially and politically divisive topics in many countries. In addition to contributing to political disparity, climate research is also affected by political context, with consequences not only for scientists but for society as well.