By Kate Carter, Senior Program Officer for Science, Engineering, and Technology
Two weeks before the 2024 presidential election, Academy members convened in Cambridge for a compelling discussion about trust in science featuring Naomi Oreskes and Sean Decatur in conversation with Holden Thorp. The event, titled Rebuilding Trust in Science: A Morton L. Mandel Conversation and featured in this Bulletin issue, took stock of the shifting terrain of science communication since the Academy’s Public Face of Science project. The panelists tackled a landscape marked by growing acceptance of climate change, deepening skepticism of the medical community, and the relentless churn of misinformation, and they explored how scientists, journalists, institutional leaders, and others can cut through division and engage the public more meaningfully. With polarization and politicization eroding public trust in science, the stakes could not be higher. Remarks by Shirley Malcom and Cristine Russell bookended the discussion, and a vibrant exchange with the audience left no doubt that rebuilding trust is both urgent and complex.
On the following day, twenty-two participants, spanning the fields of science, technology, journalism, museum education, and law, attended the related exploratory meeting, Bridging the Gap Between Science and the Public, cochaired by Thorp and Russell. The meeting opened with participants facing an uncomfortable truth: Trust in science is not a universal concept. It depends on whom you ask, what you are asking about, and where you are asking. The room buzzed with a shared understanding that the science engagement landscape had shifted—traditional media, once the gatekeeper of scientific knowledge, had given way to a cacophony of influencers, content creators, and conspiracy theorists. The challenge was not just reaching people but cutting through the noise and making them care.
The conclusion was blunt: Transparency is nonnegotiable. “If science is going to regain trust, we need to show people how the sausage gets made,” Thorp argued, underscoring the importance of explaining how scientific conclusions evolve. But the participants also acknowledged the underlying paradox. While admitting uncertainty can build credibility, it is also a gamble—one misstep and the critics pounce, branding scientists as indecisive or, worse, deceitful.
COVID-19, AI, and Climate Change: Exploring Changes to the Engagement Landscape
The meeting cochairs and Science, Engineering, and Technology project staff selected three topics to explore how public trust in science and communication strategies have evolved since 2019: COVID-19, artificial intelligence (AI), and climate change. These areas were chosen for their global impact, the unique challenges they pose to scientific messaging, and their role in shaping public perceptions of expertise. Together, they provide a lens to examine shifts in trust, the rise of misinformation, and the opportunities for constructive engagement.
When discussing COVID-19, the participants agreed on one point: trust, once broken, is hard to repair. The attacks on Dr. Anthony Fauci—symbolic of broader distrust in public health institutions—offered a cautionary tale. His transformation from trusted expert to political lightning rod illustrated how quickly credibility can erode when science becomes entangled with politics. Although the war metaphor “fighting misinformation” came up repeatedly, some participants pushed back. Wars are won with strategy and attacks, they argued, not just defense. The science community needs to think beyond debunking myths and anticipate the next wave of misinformation before it hits.
This need is amplified by the ease with which AI can generate false or misleading information. One participant recounted a startling moment when they asked an AI chatbot for scientific advice. The response was polished, authoritative, and completely wrong. It is this veneer of credibility that makes AI so tricky—it does not just spread misinformation; it does so with unsettling confidence.
The participants wrestled with the implications of AI and emerging technologies. AI’s “black box” nature—its inability to explain how it arrives at conclusions—undermines a core tenet of science: reproducibility. While Europe has begun regulating these opaque models, the United States lags, raising questions about accountability and ethics. “How can we expect the public to trust AI when we do not fully understand it ourselves?” one attendee mused.
Unlike AI and COVID-19, where distrust has grown, public opinion on climate change has in many ways shifted for the better, with more people than ever accepting its reality. Yet with stakes this existential, the participants agreed, belief alone is not enough; action needs to be far more robust. For years, scientists have been winning the battle of belief. But belief, as one participant put it, “does not put solar panels on roofs.”
They suggested that the problem is trust—or the lack of it. Government funding for climate initiatives often fails to reach the communities that need it most. “If people do not see the benefits of our work in their daily lives, why would they trust us?” one participant asked, voicing a sentiment that resonated throughout the meeting.
The solutions proposed were both pragmatic and ambitious. Universities could lead by example, adopting climate action plans that model accountability. Local organizations could harness social norms to inspire action—if your neighbor installs solar panels, you might feel compelled to follow suit. But beyond strategies and statistics, the group emphasized something less tangible: empathy. As one attendee stressed, “We need to stop talking at people and start listening to them.”
Working Toward Solutions
The discussions returned repeatedly to a fundamental challenge: meeting people where they are. Building trust is not just about sharing facts; it is about understanding the fears, values, and experiences that shape how those facts are received.
Representation emerged as a key theme. When scientists look like the communities they are trying to reach, barriers to trust begin to fall. However, the group acknowledged that representation alone is not enough. Trust requires consistency. Too often, scientific outreach feels episodic—a one-off lecture or a fleeting campaign. They argued that sustained engagement is needed, the kind that builds relationships over months and years, not weeks.
Social media, for all its flaws, presents an opportunity. TikTok can reach younger audiences, and active participation from scientists can combat the platform’s steady stream of misinformation. The trick is tailoring the message without diluting its meaning. It is a delicate balance—simplifying without oversimplifying, engaging without pandering.
By the end of the meeting, there was no grand consensus. After all, how could such a complex issue lend itself to simple solutions? But there was a shared conviction that the way forward requires more than better communication. It requires better listening, greater empathy, and a renewed willingness to adapt. The work ahead is not just about restoring trust in science but reimagining what that trust looks like in a fractured, fast-changing world. Social media and fragmented media ecosystems demand innovative approaches, not just updated messages. Institutions should invest in systems that support scientists, helping them engage more effectively without overburdening their already demanding roles.
To continue this work, the Academy plans to host a series of member roundtables in early 2025 to explore actionable steps it can take to address these challenges. These discussions will provide a platform to deepen our understanding, gather diverse perspectives, and chart a path forward. Members interested in contributing to this critical dialogue are encouraged to reach out to Program Associate Jen Smith (jsmith@amacad.org) and participate in shaping the Academy’s role in this vital work.