Emerging Challenges Fund

About the ECF

Through the Emerging Challenges Fund, we offer anyone the opportunity to contribute to a pooled, thesis-driven fund that our expert grantmaking teams will direct to outstanding organizations where additional funding can quickly make a major difference.

Over the next decade, emerging technologies will pose significant challenges to global security. Rapid advances in artificial intelligence could create advanced AI systems with goals that diverge from human interests and grant authoritarians unprecedented means of control. We face rising nuclear and biological risks as advances accelerate and automate elements of nuclear decision-making and lower the barriers for malicious actors to execute large-scale biological attacks.

We aim to prepare the world for these challenges. In selecting projects, the ECF considers Longview’s usual grantmaking criteria and two further tests:

  1. Does the project have a legible theory of impact? ECF grantees must have a compelling, transparent, and public case for how their activities will have an impact that appeals to a wide range of donors.
  2. Will the project benefit from diverse funding? Policy organizations sometimes benefit from the support of the ECF’s 2000+ donors when demonstrating their independence from major funders and industry actors. ECF grantees often, though not always, pass this test.

In 2025, ECF donors supported organizations advancing both policy and research. On the policy side, grantees worked to shape frontier AI governance in the US and Europe, including by building government capacity through talent pipelines and facilitating discussions on AI and arms control between the US and China. On the research side, we funded groups evaluating AI system capabilities, their potential misuse by malicious actors, and the broader societal implications of rapid AI progress.

For those seeking to invest in a safer future, this fund provides unique expertise across beneficial AI, biosecurity, and nuclear weapons policy, and fills critical funding gaps at organisations in need of rapid financial support and a diverse donor base.

For major donors

Longview’s focus, and the source of most of our impact, is helping major donors give.

  1. Give to our private funds. For donors giving over $100K, we offer access to our private frontier AI, digital sentience, and nuclear weapons policy funds. Our private fund reports are sent directly to donors rather than distributed publicly, allowing us to use those funds to support confidential, risky, or large-scale projects.
  2. Get bespoke advice. For major donors seeking to develop significant philanthropic portfolios, we provide a personalized end-to-end service at no cost. This includes detailed analysis, expert-led learning series, residential summits, tailored strategic planning, grant recommendations, due diligence, and impact assessment.

Please get in touch with our CEO, Simran Dhaliwal, at simran@longview.org.

Fund Managers
Emerging Challenges Fund
Simran Dhaliwal
CEO
Simran coordinates Longview Philanthropy’s research, grantmaking, and advising work. Prior to joining, she was a research analyst at Goldman Sachs, working on a two-person team recognised as the best sell-side stockpickers in London in 2018. While there, she became a Chartered Financial Analyst (CFA) charterholder and donated to high-impact charities. Simran read philosophy, politics, and economics at the University of Oxford, where she first came across the concept of using evidence and reason to do the most good at a Giving What We Can talk.
Carl Robichaud
Nuclear Weapons Policy Programme Director
Carl leads Longview’s programme on nuclear weapons policy and co-manages Longview’s Nuclear Weapons Policy Fund. For more than a decade, Carl led grantmaking in nuclear security at the Carnegie Corporation of New York, a philanthropic foundation that grants over $30 million annually to strengthen international peace and security. Carl previously worked with The Century Foundation and the Global Security Institute, where his extensive research spanned arms control, international security policy, and nonproliferation.
Matthew Gentzel
Nuclear Weapons Policy Programme Officer
Matthew conducts grant investigations for Longview’s programme on nuclear weapons policy and co-manages its Nuclear Weapons Policy Fund. His prior work spanned emerging-technology threat and policy assessment, focusing on how advances in AI may shape influence operations, nuclear strategy, and cyberattacks. He has worked as a policy researcher with OpenAI, an analyst in the US Department of Defense’s Innovation Steering Group, and director of research and analysis at the US National Security Commission on Artificial Intelligence.
Aidan O’Gara
AI Programme Officer
Aidan conducts grant investigations in artificial intelligence (AI), with a particular focus on technical research in AI safety. Before joining Longview, he conducted research on machine learning and AI policy at GovAI, Epoch, Cornell University, AI Impacts, and the Center for AI Safety. He also spent three years leading the data science team at a fintech startup. Alongside his work at Longview, Aidan is a DPhil candidate in AI at the University of Oxford.
Dr Zach Freitas-Groff
Senior Programme Associate
Zach conducts grant investigations in artificial intelligence (AI). He completed his PhD in economics at Stanford University, where he received support from the National Science Foundation, the Forethought Foundation for Global Priorities Research, and the Stanford Institute for Economic Policy Research. Zach has conducted research covered by The New York Times, Reuters, Marginal Revolution, and Vox. Before that, he was a Research Analyst at Innovations for Poverty Action and the Global Poverty Research Lab at Northwestern University.