Harnessing the power of philanthropy to protect our future.

Expert-led and research-driven, we devise bespoke strategies that maximise the impact of your giving.

How We Work

We work with donors at every stage of their giving journey, from brand-new philanthropists to seasoned leaders in effective giving. Depending on your particular needs, we can introduce you to experts on the world’s largest and most neglected issues, bring you into our curated learning sessions on the principles of effective giving, or simply meet you in the trenches and share our current grant opportunities and the reasoning behind them. Everything we do is free of charge, and we take no commission.

A sample donor experience:

  1. We introduce you to the principles of effective giving through an individual seminar that explores how giving today can have an outsized effect on the decades, centuries, and millennia to come. We tailor each session to your needs, but frequently cover: the role of philanthropy in moral progress, why we should be optimistic about our ability to make an enormous difference, our radically impartial and reason-driven approach to giving, how we prioritise between different causes, what solutions to global problems might look like, and how best to take action given the current philanthropic landscape.
  2. We introduce you to world-class experts in key cause areas who can answer any questions you have about their field. We are committed to clarity and transparency about our reasoning and research. While some of our philanthropists prefer to be more hands-off, you will be given the opportunity to know as much as we do about why we believe our grants are high-impact, and how that impact is achieved.
  3. At the conclusion of this learning process, which can last anywhere from days to months, we create a bespoke portfolio of grant recommendations. See our grantmaking page for more details on our process and focus areas. Once you approve this portfolio, we do everything necessary to facilitate the transfer of the grants, including due diligence and processing.

For established donors who are already on board with our approach:

  1. If you prefer to cut to the chase, we can immediately begin investigating grants that suit your needs. We share our recommendations and reasoning as soon as possible in written reports.
  2. We also invite donors to contribute directly to the Longview Philanthropy Fund, which allows us to make timely grants to the very best opportunities as they arise. All contributors to the Longview Philanthropy Fund are provided with reports every six months. These reports detail the grants we made during that period, our reasoning for those grants, and major updates to our grantmaking programmes.
As part of our commitment to future generations, many of our senior staff have signed the Giving What We Can Pledge – donating at least 10% of their income to the kinds of projects we recommend to our donors.
Natalie Cargill
Founder & President
Natalie works with philanthropists individually to plan and execute their giving plans.
Simran Dhaliwal
Simran coordinates Longview Philanthropy’s research, grantmaking, and advising work.
Kit Harris
Existential Risks Programme Officer
Kit leads grant investigations in artificial intelligence and biosecurity and lays the groundwork for new lines of work.
Dr Tyler John
Emerging Technology Governance Programme Officer
Tyler leads Longview Philanthropy’s grantmaking to support the effective governance of emerging technologies, especially artificial intelligence.
Page Hedley
Senior Programme Officer
Page is a Senior Programme Officer at Longview, with a particular focus on the governance of artificial intelligence.
Zach Freitas-Groff
Senior Programme Associate
Zach leads Longview’s grantmaking in artificial intelligence and long-term wellbeing.
Carl Robichaud
Nuclear Weapons Policy Programme Officer
Carl co-leads Longview’s programme on nuclear weapons policy.
Matthew Gentzel
Nuclear Weapons Policy Programme Officer
Matthew co-leads Longview’s programme on nuclear weapons policy.
Fin Moorhouse
Programme Associate
Fin conducts research to inform our grantmaking.
Gavin Weinberg
Chief of Staff
Gavin works closely with the CEO, focusing on organisational strategy and process.
Katie Hearsum
Katie oversees operations systems and events at Longview.
Andrew Player
Head of Grants Management & Compliance
Andrew manages Longview’s grants system and ensures compliance with regulatory and legal requirements.
Ruth Wallis
Operations Associate
Ruth supports the team at Longview by ensuring the smooth running of internal systems.
Toby Jolly
Data Manager
Toby manages our CRM and other data at Longview.
Maddie Benderschi
Events Specialist
Maddie produces Longview events.
Hamish Hobbs
Policy Advisor to OECD Strategic Foresight Unit
Hamish is a policy advisor at Longview.
Liv Boeree
Ambassador
Liv is an Ambassador at Longview, communicating about topics and concepts central to Longview’s work.
Giving What We Can
Giving What We Can is a community of effective givers who have pledged to give at least 10% of their income to high-impact cause areas. Longview has partnered with Giving What We Can to create the Emerging Challenges Fund, which funds grants recommended by Longview with input from the Giving What We Can research team.
Introductions to other communities
Longview also introduces our donors to other philanthropic communities wherever valuable. We’re proud that, to date, we have connected three donors with the Giving Pledge, and those donors have gone on to become Giving Pledge signatories.
Neel Nanda
Neel leads the mechanistic interpretability team at Google DeepMind, which takes trained neural networks and tries to reverse engineer the algorithms they have learned.
Prior to this, he worked at Anthropic as a language model interpretability researcher. Neel holds a pure maths degree from Cambridge. He has worked in quantitative finance, as a researcher at the University of Oxford, and at the UC Berkeley Center for Human-Compatible AI.
Ajeya Cotra
Ajeya leads Open Philanthropy’s grantmaking on technical research that could help to clarify and reduce catastrophic risks from advanced AI.
As part of this role, Ajeya conducts analysis on threat models (ways that advanced AI could cause catastrophic harm) and technical agendas (technical work that may help to address these threat models). Previously, she worked on estimating when transformative AI might be developed, estimating the expected returns to funding across cause areas related to global catastrophic risk, and thinking about how worldview diversification could be implemented in budget allocation.
Charlotte Stix
Charlotte leads OpenAI’s policy engagements with the EU and is a fellow at the Leverhulme Centre for the Future of Intelligence, University of Cambridge.
Previously, Charlotte coordinated the European Commission’s High-Level Expert Group on Artificial Intelligence, oversaw €18 million in projects, and contributed to the formulation of EU-wide AI strategy at the European Commission’s Robotics and Artificial Intelligence Unit.
Prof. Kevin M. Esvelt
Kevin is an associate professor at MIT’s Media Lab, where his research lab focuses on advancing biotechnology safely.
Kevin helped pioneer the development of CRISPR, a powerful new method of genome engineering, and invented its use in gene drive, a way of causing engineered genes to spread through wild species. Recognizing the implications of this technology for the environment, he quickly became a public advocate for safety and transparency in the field of biotechnology to reduce the risk of accidents and misuse. Kevin now leads SecureBio, a nonprofit that incubates projects working to prevent and prepare for extreme pandemics.
Prof. Will MacAskill
Will is an associate professor in philosophy at the University of Oxford, author of the New York Times bestseller What We Owe the Future, and a co-founder of the Global Priorities Institute and three international nonprofits.
Will’s research focuses on the prioritisation of interventions to improve the long-term future and on the optimal timing of philanthropic action. Upon publication of his bestselling book Doing Good Better, Will was described by Bill Gates as “a data nerd after my own heart”.
Rohin Shah
Rohin co-leads the Scalable Alignment team at Google DeepMind, which aims to improve AI safety by ensuring that as AI capabilities scale up, AI systems increasingly “try” to do what the designer intended for them to do.
Previously, he completed his PhD at the Center for Human-Compatible AI at UC Berkeley, where he worked on building AI systems that can learn to assist a human user, even if they don’t initially know what the user wants.
Prof. Hilary Greaves
Hilary is a professor in philosophy at the University of Oxford and the former director of the Global Priorities Institute.
Hilary’s current research focuses on issues of global prioritisation, including evaluating the strength and robustness of the argument for longtermism, and the extent to which reliably influencing the long-run future is tractable given our radical uncertainty about the long-term effects of our actions.
Cate Hall
Cate is a founder and Chief Operating Officer of Alvea, which set the record for the fastest start-up to take a new drug into phase 1 clinical trials.
She is also a founder of Juniper Ventures, a biosecurity and pandemic preparedness incubator. Previously, she was the #1-ranked female poker player in the world, and before that a full-time lawyer with a degree from Yale Law School.
Dr Andrew Snyder-Beattie
Andrew leads Open Philanthropy’s programme on biosecurity and pandemic preparedness, which has made grants totalling over $100 million.
Previously, Andrew was the director of research at the University of Oxford’s Future of Humanity Institute. He holds a PhD from the University of Oxford. His dissertation addressed existential risk, with publications in Nature Scientific Reports and Astrobiology.
Dan Hendrycks
Director of the Center for AI Safety
Dan is the Founder and Director of the Center for AI Safety.
He received his PhD from UC Berkeley, where he was advised by Dawn Song and Jacob Steinhardt and graduated with nearly 10,000 citations. Dan helped develop the GELU activation function (the most-used activation in state-of-the-art models, including BERT, GPT, and Vision Transformers), the out-of-distribution detection baseline, and distribution shift benchmarks. He was named one of Time’s 100 most influential people in AI in 2023 and advises xAI on safety.


Our operational costs are fully funded by the group of philanthropists below. We are deeply grateful for their support – it is what allows us to provide all of our funding recommendations, research, and advisory work free of charge. While our funders share our mission, they have no influence over any aspect of our work, including our grant recommendations.

Current funders include Martin Crowley, Tom Crowley, Likith Govindaiah, Justin Rockefeller, Rafael Albert and several private philanthropists and foundations. Historical funders include Ben Delo.

These people began warning us of the risk posed by pandemics well before anyone had heard of COVID. That risk is higher now than it has ever been. To prevent the next pandemic, which could be far more devastating than any we have yet experienced, philanthropists should pay close attention to what the team at Longview has to say.
Michael Specter
Staff writer at The New Yorker and author of Denialism
Where you put your money as a philanthropist matters, and Longview Philanthropy is excellent at helping donors accomplish as much good as possible with their giving.
Holden Karnofsky
Co-CEO of Open Philanthropy and Co-Founder of GiveWell
Longview focuses its deep expertise on bold, pragmatic efforts to reduce catastrophic biological and nuclear weapons threats. The result is high impact.
Hon. Andy Weber
Former U.S. Assistant Secretary of Defense for Nuclear, Chemical and Biological Defense Programs
Longview Philanthropy is the best resource in the world if you want to do the most good with your giving.
William MacAskill
Professor in Philosophy at the University of Oxford and author of What We Owe the Future
Fascinating exposure to bright minds, intensive learning on massive challenges and opportunity for rigorous thinking about how to make the world a better place.
Stacey Kline
CEO of Otto Intelligence
If all goes well, humanity has a vast future ahead of it — but very little of our philanthropy takes the scale of this future seriously. That’s why I am so excited about Longview Philanthropy. They really get it, and are finding opportunities for lasting impact.
Toby Ord
Senior Research Fellow in Philosophy at the University of Oxford & author of The Precipice