About

Harnessing the power of philanthropy to protect our future.

Expert-led and research-driven, we devise bespoke strategies that maximise the impact of your giving.

How We Work

We work with donors at every stage of their giving journey, from brand-new philanthropists to seasoned leaders in effective giving. Depending on your particular needs, we can introduce you to experts on the world’s largest and most neglected issues, bring you into our curated learning sessions on the principles of effective giving, or simply meet you in the trenches and share our current grant opportunities and the reasoning behind them. Everything we do is free of charge and commission.

A sample donor experience:

  1. We introduce you to the principles of effective giving through an individual seminar that explores how giving today can have an outsized effect on the decades, centuries, and millennia to come. We tailor each session to your needs, but frequently cover: the role of philanthropy in moral progress, why we should be optimistic about our ability to make an enormous difference, our radically impartial and reason-driven approach to giving, how we prioritise between different causes, what solutions to global problems might look like, and how best to take action given the current philanthropic landscape.
  2. We introduce you to world-class experts in key cause areas who can answer any questions you have about their field. We are committed to clarity and transparency about our reasoning and research. While some of our philanthropists prefer to be more hands-off, you will be given the opportunity to know as much as we do about why we believe our grants are high-impact, and how that impact is achieved.
  3. At the conclusion of this learning process, which can last anywhere from days to months, we will create a bespoke portfolio of grant recommendations. See our grantmaking page for more details on our process and focus areas. If this portfolio is approved, we do everything necessary to facilitate the transfer of the grant, including due diligence and processing.

For established donors who are already on board with our approach:

  1. If you prefer to cut to the chase, we can immediately begin investigating grants that suit your needs. We share our recommendations and reasoning as soon as possible in written reports.
  2. We also invite donors to contribute directly to the Longview Philanthropy Fund, which allows us to make timely grants to the very best opportunities as they arise. All contributors to the Longview Philanthropy Fund are provided with reports every six months. These reports detail the grants we made during that period, our reasoning for those grants, and major updates to our grantmaking programmes.
Team
As part of our commitment to future generations, many of our senior staff have signed the Giving What We Can Pledge – donating at least 10% of their income to the kinds of projects we recommend to our donors.
Natalie Cargill
Founder & President
Natalie works with philanthropists individually to plan and execute their giving.
Simran Dhaliwal
CEO
Simran coordinates Longview Philanthropy’s research, grantmaking, and advising work.
Gavin Weinberg
Chief of Staff
Gavin works closely with the CEO, focusing on organisational strategy and process.
Katie Hearsum
COO
Katie oversees operations systems and events at Longview.
Carl Robichaud
Nuclear Weapons Policy Programme Director
Carl leads Longview’s programme on nuclear weapons policy.
Matthew Gentzel
Nuclear Weapons Policy Programme Officer
Matthew conducts grant investigations for Longview’s programme on nuclear weapons policy.
Page Hedley
AI Programme Director
Page directs Longview’s grantmaking programme on artificial intelligence (AI).
Aidan O'Gara
AI Programme Officer
Aidan conducts grant investigations in artificial intelligence (AI), with a particular focus on technical AI safety research.
Dr Zach Freitas-Groff
Senior Programme Associate
Zach conducts grant investigations in artificial intelligence (AI).
Suryansh Mehta
Research Communications Officer
Suryansh specialises in communicating cutting-edge research on emerging technologies to Longview's advisees.
Andrew Player
Head of Grants Management & Compliance
Andrew manages Longview's grants system and ensures compliance with regulatory and legal requirements.
Ann Buffington
HR Director
Ann supports human resources at Longview.
Ruth Wallis
Operations Associate
Ruth supports the team at Longview by ensuring the smooth running of internal systems.
Meredith Wald
Events Director
Meredith is jointly responsible for Longview’s event strategy and production.
Claudia Leimgruber
Events Director
Claudia is jointly responsible for Longview’s event strategy and production.
Partners
Giving What We Can
Giving What We Can is a community of effective givers who have pledged to give at least 10% of their income to high-impact cause areas. Longview has partnered with Giving What We Can to create the Emerging Challenges Fund, which funds grants recommended by Longview with input from the Giving What We Can research team.
Introductions to other communities
Longview also introduces our donors to other philanthropic communities wherever valuable. We’re proud that, to date, we have connected three donors with the Giving Pledge, all of whom have gone on to become signatories.
Advisors
Neel Nanda
RESEARCH ENGINEER AT GOOGLE DEEPMIND
Neel leads the mechanistic interpretability team at Google DeepMind, which takes trained neural networks and tries to reverse-engineer the algorithms they have learned.
Prior to this, he worked at Anthropic as a language model interpretability researcher. Neel holds a pure maths degree from Cambridge. He has worked in quantitative finance, as a researcher at the University of Oxford, and at the UC Berkeley Center for Human-Compatible AI.
Ajeya Cotra
SENIOR PROGRAM OFFICER, POTENTIAL RISKS FROM ADVANCED AI, OPEN PHILANTHROPY
Ajeya leads Open Philanthropy’s grantmaking on technical research that could help to clarify and reduce catastrophic risks from advanced AI.
As part of this role, Ajeya conducts analysis on threat models (ways that advanced AI could cause catastrophic harm) and technical agendas (technical work that may help to address these threat models). Previously, she worked on estimating when transformative AI might be developed, estimating the expected returns to funding across cause areas related to global catastrophic risk, and thinking about how worldview diversification could be implemented in budget allocation.
Charlotte Stix
FELLOW AT THE LEVERHULME CENTRE FOR THE FUTURE OF INTELLIGENCE AT CAMBRIDGE UNIVERSITY
Charlotte leads OpenAI’s policy engagements with the EU and is a fellow at the Leverhulme Centre for the Future of Intelligence, University of Cambridge.
Previously, Charlotte coordinated the European Commission's High-Level Expert Group on Artificial Intelligence, oversaw €18 million in projects, and contributed to the formulation of EU-wide AI strategy at the European Commission's Robotics and Artificial Intelligence Unit.
Prof. Kevin M. Esvelt
BIOTECHNOLOGY PROFESSOR AT MIT AND LEADER OF SECUREBIO
Kevin is an associate professor at MIT’s Media Lab, where his research lab focuses on advancing biotechnology safely.
Kevin helped pioneer the development of CRISPR, a powerful new method of genome engineering, and invented its use in gene drive, a way of causing engineered genes to spread through wild species. Recognizing the implications of this technology for the environment, he quickly became a public advocate for safety and transparency in the field of biotechnology to reduce the risk of accidents and misuse. Kevin now leads SecureBio, a nonprofit that incubates projects working to prevent and prepare for extreme pandemics.
Prof. Will MacAskill
PROFESSOR IN PHILOSOPHY AT OXFORD UNIVERSITY
Will is an associate professor in philosophy at the University of Oxford, author of the New York Times bestseller What We Owe the Future, and a co-founder of the Global Priorities Institute and three international nonprofits.
Will’s research focuses on the prioritisation of interventions to improve the long-term future and on the optimal timing of philanthropic action. Upon publication of his bestselling book Doing Good Better, Will was described by Bill Gates as “a data nerd after my own heart”.
Rohin Shah
RESEARCH SCIENTIST AT GOOGLE DEEPMIND
Rohin co-leads the Scalable Alignment team at Google DeepMind, which aims to improve AI safety by ensuring that as AI capabilities scale up, AI systems increasingly “try” to do what the designer intended for them to do.
Previously, he completed his PhD at the Center for Human-Compatible AI at UC Berkeley, where he worked on building AI systems that can learn to assist a human user, even if they don't initially know what the user wants.
Prof. Hilary Greaves
PROFESSOR IN PHILOSOPHY AT OXFORD UNIVERSITY
Hilary is a professor in philosophy at the University of Oxford and the former director of the Global Priorities Institute.
Hilary’s current research focuses on issues of global prioritisation, including evaluating the strength and robustness of the argument for longtermism, and the extent to which reliably influencing the long-run future is tractable given our radical uncertainty about the long-term effects of our actions.
Cate Hall
FOUNDER AND COO AT ALVEA
Cate Hall is a founder and Chief Operating Officer of Alvea, which set the record for the fastest start-up to take a new drug into phase 1 clinical trials.
She is also a founder of Juniper Ventures, a biosecurity and pandemic preparedness incubator. Previously, she was the #1 female poker player in the world; before that, she was a full-time lawyer with a degree from Yale Law School.
Dr Andrew Snyder-Beattie
PROGRAM OFFICER AT OPEN PHILANTHROPY
Andrew leads Open Philanthropy’s programme on biosecurity and pandemic preparedness, which has made grants totalling over $100 million.
Previously, Andrew was the director of research at the University of Oxford’s Future of Humanity Institute. He holds a PhD from the University of Oxford. His dissertation dealt with issues around existential risk, with publications in Nature Scientific Reports and Astrobiology.
Dan Hendrycks
DIRECTOR OF THE CENTER FOR AI SAFETY
Dan is the Founder and Director of the Center for AI Safety.
He received his PhD from UC Berkeley, where he was advised by Dawn Song and Jacob Steinhardt and graduated with nearly 10,000 citations. Dan contributed the GELU activation function (the most widely used activation in state-of-the-art models including BERT, GPT, and Vision Transformers), the out-of-distribution detection baseline, and distribution shift benchmarks. He was named one of Time’s 100 most influential people in AI in 2023, and he advises xAI on safety.
Kit Harris
CHIEF OF STAFF AT METR
Kit is Chief of Staff at METR, a research nonprofit that works on assessing whether cutting-edge AI systems could pose catastrophic risks to society.
Prior to this, Kit led grant investigations in artificial intelligence and biosecurity and laid the groundwork for new lines of work as a Programme Officer at Longview Philanthropy. Earlier in his career, Kit worked as a credit derivatives trader with J.P. Morgan. During that time, he donated the majority of his income to high-impact charities.

Supporters

Our operational costs are fully funded by the philanthropists below. We are deeply grateful for their support – it is what allows us to provide all of our funding recommendations, research, and other work free of charge. While our funders share our mission, they have no influence over any aspect of our work, including our grant recommendations.

Current funders include Open Philanthropy, Martin Crowley, Tom Crowley, Likith Govindaiah, Justin Rockefeller, Rafael Albert and several private philanthropists and foundations. Historical funders include Ben Delo.

These people began warning us of the risk posed by pandemics well before anyone had ever heard of COVID. That risk is higher now than it ever has been. To prevent the next pandemic - which could be far more devastating than any we have yet experienced - philanthropists should pay close attention to what the team at Longview has to say.
Michael Specter
The New Yorker, author of Denialism
Where you put your money as a philanthropist matters, and Longview Philanthropy is excellent at helping donors accomplish as much good as possible with their giving.
Holden Karnofsky
Co-CEO of Open Philanthropy and Co-Founder of GiveWell
Longview focuses its deep expertise on bold, pragmatic efforts to reduce catastrophic biological and nuclear weapons threats. The result is high impact.
Hon. Andy Weber
Former U.S. Assistant Secretary of Defense for Nuclear, Chemical and Biological Defense Programs
Longview Philanthropy is the best resource in the world if you want to do the most good with your giving.
William MacAskill
Professor in Philosophy at the University of Oxford and author of What We Owe the Future
Fascinating exposure to bright minds, intensive learning on massive challenges and opportunity for rigorous thinking about how to make the world a better place.
Stacey Kline
CEO of Otto Intelligence
If all goes well, humanity has a vast future ahead of it — but very little of our philanthropy takes the scale of this future seriously. That’s why I am so excited about Longview Philanthropy. They really get it, and are finding opportunities for lasting impact.
Toby Ord
Senior Research Fellow in Philosophy at the University of Oxford & author of The Precipice