Catastrophic Pandemics - Full Report
Report depth
Basic/Intermediate.
Note: This research draws heavily from a post by Arden Koehler and Benjamin Hilton at 80,000 Hours. It may not reflect their current research; see their up-to-date entry here.
Summary
Why focus on this problem?
Imagine a disease as deadly as Ebola but as common as the flu. Could such an awful virus have ever existed?
The answer, history tells us, is yes.
The Black Death killed one-third of all Europeans alive at the time.
The 1918 flu claimed more lives than everyone who died on the battlefields of World War I.
These were some of the worst catastrophes in history. But future pandemics could be even worse, now that gain-of-function research and DNA synthesis technology allow malicious actors to “play God” and engineer deadlier diseases.
In these dark times, Christians working in biosecurity can bring hope, healing, and light. We know that God is the ultimate healer. And He calls Christians, past, present, and future, to witness to Him by healing others, including those at risk of a catastrophic pandemic.
Our Overall View
Sometimes recommend. We think some of our readers should consider working on and/or donating to this issue.
What is our recommendation based on?
Biblical themes:
There are over 200 Bible verses expressing care for the sick and God's concern for healing (e.g. Proverbs 14:31, Psalm 147:3, Jeremiah 30:17)
Christ is frequently portrayed as a healer (e.g. Matt 12:15, Matt 14:14, Luke 4:40, Acts 10:38, Mark 6:5)
The eschaton holds no sadness or death (e.g. Isaiah 65:17-20, Rev 21:4)
Christian tradition:
There is a rich history of Christian institutions caring for and healing the sick.
Many exemplary Christians took care of the sick.
Strong secular evidence:
This problem area is a focus of the University of Oxford’s Future of Humanity Institute, a research institute for understanding potentially catastrophic risks to the longer-term future.
Top Ways To Make An Impact
Work at an agency or think tank doing biosecurity research and policy recommendations.
Pursue one of these paths for a career in biosecurity policy.
Enter the bureaucracy. By working at the FDA or CDC during a pandemic, you could help review key drug or vaccine applications or help roll out timely public health recommendations.
Develop testing, therapeutics, and other technologies and platforms that could be used to quickly test, vaccinate, and treat in the case of a large-scale outbreak. More on this here.
How Important, Neglected, and Solvable Are Catastrophic Pandemics?
Scale
We think that pandemics, especially engineered ones, are one of the greatest risks facing humanity today. A future pandemic could easily cause millions of deaths and large-scale injustice. There is a small but grave chance of a biological existential catastrophe within the next century.1
Neglectedness
Only a few billion dollars are spent each year on preventing pandemics. Very little of that spending is targeted at preventing biological risks that could be existential. Our impression is that this field is under-discussed by Christians.
Solvability
Solutions in the biosecurity space revolve around optimizing measures for pandemic prevention, detection, and response. Scientists and policymakers are working to create conditions in which it is difficult for pandemics to emerge. New technologies (e.g. biosurveillance) can better monitor for pandemics, and better planning can help us respond more effectively to the next pandemic.
Advances In Biotechnology May Pose Catastrophic Risks
COVID-19 has highlighted our vulnerability to worldwide pandemics and revealed weaknesses in our ability to respond in a coordinated and sophisticated way. And historical events like the Black Death and the 1918 flu show that pandemics can be some of the most damaging disasters for humanity.
It is sobering to imagine the potential impact of a pandemic pathogen that is much more contagious than any we’ve seen so far, more deadly, or both.
Unfortunately, the emergence of such a pathogen is not out of the question, particularly in light of recent advances in biotechnology, which have allowed researchers to design and create biological agents much more easily and precisely than was possible before. If the field continues to advance along this trend, over the coming decades it may become possible for someone to create a pathogen that has been engineered to be substantially more contagious than natural pathogens, more deadly, and/or more difficult to address with standard countermeasures.3
At the same time, it may become easier for states or malicious individuals to access these pathogens, and potentially use them as weapons, because the relevant technologies are also becoming more widely available and easier to use.4
Dangerous pathogens engineered for research purposes could also be released accidentally through a failure of lab safety.5
Either scenario could result in a catastrophic ‘engineered pandemic.’ Although making a pathogen as dangerous as possible will not generally be in the interest of states or other actors (in part because it would likely threaten their own forces), a purposefully engineered pandemic pathogen does have the potential to be significantly more deadly and spreadable. Possibilities of accidents, recklessness, and unusual malice suggest we can’t rule out the prospect of a pandemic pathogen being released that could kill a large percentage of the population.
How likely we are to face such a pathogen is a matter of debate. But over the next century, the likelihood doesn’t seem negligible.6
Could an engineered pandemic pose an existential threat to humanity? Again, there is a reasonable debate here. In the past, societies have recovered from pandemics as severe as the Black Death, which killed around one-third to one-half of Europeans.7 But from what we’ve seen, future global catastrophic biological risks (GCBRs) look like some of the larger contributors to existential risk this century.8
Reducing the risk of biological catastrophes by reducing the chances of potential outbreaks or preparing to mitigate their worst effects, therefore, seems very important.9
Clear Actions We Can Take To Reduce These Risks
Work with government, academia, and industry to improve the governance of gain-of-function research involving potential pandemic pathogens, commercial DNA synthesis, and other research and industries that may enable the creation of (or expand access to) particularly dangerous engineered pathogens. At times this may involve careful regulation.
Strengthen international commitments to not develop or deploy biological weapons, e.g. the Biological Weapons Convention.10
Develop broad-spectrum testing, therapeutics, and other technologies and platforms that could be used to quickly test, vaccinate, and treat billions of people in the case of a large-scale, novel outbreak.11
Learn more about these ideas and others in 80,000 Hours’ interview with Dr. Cassidy Nelson.
Most existing work is not aimed at reducing the risks of the worst outcomes.
The broader field of biosecurity and pandemic preparedness has made major contributions to GCBR reduction. Many of the best ways to prepare for more probable but less severe outbreaks will also reduce GCBRs, so many people who are not concerned with GCBRs in particular still do work that is useful for reducing them. For this reason, advancing parts of the broader field — especially in areas like vaccine research or broad-spectrum treatments — can be very valuable, even from the perspective of just trying to reduce the chances or severity of the worst potential outbreaks.
There may be even more valuable opportunities. It seems to be relatively uncommon for people in the broader field of biosecurity and pandemic preparedness to aim their work specifically at reducing GCBRs. Projects that disproportionately reduce GCBRs also seem to receive a relatively small proportion of health security funding.12
Moreover, insofar as more targeted interventions would be useful, the fact that there is comparatively little work targeted toward reducing GCBRs right now suggests that the area is somewhat neglected. This means that if you enter the field of biosecurity and pandemic preparedness aiming to reduce GCBRs, there may be particularly good opportunities to do so that others have not already pursued.
If you do enter the field aiming to reduce GCBRs, it might be easier to work on broader efforts that have more mainstream support first, and then transition to more targeted projects later.
If you are already working in biosecurity and pandemic preparedness (or a related field), this might be a good time to advocate for a greater focus on measures likely to help us with whatever outbreak surprises us next. There may be a greater openness to ideas in this area now, as people reflect on how underprepared we were for COVID-19.14
What Kinds Of Work Are Most Needed?
Biosecurity and pandemic preparedness are multidisciplinary fields. To address these threats effectively, we need at least:
Technical and biological researchers to investigate and develop tools for controlling outbreaks, such as broad-spectrum testing and antivirals. (See examples of research questions.)
Strategic researchers and forecasters to develop plans, such as for how to develop or scale up vaccines quickly.
People in government to pass and implement policies aimed at reducing biological threats.
It’s also important to remember that this area involves information hazards, making it essential for people who can act with discretion to fill these roles. This also means that information security experts may be especially helpful in this area.
What Jobs Are Available?
There are many organisations and agencies that work on reducing biological threats. Here are some that work specifically on GCBRs:
The Center for Health Security (CHS) received a $16 million grant from Open Philanthropy, which sees CHS “as the preeminent U.S. think tank doing policy research and development in the biosecurity and pandemic preparedness (BPP) space.”
The Future of Humanity Institute (FHI) at Oxford University conducts multidisciplinary research on how to ensure a positive long-run future. With the recent hire of Piers Millett, FHI is looking to expand its research and policy functions to reduce catastrophic risks from biotechnology.
The Center for International Security and Cooperation has a biosecurity programme, headed by Megan Palmer, that has been funded by Open Philanthropy.
The Nuclear Threat Initiative is a US non-partisan think tank that works to prevent catastrophic attacks and accidents with nuclear, biological, radiological, chemical, and cyber weapons of mass destruction and disruption. See current vacancies.
Intelligence Advanced Research Projects Activity (IARPA) is a government agency that funds research relevant to the US intelligence community. It has sponsored research on how to improve biosecurity and pandemic preparedness.
The Cambridge Centre for the Study of Existential Risk at Cambridge University houses academics studying both technical and strategic questions related to biosecurity. See current vacancies.
The Bipartisan Commission on Biodefense is a group that analyses the United States’ defence capabilities against biological threats and recommends and lobbies for improvements.
Global Catastrophic Risk Institute is an independent research institute that investigates how to minimise the risks of large-scale catastrophes. See current vacancies.
For opportunities in biotech policy, see this list of opportunities.
You can find particular jobs related to reducing biological threats at these organisations and others on 80,000 Hours’ job board.
Also, see a list of recommended organisations in biorisk reduction.
Want To Support Work In This Area By Donating?
You can also help by donating to well-run organisations that are making important progress on this issue.*
Which organisations should you give to? We haven’t investigated this question ourselves, but some advisors working on biosecurity and pandemic preparedness suggested that two organisations could be particularly good giving opportunities: the Center for Health Security (CHS) and the Nuclear Threat Initiative (NTI).
Open Philanthropy, a foundation that broadly shares some of our values, has made grants to both of these organisations based on assessments of the quality of their work, staff, and structure; their global influence; and how they are likely to use the grant.
However, both organisations might still have room for more funding.15 You can help fill any gaps by ‘topping up’ Open Philanthropy’s grants with your own donations.
To learn more, read Open Philanthropy’s assessment of these organisations and their specific justifications for making NTI a grant of $6 million and CHS a grant of $16 million, both in 2017.
*Our level of certainty regarding the impact of donations to these charities is substantially lower than for our top charities in global poverty and health.
Who Are Some Christians In The CFI Network Working On This?
Adam Winnifrith, a master’s candidate at Oxford University, working on anti-microbial resistance.
Ana Karina Pitol, a research associate at the Liverpool School of Tropical Medicine.
Joan Gass, working at a biosecurity-related startup in Washington D.C.
Katarina Watney, recipient of an Open Philanthropy Tech Policy Fellowship in Washington D.C.
Caleb Watney, cofounder of the Institute for Progress.
Want to work on reducing the risks of the worst biological disasters? We want to help.
Sign up for 1-on-1 mentorship. We’ll pair you with a Christian who can talk to you about how to make an impact in this problem area.
References (from 80,000 Hours’ profile)
For comparisons of the deadliest events in history, see Luke Muehlhauser’s survey, also cited in our full profile on reducing global catastrophic biological risks. Note that three of the top 10 deadliest events are pandemics.↩
We’ve seen a variety of estimates regarding the chances of an existential biological catastrophe:
Ord, The Precipice (2020): 3% by 2120
Sandberg and Bostrom, Global Catastrophic Risks Survey (2008): 2% by 2100
Pamlin and Armstrong, Global Challenges: 12 Risks that Threaten Human Civilisation (2015): 0.0002% by 2115
Fodor, Critical Review of ‘The Precipice’ (2020): 0.0002% by 2120
Millett and Snyder-Beattie, Existential risk and cost-effective biosecurity (2017): 0.00019% (from biowarfare or bioterrorism) per year (assuming this is constant, this is equivalent to 0.02% by 2120).
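The conversion in the last estimate — from a constant annual probability to a cumulative probability over roughly a century — can be checked with a short calculation. This is only an illustration of the arithmetic, using the figures quoted above:

```python
def cumulative_risk(annual_risk: float, years: int) -> float:
    """Probability of at least one occurrence over `years`,
    assuming a constant, independent annual probability."""
    return 1 - (1 - annual_risk) ** years

# 0.00019% per year, expressed as a probability, over ~100 years
annual = 0.00019 / 100
print(f"{cumulative_risk(annual, 100):.4%}")  # ≈ 0.0190%, i.e. roughly 0.02%
```

At probabilities this small, the cumulative risk is nearly just the annual risk times the number of years, which is why 0.00019% per year rounds to about 0.02% per century.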
We’ve looked at the reasoning behind these estimates and are uncertain about which ones we should most believe. Overall, we think the risk is around 0.1%, and very likely to be greater than 0.01%, but we haven’t thought about this in detail.↩
Gene sequencing, editing, and synthesis are all now possible and are becoming increasingly systematised, making it feasible in principle to engineer and produce biological agents in a way not too dissimilar to how we design and produce computers or other products. This may allow people to design and create pathogens that combine properties of natural pathogens or ones with wholly new features. (Read more)
Scientists are also investigating what makes pathogens more or less deadly and contagious. This improved understanding may help us better prevent and mitigate outbreaks. But it also means that the information required to design more dangerous pathogens is increasingly available.
All the technologies involved here have important medical uses, in addition to hazards. For example, gene sequencing technology may also be essential to help us quickly diagnose new diseases. (See an example of such an advance.) Properly handling these advances may therefore involve a delicate balancing act.↩
In past decades, genetic engineering could be described as a ‘craft’ that involved a lot of uncertainty, tacit knowledge, and trial-and-error. The ambition of some synthetic biologists (1, 2, 3, and ‘BioBricks’) has been to make this process more systematic and modular, which would allow more people with less extensive experience to create biological material reliably and economically, more like how we manufacture other products.
There has been steady progress on this front. Innovations in the last decades have made it easier to design and manufacture genetic material. Commercial synthesis is increasingly available and economical. Increasingly large libraries of genetic sequences are available, and sequencing costs are decreasing. Some steps have been taken to manage the risks from this availability, such as screening commercial synthesis orders, though more will need to be done as the industry continues to advance.
Within a year or two of their invention, many cutting-edge advances are accessible to university undergraduates who compete in the International Genetically Engineered Machine (iGEM) competition. The DIY bio movement demonstrates that some advances are accessible to people with no formal training at all (similarly to how 3D printing has increased access to some forms of manufacturing).
Again, it’s worth emphasising that there are benefits to these developments as well as risks. More people being able to sequence and synthesise genetic material means faster progress in a range of areas, such as innovative new drug therapies.↩
Why would well-intentioned researchers deliberately create unnaturally dangerous pathogens? One purpose is ‘gain-of-function research,’ in which scientists try to increase the contagiousness or virulence of a pathogen in order to better understand its characteristics, including whether particular mutations should be treated as a warning sign if they occur in nature. The result is typically a slightly more dangerous pathogen that’s still well within the bounds of what virologists work with on a day-to-day basis. However, when the research is performed on potential pandemic pathogens, or particularly virulent ones, the potential to create something unnaturally dangerous becomes a concern. The most publicly prominent example of gain-of-function research was a 2011 experiment to increase the transmissibility of avian flu (H5N1) in mammals. The experiment was controversial and triggered a review by the National Science Advisory Board for Biosecurity.↩
In a 2008 survey, the median expert estimated that there was a 10% chance of 1 billion people dying in an engineered pandemic before 2100, and a 2% chance of an engineered pandemic causing extinction. The authors stress that for various reasons these estimates must be taken with a grain of salt. Nonetheless, arguments like the ones presented here suggest these numbers are relatively plausible.↩
Luke Muehlhauser’s writeup on the Industrial Revolution, which also discusses some of the deadliest events in history, reviews the evidence on the Black Death as well as other outbreaks. Muehlhauser’s summary: “The most common view seems to be that about 1/3 of Europe perished in the Black Death, starting from a population of 75-80 million. However, the range of credible-looking estimates is 25%-60%.” See footnote 1 for a table of Muehlhauser’s estimates of world fatalities due to different historical events.↩
In The Precipice: Existential Risk and the Future of Humanity, Toby Ord estimated the chance of human extinction in the next 100 years from a biological agent to be around 1 in 30. Hear Toby discuss this and other potential risks on our podcast episode about The Precipice. See also a critical review of The Precipice, which argues that the risk is much lower.↩
Here we are presenting the case for working on GCBRs. But of course whether this is the area you should focus on in your career depends (among other things) on your fit for the area as well as how it compares to others you could focus on instead. For example, as Greg Lewis points out, working to increase the chances of safe and beneficial AI seems orders of magnitude more neglected than work on GCBRs, and seems at least as important for safeguarding the future of humanity. This suggests that if your circumstances and fit are equally good for both areas, working to ensure safe and beneficial AI is likely to be the better choice between the two. (Of course, there are many other potential focus areas that might be even better for you.)↩
The only existing agreement — the Biological Weapons Convention (BWC) — lacks resources and has no verification or enforcement power. (Learn more about weaknesses of the BWC under ‘State actors and the Biological Weapons Convention‘ in our full profile on GCBRs.)↩
Much has been written on specific technologies. For example, Broad-Spectrum Antiviral Agents: A Crucial Pandemic Tool (2019). A number of the podcast episodes listed above discuss some of the most promising ideas, as do many of the papers in the “Other resources” section above.↩
Greg Lewis estimates that a quality-adjusted ~$1 billion USD is spent annually on GCBR reduction. Most of this comes from work that is not explicitly targeted at GCBRs, but is rather disproportionately useful for reducing them. The US budget for health security in general is ~$14 billion. Worldwide, the budget is probably something like double or triple that, so spending that’s particularly helpful for GCBR reduction is probably just a few percent of the total; the spending for explicit GCBR reduction would be much less. See the relevant section of our GCBR profile, including footnote 21.↩
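The “few percent” claim above follows directly from the figures quoted. A quick back-of-envelope check, using only the estimates in the footnote (not independent data):

```python
gcbr_spend = 1e9              # quality-adjusted USD/year on GCBR reduction (Lewis's estimate)
us_health_security = 14e9     # approximate US health security budget, USD/year

# Worldwide spending is estimated at double or triple the US figure
shares = {m: gcbr_spend / (us_health_security * m) for m in (2, 3)}
for m, share in shares.items():
    print(f"worldwide = {m}x US budget -> GCBR share ≈ {share:.1%}")
# 2x -> ≈ 3.6%, 3x -> ≈ 2.4%
```

Either multiplier puts GCBR-relevant spending at roughly 2–4% of worldwide health security spending, consistent with “just a few percent.”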
Why might people focus less on work targeted toward GCBRs, even though they are the risks of the worst catastrophes? One answer is that people with the power to allocate resources are not sufficiently aware of GCBRs, or think the risks are extremely low. Another answer is short-term thinking: since the technologies most worrying for GCBRs haven’t yet been fully developed, it’s very unlikely we’ll see a biological catastrophe in the next few years; people subject to political and other pressures to prioritise the near future may therefore be less inclined to focus on them. Finally, GCBRs are a type of ‘public good’ problem, so we generally have reason to expect them to be somewhat neglected. Read more.↩
How useful more targeted work is for reducing GCBRs vs growing the broader field of biosecurity and pandemic preparedness is a matter of debate. In a recent podcast episode, Marc Lipsitch argued that the best way to address GCBRs may be to simply build up the broader field, because of the substantial overlap between biological threats of all sizes and the tools needed to combat them. In our writeup on reducing GCBRs, Greg Lewis suggested that this strategy, which he calls “buying the index” of conventional biosecurity, would probably be less effective than trying to complement the existing portfolio with work that’s particularly important for reducing GCBRs.
We suspect Greg’s view is closer to the truth, though it’s not obvious, and Greg also expresses uncertainty on the matter. We have a general heuristic: all else equal, a more targeted intervention — one whose primary goal is to make progress on a smaller number of issues — is likely to have a bigger effect on those issues than a less targeted intervention that has more goals. In pursuing the less targeted intervention, you can face more tradeoffs between the different goals, which can reduce your impact on each one considered separately.
Furthermore, with regard to this particular case: the people who have shaped the broader field of biosecurity and pandemic preparedness seem to have generally been optimising for reducing the risks of smaller, more likely pandemic outbreaks. It would be surprising if in doing so they also optimised the field for reducing GCBRs, such that just building the field, in general, was the best thing someone could do to reduce GCBRs.
That said, because there’s already a lot of support for the kinds of interventions favoured by the broader field, including interventions that do reduce GCBRs, it could in some cases be higher impact to expand the field or in some other way make it more effective at achieving its goals. For example, if you could manage to expand the entire field by 1% in terms of funding and labour, that might easily be better than a more targeted project aimed at reducing GCBRs.
Indeed, Greg also guesses that it’s sometimes better to complement the existing biosecurity portfolio with work that’s especially useful for reducing GCBRs (but which is also helpful for addressing threats of less severe events) than it would be to explicitly target reducing GCBRs. This suggests that the argument for targeting interventions can be taken too far.↩
How to prevent the next unknown would-be pandemic is, of course, an existing field of research. The COVID-19 pandemic may raise this field’s profile, or it might lower it. This depends on whether enough people are inspired to focus on preventing anything like COVID-19 (or worse) from happening in the future, or if the large majority of people prioritise combating the current crisis. The publication of this New York Times Magazine article suggests that there is mainstream interest in pursuing broader prevention efforts.↩