By Gregory D. Koblentz and Filippa Lentzos
At the end of 2021, the Centers for Disease Control and Prevention (CDC) quietly added “chimeric viruses” — viruses that contain genetic material derived from two or more distinct viruses — to its list of most dangerous pathogens.
The CDC designated this type of research as a “restricted experiment” that requires approval from the secretary of the U.S. Department of Health and Human Services, the executive branch department charged with protecting the health of Americans.
The CDC believes that immediate regulatory oversight of these experiments is essential to protect the public from the potential consequences of a release of these viruses.
Such experiments would add genetic material from the original SARS virus, which first emerged in 2003, to the SARS-CoV-2 strain to create an aggressive “chimeric virus.”
We say it is “possible” that such chimeric coronaviruses have been made because we simply do not know for sure. U.S. labs are not obliged to publicly report, explain or justify such experiments. And this highlights a larger issue.
USAMRIID, once charged w/responding to Soviet Union’s biological weapons program, now conducts research on biological threats including Ebola, Zika, anthrax + plague. Congress in 2021 appropriated $130M for BSL-4 lab which has history of safety violations. https://t.co/SmXGgSW1Hf
— Robert F. Kennedy Jr (@RobertKennedyJr) March 2, 2022
The current approach to overseeing high-risk pathogen research is piecemeal and reactive. It does not foster a larger public debate about whether the potential societal benefits of such research outweigh the very significant risks.
The world lacks a comprehensive approach to biorisk management that incorporates biosafety, biosecurity and oversight of “dual-use research,” research that is intended to provide a clear benefit, but which could easily be misapplied to do harm.
Not the only type of dangerous research
Research with dangerous pathogens is just one of many kinds of life science research with high-risk consequences.
As advances in science teach us more about the healthy functioning of humans, animals and plants, we are also inadvertently learning more about how these healthy functions could potentially be disrupted to deliberately cause harm.
For example, a recent report from the science division of the World Health Organization identified several areas of the life sciences with misuse potential.
These include gene therapy, viral vectors, genome editing, gene drives (a way of changing an entire population of a specific species by altering its genome), synthetic biology and neurobiology.
Risks of repurposing information, methods or technologies from these fields to deliberately cause harm are not adequately addressed by current governance mechanisms and practices. Nor are the challenges posed by the convergence of the life sciences with technologies such as machine learning, artificial intelligence, robotics, nanotechnology and data analytics, which not only open new possibilities for enhancing health but could also make it easier to cause greater harm.
A 2021 survey of biorisk management policies around the world found that most countries do not have comprehensive, national systems for biosafety and biosecurity governance.
Even countries that scored high on biosecurity and biosafety, like the U.S., have implemented these policies poorly in practice, as exemplified by questionable oversight of gain-of-function research on potential pandemic pathogens funded by the National Institutes of Health.
Given the increasing number of countries conducting high-risk life science research and the potential global impact of accidental or deliberate misuse of that science, international standards setting out expectations and responsibilities for safe, secure and responsible research are urgently needed.
We are at a critical juncture: high-risk life science research is steaming ahead despite warning signs that risk assessments are neither comprehensive nor transparent.
We cannot afford overconfidence in the safety and security practices of life scientists, their institutions, funders and publishers.
Knee-jerk regulatory responses to individual experiments, and treating safety, security and dual-use risks in isolation, must stop.
It is high time for a new global architecture for life science governance that takes a comprehensive and coherent approach to biorisk management and that revisits how high-risk life sciences research, funding and publication processes are conducted.
The world must ensure basic and applied life science knowledge, materials and skills are used for peaceful purposes and for the betterment of humans and the health of the planet.
Originally published in The Conversation.
Gregory D. Koblentz is associate professor and director of the Biodefense Graduate Program at George Mason University.
Filippa Lentzos is senior lecturer in Science and International Security and co-director at the Centre for Science & Security Studies at King’s College London.