By Marc Lipsitch
and Tom Inglesby
February 27, 2019
Source: Washington Post
Photo Source: Unsplash, Clay Banks
In 2014, U.S. officials imposed a moratorium on experiments to enhance some of the world’s most lethal viruses by making them transmissible by air, responding to widespread concerns that a lab accident could spark a global pandemic. Most infectious-disease studies pose modest safety risks, but because these proposed experiments were intended to create a highly contagious flu virus that could spread among humans, the government concluded the work should not go on until it could be approved through a specially created, rigorous review process that considered the dangers.
Apparently, the government has decided the research should now move ahead. In the past year, the U.S. government quietly greenlighted funding for two groups of researchers, one in the United States and the other in the Netherlands, to conduct transmission-enhancing experiments on the bird flu virus as they were originally proposed before the moratorium. Amazingly, despite the potential public-health consequences of such work, neither the approval nor the deliberations or judgments that supported it were announced publicly. The government confirmed them only when a reporter learned about them through non-official channels. This lack of transparency is unacceptable. Making decisions to approve potentially dangerous research in secret betrays the government’s responsibility to inform and involve the public when approving endeavors, whether scientific or otherwise, that could put health and lives at risk.
We are two of the hundreds of researchers, medical and public-health professionals, and others who publicly opposed these experiments when they were first announced. In response to these concerns, the government issued a framework in 2017 for special review of “enhanced” pathogens that could become capable of causing a pandemic. Under that framework, reviewers must consider the purported benefits and the potential risks and, before approving the work, determine “that the potential risks as compared to the potential benefits to society are justified.”
The framework also requires that experts in public-health preparedness and response, biosafety, ethics and law, among others, evaluate the work, but it is unclear from the public record if that happened. No description of who reviewed these proposals has been provided. It is not stated what evidence was considered, how competing claims were evaluated or whether there were potential conflicts of interest. This secrecy means we don’t know how these requirements were applied, if at all, to the experiments now funded by the government. A spokesperson from the Department of Health and Human Services told Science magazine that the agency cannot make the reviews public because doing so might reveal proprietary information about the applicants’ plans that could help their competitors. This bureaucratic logic implies that it is more important to maintain the trade secrets of a few prominent scientists than to let citizens — who bear the risk if an accident happens and who fund their work — scrutinize the decisions of public officials about whether these studies are worth the risk.
As researchers, we understand the usual logic for keeping scientific grant reviews confidential. But this is not ordinary science. The overwhelming majority of scientific studies are safe; even the worst imaginable accident, such as an infection of a lab worker or an explosion, is unlikely and would harm only a handful of people. But creating potentially pandemic pathogens creates a risk — albeit a small one — of infecting millions of people with a highly dangerous virus. For this kind of research, there is no justification for keeping risk-benefit deliberations secret. Waiving confidentiality when lives are at stake is a standard practice. Drug makers must disclose many facts about their products before approval in service of protecting public health and safety. We have serious doubts about whether these experiments should be conducted at all. We also suspect that few members of the public would find compelling the rationale that the best way to fight the flu is to create the most contagious, lethal virus possible in a lab. But with deliberations kept behind closed doors, none of us will have the opportunity to understand how the government arrived at these decisions or to judge the rigor and integrity of that process.
Ultimately, public awareness is not enough. The debate in the United States over the past five years took place mainly among a small group of scientists and made only token efforts to inform or engage the wider citizenry. We need public discussion and debate about the risks and benefits of these kinds of experiments. And because viruses do not respect borders, the conversation must move beyond the national level, to coordinate the regulation of dangerous science internationally.
At stake here is the credibility of science, which depends on public support to continue. Science is a powerful driver of human health, well-being and prosperity, and nearly all of it can be done without putting populations at risk. If governments want to fund exceptionally risky science, they should do so openly and in a way that promotes public awareness and engagement.
What disclosures should be required regarding your health, body and biological matter? What protections and transparencies should be in place? How could this impact your health?
If this article was helpful to you, donate to the Shidonna Raven Garden and Cook E-Magazine today. Thank you in advance. Share the wealth of health by sharing this article with 3 of your family or friends today.