The Russian State Virology Research Center in the town of Koltsovo in Siberia houses one of the largest collections of dangerous viruses anywhere in the world. During the Cold War, the laboratory developed biological weapons and defenses against them, and reportedly stored dangerous strains of smallpox, anthrax and Ebola, among other pathogens.
So it alarmed many people when an explosion ripped through the facility on Monday.
According to Russian independent media, the laboratory was undergoing repairs when a gas cylinder exploded, igniting a 30-square-foot fire that left one worker severely burned. The explosion reportedly shattered glass throughout the building, and the fire spread through the building's ventilation system.
The laboratory is one of only two places in the world that still hold samples of smallpox, a virus that was eradicated in the wild in 1977.
Experts say that under certain circumstances, an explosion could lead to the release of deadly pathogens. "Part of the explosion wave may carry it far from where it was first stored," Joseph Kam, an associate professor at the Stanley Ho Center for Emerging Infectious Diseases at the Chinese University of Hong Kong, told CNN.
That said, procedures for storing deadly pathogens such as smallpox are extremely strict. The local mayor said there was no threat to the population, and a spokesman for the center said no dangerous pathogens were stored in the area where the blast occurred. (Of course, Russian public safety incident reports are not always accurate.)
Will dangerous pathogens escape the lab and infect the general population? Almost certainly not; the vast majority of laboratory accidents, even serious ones, have not infected anyone, and none has yet caused a human pandemic.
But that doesn't mean the incident shouldn't give us pause. Outright explosions are relatively rare, but accidents that could release dangerous pathogens are, in fact, shockingly common – not only in Russia, but also in the US and Europe. From accidental exposures to smallpox and anthrax to mistaken shipments of deadly flu strains, slip-ups with some of the world's most dangerous substances occur hundreds of times each year.
What should we do about it? The answer is certainly not to turn away from virology and pathogen research – research that has saved countless lives. By studying the Ebola virus, for example, researchers were able to develop the current cocktail of Ebola treatments that can reduce the disease from a death sentence to a mild, treatable illness.
But our experience of disasters like the one that just happened in Russia suggests that some types of research – making pathogens more lethal, say – may not be worth the risk. As long as viruses keep escaping the laboratory – through accidents, fires, explosions, equipment malfunctions, and human error – we run the risk of catastrophe. And we could reduce that risk without significantly hindering critical science.
In 1977, the last wild case of smallpox was diagnosed. That moment capped a decade-long campaign to eradicate smallpox – a deadly infectious disease that killed about 30 percent of those infected – from the face of the earth. About 500 million people died of smallpox in the century before it was wiped out.
But in 1978, the disease resurfaced in Birmingham, United Kingdom. Janet Parker was a photographer at Birmingham Medical School. When she developed a horrific rash, doctors initially dismissed it as chickenpox. But Parker got worse and was admitted to the hospital, where testing determined she had smallpox. She died several weeks later.
How did she catch a disease that was supposed to have been eradicated?
It turned out that the building where Parker worked also housed a research laboratory, one of a handful where smallpox was still studied by scientists contributing to the eradication effort. Somehow, smallpox escaped the lab and infected an employee elsewhere in the building. Through sheer luck and a swift response from health authorities, including a quarantine of more than 300 people, the deadly mistake did not spiral into a full-blown pandemic.
Can something like this happen today?
Across the world, biological research laboratories handle deadly pathogens, some with the potential to cause a pandemic. Sometimes researchers make pathogens even more deadly in the course of their research (as Science magazine reported this spring, the US government recently approved two such experiments after keeping them on hold for years).
The same Russian virology lab that just suffered the explosion was the scene of an earlier incident: a scientist died after accidentally infecting herself with Ebola. It took Russia several weeks to acknowledge the event.
Virus research helps us develop treatments and understand how diseases progress. We cannot do without it. And there are many safeguards in place to ensure that this research does not endanger the public. But as a long series of incidents stretching from 1978 to Monday's explosion in Russia shows, those safeguards sometimes fail.
How pathogens can find their way out of the laboratory
The US government regulates research on "select agents and toxins" that pose a serious threat to human health, from bubonic plague to anthrax. There are 66 select agents and toxins regulated under the program and nearly 300 laboratories approved to work with them.
Studying pathogens and toxins lets us develop vaccines, diagnostic tests, and treatments. New biological techniques also enable more controversial forms of research, including making diseases more virulent or deadlier in order to predict how they might mutate in the wild.
So this research can be an important, even critical, part of public health efforts. Unfortunately, the facilities that do such work are also susceptible to a serious problem: human error.
The 1978 smallpox death was, most analyses concluded, caused by negligence – poor laboratory safety procedures and badly designed ventilation. Most of us would like to think we are not so careless today. But terrible accidents – caused by human error, software bugs, maintenance problems, and combinations of all of the above – are hardly a thing of the past, as the Russian incident shows.
In 2014, as the Food and Drug Administration (FDA) cleaned up for a planned move to a new office, hundreds of vials of viral samples were discovered in a cardboard box in the corner of a cold storage room. Six of them turned out to be vials of smallpox. No one had been tracking them; nobody knew they were there. They may have been sitting there since the 1960s.
Panicked scientists put the materials in a box, sealed it with clear packing tape, and carried it to a supervisor's office. (This is not an approved procedure for handling hazardous biological materials.) It was later found that the integrity of one vial had been compromised – thankfully, not one containing a deadly virus.
The 1978 and 2014 incidents, like the disaster in Russia, attracted attention because they involved smallpox, but cases of unintentional exposure to controlled biological agents are actually quite common. Hundreds of incidents occur every year, though not all involve potentially pandemic pathogens.
In 2014, a researcher accidentally contaminated a vial of fairly harmless bird flu with a far deadlier strain. The deadly bird flu was then shipped across the country to a laboratory that was not authorized to handle such a dangerous virus, where it was used in experiments on chickens.
The error was only discovered when the Centers for Disease Control and Prevention (CDC) conducted a broad investigation in the wake of a different mistake – the potential exposure of 75 federal employees to live anthrax, after a lab that was supposed to inactivate anthrax samples accidentally prepared live ones instead.
The CDC's select agent program requires laboratories to immediately report any "theft, loss, release, occupational exposure, or release outside of primary biocontainment barriers" of agents on its watch list. Between 2005 and 2012, the agency received 1,059 release reports – on average, one every few days.
Now, the vast majority of these errors never infected anyone. And while 1,059 is a lot of accidents, it actually reflects a fairly low accident rate – working in a laboratory with controlled biological agents is safe compared to many professions, such as trucking or fishing.
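The "one every few days" figure follows directly from the reported numbers; a quick back-of-the-envelope calculation makes the rate concrete:

```python
# Sanity check on the reported figure: 1,059 release reports filed
# with the CDC over the 2005-2012 span (treated here as 8 years).
reports = 1059
years = 8

per_year = reports / years              # average reports per year
days_between = 365 * years / reports    # average days between reports

print(f"{per_year:.0f} reports per year, roughly one every {days_between:.1f} days")
# → 132 reports per year, roughly one every 2.8 days
```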
But a trucking or fishing accident will, at worst, kill a few dozen people, while an incident with a pandemic pathogen could potentially kill millions. Given the stakes and the worst-case scenarios, it is hard to look at these figures and conclude that our precautions are sufficient.
The challenge of managing pathogens safely
Why is it so hard for labs to avoid such errors?
A review of CDC records of select agent incidents helps answer this question. Mistakes come from many directions. With alarming frequency, people handle live viruses believing they have been deactivated.
Technology that is a critical part of the containment process can fail unexpectedly. It is not that there is a single "problematic" piece of equipment – it is that so many components make up the containment process, and every one of them carries a small risk of failure.
These problems are not unique to the US. In the United Kingdom, a recent investigation found:
more than 40 mishaps at specialist laboratories between June 2015 and July 2017, roughly one every two to three weeks. The blunders included live dengue virus – which kills 20,000 people worldwide every year – being sent out by mistake; personnel handling potentially deadly bacteria and fungi with inadequate protection; and one case in which students at the University of the West of England unwittingly studied live meningitis-causing microbes they believed had been killed by heat treatment.
Severe acute respiratory syndrome, or SARS, caused an outbreak in 2003. It has not recurred in nature since, but there have been six separate incidents in which it escaped from a laboratory: once in Singapore, once in Taiwan, and four times at one laboratory in Beijing.
"These narratives of escaped pathogens have common themes," argues an analysis of containment failures by medical historian Martin Furmanski in the Bulletin of the Atomic Scientists. There are unrecognized technical flaws in standard biocontainment, as shown in the British smallpox case. The first infection, or index case, often occurs in someone who does not work directly with the pathogen that infects them, as in the smallpox and SARS escapes. And poor training of personnel and lax oversight of laboratory procedures negate national and international biosecurity policy efforts, as demonstrated by the SARS and smallpox escapes.
It is easy to see why these problems are hard to solve. More rules for handling pathogens will not help if the people who get infected are usually not the ones handling the pathogens. Adding more federal and international regulations will not help if the rules are not followed consistently. And if there are still unrecognized technical flaws in containment standards, how would we know until an incident made those flaws apparent?
This concern has recently been back in the news because the US government approved research aimed at making certain dangerous flu viruses more transmissible – that is, easier to spread from person to person. The researchers involved want to learn more about transmissibility and virulence to help us better fight these diseases. The laboratories conducting such research have taken unusual precautions to ensure safety and reduce the risk of an outbreak.
But have they reduced it enough? "We imagine that when there is an accident, it is because the ventilation system went down or someone just forgot to do something – that it is the kind of mechanical or human error that can be avoided," said Marc Lipsitch, a professor of epidemiology at Harvard.
However, many recent failures do not fit this pattern. "Rather, people did something they thought was right – they thought they had neutralized a dangerous pathogen by killing it – and in fact they still had a dangerous pathogen, or contamination with a dangerous pathogen," he said. "My concern is not that any of these people will do something stupid or something that reflects poor training. My concern is that there will be human error of the kind that cannot actually be avoided."
Lipsitch does not think we should tighten standards for most studies. He argues that our current approach, even though its error rate will never be zero, strikes a good balance between science and global health on the one hand and safety on the other – at least for most pathogen research. But for the most dangerous pathogens, those with the potential to cause a global pandemic, he argues that this is not the case.
So far, too much biosecurity policy has been reactive – tightening standards only after something goes awry. Considering how badly things can go wrong, that is not good enough. Making our labs safer will be extremely challenging, but when it comes to the riskiest pathogens, we simply have to rise to the challenge.