There’s a certain glamor associated with science. Many imagine it as a mystical process that happens in faraway, sterile labs packed with 200-IQ geniuses, who bestow their creations and discoveries on the public after years of solitary, intensive contemplation. But this description of the Ivory Tower is, in fact, a myth. Science is not something that happens at a great remove; it happens in our local communities and contexts. And because the way the scientific community conducts itself has far-reaching consequences for all of us, public-policy makers should take a close look at the rules and practices surrounding scientific research here in North Carolina.

North Carolina is the fourth-largest biotech hub in the country and is leading the way in pushing the frontiers of biology. This is an area with incredible potential, but it also comes with risk: as the field grows, so does the number of accidents likely to occur in normal, day-to-day operations. This is especially worrying given the track record of public and private labs conducting risky, irresponsible experiments.

Even after the global pandemic — which very possibly originated from an accidental leak at a Chinese lab — researchers in labs around the world, including the U.S., continue to conduct experiments that enhance the transmissibility and danger of viruses. Researchers at Boston University, for example, recently created a more lethal form of the omicron variant of the coronavirus, an experiment that was rightfully criticized as reckless and dangerous.

Sen. Roger Marshall of Kansas described the lunacy of this well: “It is unconscionable that NIH sponsors this lethal gain of function virus research through Boston University and EcoHealth Alliance in densely populated areas… It has never been more important that additional guardrails on gain-of-function research are established to make sure nothing like this ever happens again.” 

In our own backyard, North Carolina university labs have proposed similarly dangerous genetic engineering experiments in the past. And as ProPublica reported, there has been a series of near misses at a UNC virology lab that were not properly disclosed: “Reports indicate UNC researchers were potentially exposed to lab-created coronaviruses in several incidents since 2015.”

After ProPublica filed a complaint about UNC’s removal of the virus names, NIH said it contacted UNC “to remind them of their obligations.” But UNC still didn’t release the pathogen names and genetic modifications it removed from the records. 

It’s crucial for N.C., as a leader in biotech, to prioritize safe lines of research, disclose risks, and work to minimize accidents. 

Importantly, it’s not that accidents should be punished. Rather, because this is complex work, they should be expected, and should lead to improvements *across the field* that reduce the chance of them happening again. In essence, biolabs need to be “high reliability organizations.” These are institutions that have a risk-conscious mindset and understand that to prevent major disasters, minor accidents and near misses must be treated as cues to build better, more robust systems. A quintessential example is the airline industry — transparency and shared learning from mistakes are among the key reasons fatalities have decreased 95% over the last 25 years.

However, this type of culture emerges only through conscious, deliberate action — fear of bad press and a reluctance to set internal limits deter labs from transparency. And since it’s not in any single lab’s best interest to embrace these principles and be the first to publicly admit a mistake, the field as a whole (and the public, through increased risk) suffers. 

This is where our local representatives are best placed to help. Locals have a level of skin in the game when it comes to accidents that a distant bureaucrat doesn’t — it’s a lot more important to get transparency and to promote a culture of safety when it’s happening in your backyard! Moreover, local action is more likely to strike the right balance between promoting good practices and avoiding cumbersome, ineffective regulation. 

With simple, smart legislation, state legislators can set a framework for labs to follow that would be a great step toward safety and trust: 

  • Require labs that work with health hazards (BSL-2 labs and above, and certainly BSL-3 and BSL-4 labs) to annually commit to following industry best practices, or to state how they intend to reach them. 
  • Require a declared, public biosafety officer responsible for ensuring that these practices are in fact followed. 
  • Mandate the sharing of incident and accident reports with other labs, the scientific community, and the public. 

These requirements are minimally burdensome — for example, best practices dictate that all labs should already keep internal accident reports — and they focus on setting a collective standard so that accidents in a single lab make the entire industry smarter and safer. N.C. can and should lead the way in building this kind of strong culture of responsible science.