In early 2012, federal officials summoned Michael Imperiale from his home in Ann Arbor, Michigan, to a large conference room in Bethesda, Maryland. There, they handed the virologist drafts of two scientific papers. A foreign government had deemed one draft’s contents so risky that it could not be sent via the postal service or attached to an email.
The drafts detailed attempts to alter a lethal avian influenza virus, potentially granting it the ability to spread among humans. Such work, according to the U.S. officials who had funded it, was vital for preparing for a potential flu pandemic. But some scientists wondered whether the research itself could spark a cataclysm. Might someone read the papers, which contained details of how the pathogens had been engineered, and use them as blueprints for bioterrorism?
Months before, Imperiale and more than a dozen colleagues had recommended that earlier drafts of the papers be published with some details redacted. But that halfway option wasn’t going to fly. Now they needed to make a choice: Was it worthwhile to publish the papers in full? Or should the manuscripts, with their potential for misuse, not be published at all?
The meeting helped launch a new era of debate over pathogen research. In the ensuing years, the field would be forced to reckon with fundamental questions about virology experiments that enhance pathogens so they become more deadly or more transmissible. The risks of the work, which is sometimes called gain of function, are difficult to calculate, and scientists have mixed views about its potential benefits.
Those debates have grown more public — and more bitter — since the beginning of the Covid-19 pandemic, which many experts acknowledge might have resulted from an accident at a Chinese laboratory with a history of biosafety issues. The lab’s researchers conducted experiments on coronaviruses, though whether any of their work constituted gain-of-function research is a matter of debate. (Virologists use the term to describe a broad range of experiments, many of which are low-risk, so some experts prefer to use terms like “gain-of-function research of concern” or “enhanced potential pandemic pathogen” research to refer to the riskiest subset of such work.)
The controversy has affected the field of virology writ large, scientists told Undark. Even before Covid-19, changing policies had cast a wide net, delaying or discouraging even lower-risk influenza and coronavirus studies. And for those whose work might be considered gain of function, the current review system is slow, opaque, and restrictive.
Concerns about Covid-19’s origins have brought calls for additional oversight of U.S. labs. That effort seems misplaced to some researchers, effectively hamstringing U.S. science in response to alleged biosafety lapses thousands of miles away. If additional rules are not carefully calibrated, they say, the country could wind up less prepared to fight future pandemics. Gigi Gronvall, a biosecurity expert at the Johns Hopkins Center for Health Security, characterized the backlash as a situation of muddled priorities: policymakers, she argued, are fretting about research that poses far less danger than the viruses evolving all around us in “nature’s gigantic lab.”
Still, other experts remain deeply skeptical of the status quo. Among them, some view the handling of Covid origins as exhibit A: Not only was there an unwarranted rush to rule out a lab leak, they say, but the nation’s top scientific leaders overstepped their roles by secretly intervening in the debate, possibly minimizing concerns about laboratory safety.
Amid lingering concerns about Covid’s origins, the Biden administration put forth a new policy in May 2024 to strengthen oversight of high-risk biological research. Congress, meanwhile, has weighed in with a bill that would strip the National Institutes of Health of its authority to fund gain-of-function research and transfer that power to an independent review panel, appointed by the president. The legislation passed out of committee with bipartisan support in late September. It is now awaiting a vote on the Senate floor. The incoming Donald Trump administration, some experts predict, is likely to support the bill.
Rep. Jim Jordan speaks alongside Rep. Steve Scalise and Rep. James Comer during a Republican-led forum on the origins of the Covid-19 virus in June 2021. Concerns about Covid-19’s origins have brought calls for additional oversight of U.S. labs, and the controversy has affected the field of virology writ large. Visual: Kevin Dietsch/Getty Images
Such efforts have brought new public attention to some of the basic questions Imperiale and his colleagues agonized over in that conference room 12 years ago: How should society weigh the costs of engineering pathogens in the pursuit of public health goals?
That day, a number of high-ranking people, including then-NIH director Francis Collins and then-senior NIH official Anthony Fauci, whose office had funded the research, argued in favor of publishing the papers. Imperiale remembers listening intently and weighing the questions: Would publishing the papers really pose a risk? Would the data help public health officials monitor for dangerous pathogens out in nature?
Ultimately, the group voted to publish both papers.
Imperiale said he found the public health argument behind sharing the information from the studies compelling. Still, the decision took a toll. “I remember going back to the airport and just being totally exhausted,” he said during a recent conversation. He found an empty area of the terminal, sat down, and experienced a kind of emotional crash. “It just felt like such a monumental decision. Because you think, what if something bad does happen?”
To prevent the spread of microbes, researchers in U.S. laboratories follow extensive biosafety protocols. They’re also required to perform background checks on laboratory workers and employ other security measures before handling certain pathogens. Just a small number of laboratories perform gain-of-function research of concern, said Imperiale, perhaps a dozen worldwide and a handful in the U.S., where it has historically been funded by the National Institutes of Health.
The same year as Imperiale’s trip to Bethesda, the U.S. government released new rules for government-funded dual-use research of concern, meaning research that could cause harm if misapplied.
Despite those protocols, some scientists remained uneasy. Lone Simonsen, director of the PandemiX Center in Denmark, was working in the U.S. when news about the gain-of-function studies started to spread. She was immediately concerned. Scientists like to think of themselves as the good guys, she said. “But are we really?” she wondered. “What’s our field exactly producing out of all this?”
Around the same time, Marc Lipsitch, an epidemiologist at the Harvard T.H. Chan School of Public Health, began to question whether the risks of such research outweighed the benefits. He was preparing a lecture on a drug-resistant flu virus that had swept across the globe a few years prior. As it happened, said Lipsitch, that fast-spreading flu virus contained a mutation that virologists had previously studied in the lab. When the virologists inserted that mutation into a common flu strain, the mutation crippled the virus. But as that common flu strain continued to evolve out in the world, the single mutation that had weakened the lab virus came to confer an advantage in nature.
A colorized electron microscope image of H1N1 influenza virus particles. When a drug-resistant H1N1 flu virus emerged in the late 2000s, virologists found the virus contained a mutation that had previously been studied in a lab. Visual: NIAID/NIH
There’s no guarantee that a mutation that behaves one way in one flu strain will behave the same way in a different flu strain, said Lipsitch. And because influenza viruses evolve quickly, any laboratory findings may be obsolete, or even misleading, by the time they are published.
In his view, this had implications for gain-of-function studies. Not only did they risk sparking a devastating pandemic — their purported benefits were uncertain.
In 2014, Lipsitch co-authored a paper in PLOS Medicine, using data on how often laboratory pathogens sicken the people who work with them. With each study, there is a small but real risk that an exposure could “initiate a chain of transmission,” the authors wrote, and perhaps even spark a pandemic. The paper laid out alternative approaches that, according to the authors, could be used to obtain the information that scientists wanted.
Some prominent virologists argued that Lipsitch and his collaborator had misinterpreted the infection data, and, in the process, overstated the research’s risks. Yoshihiro Kawaoka, one of the scientists behind the controversial avian influenza studies, said that this work had already provided actionable data by, among other things, demonstrating that countries should maintain stockpiles of avian influenza vaccines. He also disputed the notion that alternative approaches were sufficient.
Lipsitch was undeterred. In July of that year, he helped organize a gathering in Cambridge, Massachusetts, of roughly 20 people, most of them academics, to discuss biosafety and advocate for change.
The timing was good. In June, dozens of employees at a U.S. Centers for Disease Control and Prevention laboratory had been potentially exposed to anthrax. Then, on July 1, six vials of the virus that causes smallpox were discovered in an unsecured storage room on the NIH campus. Soon after that, news broke that the CDC had accidentally shipped out a dangerous flu virus. These incidents “raise serious and troubling questions,” said the then-CDC director at a July 11 press conference.
As all this was unfolding, the longest-serving members of the National Science Advisory Board for Biosecurity, which had made the decision to publish the avian influenza papers, received unexpected emails announcing that they were being relieved of their duties.
A “huge luck element” helped put a spotlight on the Cambridge meeting, said Arturo Casadevall, a physician and microbiologist at the Johns Hopkins Bloomberg School of Public Health who was among those cut from the NSABB. In an interview with Undark, he described reporters calling during the meeting, trying to figure out what had just happened.
That day, the group discussed what could be done. Some wanted to call on the U.S. government to issue a moratorium on gain-of-function experiments. Others thought a moratorium went too far and preferred simply to request that all future research undergo a rigorous risk-benefit analysis. In the end, the language of the two-paragraph consensus statement centered on the need for risk-benefit assessment. Hundreds of scientists signed on.
Pandemic or otherwise dangerous pathogens — and the research conducted on them — have been the focus of growing public attention. Visual: Undark
Many virologists were upset by the suggestion that their work was unsafe. In a July 2014 episode of This Week in Virology, a podcast sponsored by the American Society for Microbiology, some of the co-hosts tore into the Cambridge Working Group.
Vincent Racaniello, a virologist at Columbia University, framed the uncertainty around gain of function’s benefits as a positive: Sometimes scientific research yields fruitful yet unexpected information. “You cannot tell scientists what to work on,” he said, “because you never know [where] the next good result is coming from.”
Dickson Despommier, now an emeritus professor of public health and microbiology at Columbia University, wondered aloud if his colleagues on the Cambridge Working Group were “a reactionary rightist group that I would classify as McCarthy-like in terms of their jaundiced view of the way science is carried out.”
During the discussion, Racaniello revealed plans to create a website called Scientists for Science where “we will just say we don’t agree with the fact that we need a risk-benefit.” Soon, a six-paragraph statement appeared online, expressing confidence in biosafety standards and calling for open and constructive debate. It, too, garnered hundreds of signatures.
A few months later, the Obama administration issued a moratorium on new federal funding of gain-of-function studies on influenza viruses and on certain coronaviruses. The goal, according to the government, was to assess the research’s risks and benefits and then develop a new policy that could be used to guide future funding.
The news was not universally embraced.
On social media, Racaniello broadcast a sentiment shared by others in the virology community: “The Administration thinks what we need right now is to STOP research on deadly pathogens? WTF?”
The events of 2014 had far-reaching effects, many virologists say, that stifled important research.
In the wake of the moratorium, a slew of influenza projects were put on hold, including some that were later determined to be low-risk. Around the same time, the White House urged federal agencies to take immediate and long-term steps to prevent future incidents.
The new scrutiny could be tedious — and sometimes frightening.
Seema Lakdawala was a postdoctoral researcher at the NIH in the summer of 2014, when her laboratory undertook a comprehensive inventory of its hundreds of thousands of influenza samples. Each day for about a month, Lakdawala said, she and her colleagues scoured the storage freezers, inspecting every sample to ensure nothing dangerous was in a spot where it shouldn’t be.
The work was accompanied by what she characterized as an air of suspicion that spooked biosafety officers and research institutions. It started to feel like “a bit of a witch hunt,” said Lakdawala.
In 2015, when she started her own laboratory at the University of Pittsburgh, she curtailed her work with dangerous pathogens. Still, Lakdawala ran into trouble: One day in 2017, she received a shipment of avian influenza viruses that had been attenuated, or weakened, so they’d be safer for use in the lab. But the university’s biosafety officers, said Lakdawala, insisted the viruses were select agents, meaning they’re subject to special regulations. They brought the police to her lab to confiscate the viruses. Soon after, she got a call from the FBI. She decided to hire a lawyer, and in the end, she said, the university acknowledged the viruses weren’t dangerous. (In a statement to Undark, the university said that it had “followed all appropriate protocols for reviewing and securing biological materials,” adding that it had consulted with federal regulators.)
Lakdawala, now a virologist at the Emory University School of Medicine, characterized her experience as “the extreme example of what happens when we keep telling biosafety people that we have to be scared of bad actors.”
Other virologists had their research paused. Christopher Brooke, a virologist at the University of Illinois Urbana-Champaign, said he had a grant put on hold for a few months, even though the work was with a strain of influenza that he describes as low-risk. And Andrew Pekosz, a virologist at the Johns Hopkins Bloomberg School of Public Health, had all of his laboratory work on seasonal influenza viruses put on hold. He was not proposing to create a new virus, he said, but suspects that grant reviewers may have erroneously understood the work to be dangerous.
Influenza researchers must also navigate “soft and unpredictable” changes in research culture, said Anice Lowen, a virologist at the Emory University School of Medicine. These changes are driven not by formal policy, but by shifts in how rules are implemented and in what funders and peer reviewers expect, she said. The unpredictability can be particularly hard on trainees who start down a particular research path and then run into roadblocks two or three years down the road, when enforcement shifts on account of political changes, or just some bad press about pathogen research. All of a sudden, it’s unclear if the student’s work will be publishable; if it is published, the student might well be concerned about personal attacks.
Lowen and other scientists stressed that biosafety protocols are an essential feature of pathogen research. If something were to go wrong, after all, virologists would often be the first people affected. But ever-changing and poorly defined rules, some argued, may do little to improve safety while deterring people from important research.
The field of virology has become increasingly risk-averse, said Lakdawala, with scientists avoiding experiments that may be critically important to public health.
“I’m not putting myself out there,” she said, recalling her own experiences. “I don’t need to hire a lawyer or have the FBI call me. Absolutely not.”
In 2017, the funding moratorium on gain-of-function research was lifted, allowing proposals to proceed if they received the green light from an anonymous review committee within the Department of Health and Human Services.
According to a scientist whose work was recently evaluated by that committee, the review process can be confusing and almost impossible to navigate. The scientist spoke on the condition of anonymity because of the potential for public backlash should their work be misunderstood or misrepresented.
Their experience, taken as a whole, offers a rare glimpse into a mysterious corner of American science policy.
In 2021, the scientist said, their team identified a hotspot: an area of the SARS-CoV-2 genome that tended to acquire mutations as the virus evolved. These mutations likely contributed to the virus’s ability to thrive among humans, the researcher said. And a better understanding of the hotspot could help with surveillance or with the development of vaccines and therapeutics.
After receiving approval from a university biosafety committee, the team used an approach called reverse genetics to build two types of viruses. The first resembled the original Wuhan virus, but for safety purposes, it was slightly weakened. The research team then took this hobbled Wuhan virus and used it to create a second type of virus by inserting the hotspot mutations.
In Petri dishes, the virus with the mutations replicated better, and in animal models, it caused more disease. (Neither virus replicated as well as the original Wuhan strain, said the researcher.)
In February 2022, the scientific team applied for funding from NIH to study why the mutations made the virus better at spreading. A panel of experts determined that the proposal was worth funding, but they said it first needed to go through the anonymous review process for potential gain-of-function research, which is often referred to as P3CO, short for Potential Pandemic Pathogen Care and Oversight.
The research team disagreed that their proposal called for the special review, arguing that the weakened Wuhan virus couldn’t compete with the strains of SARS-CoV-2 now circulating freely around the world. But NIH overruled them.
The process was slow and confusing, the researcher said. The P3CO review committee requested a multitude of documents related to biosafety, but beyond this, there was no template for the team to follow: “We had to craft what we thought they wanted.”
Ferrets are often used as animal models in influenza and Covid research, including in gain-of-function studies. Visual: Moment/Getty Images
In January 2024, after hundreds of hours of work and more than a year of back-and-forth with government officials, the team submitted their materials. After further exchanges, they learned in July that their proposal had been rejected by the committee, in part because so much time had passed and it wasn’t clear the results would still be useful.
The ruling baffled the researcher. “We’re not trying to make the virus transmit better. We’re not trying to make it more deadly,” they said. “These are mutations that exist in nature. We’re trying to figure out what they do.”
(A Health and Human Services spokesperson, Spencer Pretecrum, did not address specific concerns the researcher raised about the P3CO review process. In a statement that included a link to the P3CO framework, he wrote that the review process “weighs the potential public health benefits against the potential biosafety and biosecurity risks, and the appropriate risk mitigation strategies.”)
At the request of Undark, Imperiale reviewed a description of the proposal and the review timeline, as the researcher had laid them out in their interview. Imperiale said he would need more information to weigh in on whether the work truly crossed the line into research of concern. He was clear, though, that the timeline had been too slow: “You can’t have this situation where they say, ‘Okay, we need to review it,’ and they get back to you, like, three months later.”
At this point, the researcher said, scientists doing this kind of work “have no idea what’s allowed and what’s not.” This regulatory uncertainty, coupled with drawn-out timelines, creates a huge disincentive. “But these rules aren’t everywhere,” the scientist noted. And if U.S.-funded research teams aren’t going to do the work, wondered the scientist, then who is? And what rules are they following?
In early 2020, as Covid-19 raced around the world, some pathogen researchers privately wondered whether a laboratory leak had sparked the pandemic. The Wuhan Institute of Virology, just miles from the market linked to many of the first reported Covid-19 cases, had used NIH funds to do work on coronaviruses, and American biosafety experts had helped train its personnel. Both Fauci and Collins participated in discussions that acknowledged the possibility of a lab leak, but those discussions were not made public until more than a year later, when they were released in the wake of a Freedom of Information Act lawsuit.
For some longtime critics of laboratory safety practices, it has come to look as if virologists and their funders were resisting both self-reflection and outside input. “Even if there was a hint that their work could cause a calamity like Covid — a global deadly pandemic like Covid — they should at least look inward and say, ‘Could we have been responsible for this? What is our role in this?’ But no, they are not,” said Laura Kahn, who spent 15 years as a research scholar at Princeton University’s Program on Science and Global Security and whose book on Covid-19 was recently published by Johns Hopkins University Press.
Instead, Kahn said, prominent virologists made the lab leak hypothesis taboo. “You had some of these really distinguished scientists writing letters in distinguished journals saying that even the hint that it might be a lab leak is appalling.”
Those letters failed to deflect concerns. Instead, by 2021, perceptions of a cover-up helped to launch debates about laboratory safety squarely into the messy realm of partisan politics.
By early 2021, more experts were saying publicly that Covid-19 may have emerged from a laboratory accident in China. Towards the end of 2021, Sen. Roger Marshall, a Kansas Republican, introduced a bill to further restrict gain-of-function research. In a press release peppered with references to Anthony Fauci and “Communist China,” Marshall asserted that the U.S. government “should not provide another dime in funding” for such experiments until more was known about Covid’s origins.
Over the next two years, Republicans would attempt to curtail gain-of-function studies through federal bills and through state-level legislation in Texas; Wisconsin; and Florida, where it passed into law in 2023.
This year, Democrats joined their Republican colleagues in expressing concern about the recipients of some NIH funding. In a May hearing held by the House Committee on Oversight and Accountability, lawmakers grilled Peter Daszak, head of the nonprofit EcoHealth Alliance, which had used NIH dollars to subcontract coronavirus research at the Wuhan Institute of Virology. While Democrats stressed that the committee’s investigation had uncovered no evidence that EcoHealth research actually sparked the pandemic, they raised pointed questions about the group’s failure to meet reporting requirements.
Two weeks after the hearing, the Biden administration suspended EcoHealth’s active NIH grants and proposed barring the group from future federal research. A top aide to Anthony Fauci was also put on administrative leave after the subcommittee uncovered documents showing that the aide had used a private email account to communicate with Daszak and other proponents of the zoonotic spillover hypothesis, in an attempt to avoid FOIA requests.
EcoHealth Alliance President Peter Daszak attends a House Committee on Oversight and Accountability hearing in May. Two weeks later, the Biden administration suspended EcoHealth’s active NIH grants and proposed barring the group from future federal research. Visual: Andrew Harnik/Getty Images
Perhaps no group epitomizes the fiery tone on Capitol Hill better than Biosafety Now, which was launched in 2023 to advocate for strict, transparent, and independent regulation of high-risk pathogen research and for a rigorous investigation into the origins of Covid-19. Biosafety Now counts more than 20 scientists and science policy experts among its leaders and advisers, including Kahn and Richard Ebright, a Rutgers University microbiologist and longtime critic of U.S. biosafety policy. The group has lobbied members of Congress to regulate pathogen research, and it played a prominent role in promoting the Wisconsin bill.
Bryce Nickels, a molecular biologist at Rutgers University who co-founded the group, said their approach differs from that of people like Harvard’s Marc Lipsitch, who have taken a more measured stance. “You have had for years a group of insiders that are deciding policies and trying to make policies where the goal is to not hinder research,” he said. As a result, even people who advocate for stronger biosafety standards end up promoting proposals that aren’t particularly strict.
Nickels characterized his view as impolite, and he readily volunteered that many academics and biosecurity professionals oppose his preferred policies, which include strict fines for researchers who don’t comply with biosafety standards. (Nickels’ and Ebright’s social media tactics, including ad hominem attacks against other academics, have raised eyebrows, and the two were the subject of a March conduct complaint to Rutgers. Nickels disputed the complaint’s specific allegations in an X thread, and he told Undark that scientists have lobbed abrasive comments his way, too.)
Nickels (who is leaving Biosafety Now at the end of the year) defended his rhetorical approach by invoking the high stakes of the issue. “Many of us believe that there was a pandemic that came out of research that killed millions of people, and nothing has changed in this area for years,” he said. “So of course we’re yelling.”
As lawmakers and others have paid more attention to pathogen research, the response from virologists has at times echoed Racaniello’s and Despommier’s comments from years earlier: that outsiders were unfairly attacking the field by painting a skewed picture of virology as the Wild West.
In early 2023, the Journal of Virology published a commentary co-authored by dozens of virologists. During the Covid-19 pandemic, virologists had contributed on many fronts, yet their field was in the spotlight due to lingering questions about the origin of SARS-CoV-2, the authors wrote. Congressional hearings, they said, threatened to “add fuel to an anti-science, fear-based movement,” while restrictive legislation could hamper the country’s ability to respond to future viral threats.
Felicia Goodrum, a virologist at the University of Arizona College of Medicine and a lead author of the commentary, said many of her colleagues have grown hesitant to speak publicly on these issues. “It was mostly the fear of retaliation, and harassment and attacks,” she said. Particularly on social media, things can get ugly: “Lots of slandering, name calling, people saying that you should be dead.”
On top of this, she said, Anthony Fauci and other scientists had been treated very poorly when giving testimony in Congress: “It’s just really quite unsettling to see how scientists are being attacked for their work and accused of criminal intent.”
In Goodrum’s view, the recent bills are premised on a flawed understanding of Covid’s origins, specifically the notion that the virus leaked from a laboratory. “The problem with that is while that is a plausible hypothesis, it is not probable, especially now, given the data that we have supporting a zoonotic origin,” she said.
Still, bipartisan legislation has moved forward. On Sept. 25, the Risky Research Review Act passed out of the Senate Homeland Security & Governmental Affairs Committee. The bill would establish an independent oversight board whose members are appointed by the president and whose approval would be necessary for any federally funded research that could enhance certain pathogens’ ability to spread and cause disease.
The legislation, which was introduced and championed by Sen. Rand Paul, a Kentucky Republican, received support from 14 of the 15 committee members and was characterized as “a good compromise” by the committee chair, Michigan Democrat Gary Peters.
Sen. Rand Paul questions Anthony Fauci during a Senate subcommittee hearing about the government’s response to the Covid-19 pandemic in November 2021. Paul later introduced the Risky Research Review Act, which passed out of committee with bipartisan support in late September. The legislation would strip the NIH of its authority to fund gain-of-function research and transfer that power to an independent review panel, appointed by the president. Visual: Chip Somodevilla/Getty Images
A Paul spokesperson told Undark that “we consulted with dozens of scientists across the country, some who have commented publicly and some who have not” in developing the legislation. Nickels told Undark that he and Ebright were among those scientists. Additionally, several researchers affiliated with Biosafety Now endorsed the bill in a September press release.
It remains to be seen whether the Risky Research Review Act will make it to the Senate floor. For now, some biosafety experts and professional societies are expressing hesitation.
In an October blog post, three prominent biosafety professionals characterized the legislation as “deeply flawed.” Among other things, they wrote, the new board would have authority to review all life sciences research proposals — not just those that are high-risk — giving it significant veto power over a large swath of federally funded science. And the new board, they wrote, would be vulnerable to political pressure from Congress.
The American Society for Microbiology opposes the bill. “What does this legislation add to the mix, especially when federal agencies are in the process of implementing new guidance from the White House?” asked Amalia Corby, the organization’s director of federal affairs.
That guidance, introduced in May, is scheduled to go into effect next year. Among other changes, the policy would expand the kinds of experiments that require external review, and it would increase reporting requirements for biosafety or biosecurity breaches. Several scientists described the new federal policy as a step in the right direction. But with an incoming Trump administration, and with Republican majorities in the House and Senate, the guidelines’ future is now unclear.
Compared to a federal law, “this policy is like smoke,” said Nicholas Evans, a bioethicist whose book on gain-of-function research will be published next year by MIT Press. Evans believes that the Risky Research Review Act is now more likely to pass under Republican governance. And if this happens, Evans predicted, Rand Paul and Donald Trump might work together to ensure that the new review board supersedes any federal guidelines. Such a scenario, said Evans, opens the door to political overreach: “This is actually a case where the integrity and future of the life sciences could be imperiled.”
In July 2014, as a newly minted Ph.D., Evans hitched a ride from Philadelphia to Cambridge, where he spent the night at Lipsitch’s house. There, Evans wrote the first draft of the Cambridge Working Group’s consensus statement. Now, more than a decade later, he thinks it might be time for his side and the Scientists for Science side to bury the hatchet, come together, and say, “whatever we disagree about, we agree that the Risky Research Act is not the way to go.”
Nickels, meanwhile, is decidedly sanguine. In a post-election interview with Undark, he offered names of people whom he’d like to see appointed to a federal review board, and he explained that he sees the Risky Research Review Act as a de facto ban on gain-of-function research.
“I actually have optimism,” he said. “I’m encouraged by what I’ve seen.”