In the Manhattan Project of the early 1940s, a group of scientists from all over the world raced against time to save society. Today, every moment counts in a similarly important scientific effort: the fight against the pandemic of the novel coronavirus disease, COVID-19, that is sweeping the world and that represents one of the greatest challenges of our lifetimes. Scientists are working around the clock to analyze the virus to find new treatments and cures, predict how it will propagate through the population, and make sure that our supply chains remain intact.
The U.S. Department of Energy’s (DOE) Argonne National Laboratory is playing a key role in all of these activities, bringing the power of its scientific leadership and state-of-the-art user facilities to bear in the global battle against COVID-19. By partnering with other laboratories and research institutions around the world to attack the virus on every front, Argonne is doing its part to save lives and protect U.S. prosperity and security.
No inhibitions in the search for an inhibitor
The key to defeating COVID-19 is finding what truly is a biochemical key — an inhibitor molecule that will sit just right in the nooks and crannies of one or more of the 28 viral proteins that make up SARS-CoV-2, the virus that causes COVID-19. Discovering which keys have a chance of working requires a technique called macromolecular X-ray crystallography, in which tiny crystals of either pure protein or protein and inhibitors are grown and then illuminated by a high-brightness, high-energy X-ray beam.
These X-ray beams are unlike anything available at a doctor’s office and exist only at a few specialized sites around the world. Argonne’s Advanced Photon Source (APS), a DOE Office of Science User Facility, is one of them.
By mid-March, researchers from around the country had used APS beamlines to characterize roughly a dozen proteins from SARS-CoV-2, several of them with inhibitors.
“The fortunate thing is that we have a bit of a head start,” said Bob Fischetti, life sciences advisor to the APS director. “This virus is similar but not identical to the virus behind the SARS outbreak of 2002, and 70 structures of proteins from several different coronaviruses had been acquired using data from APS beamlines prior to the recent outbreak. Researchers have background information on how to express, purify and crystallize these proteins, which makes the structures come more quickly — right now, a few per week.”
One of the research teams performing work on SARS-CoV-2 includes members of the Center for Structural Genomics of Infectious Diseases (CSGID), which is funded by NIH’s National Institute of Allergy and Infectious Diseases (NIAID). The team is led by Karla Satchell from Northwestern University and Andrzej Joachimiak of Argonne and the University of Chicago. Other members involved in the work include Andrew Mesecar from Purdue University and Adam Godzik from the University of California, Riverside. They have used APS beamlines 19-ID-D, operated by the Argonne Structural Biology Center, supported by the DOE Office of Science, and 21-ID, operated by the Life Sciences Collaborative Access Team, a multi-institution consortium supported by the Michigan Economic Development Corporation and the Michigan Technology Tri-Corridor.
Another group, led by M. Gordon Joyce at the Henry M. Jackson Foundation for the Advancement of Military Medicine, Inc. (HJF) at the Walter Reed Army Institute of Research (WRAIR), is studying antibody and antiviral compounds. The group uses beamline 24-ID, operated by the Northeastern Collaborative Access Team, which is managed by Cornell University and seven member institutions.
According to Fischetti, the breakneck pace of collaborative science with one common essential goal is unlike anything else he has seen in his career. “Everything is just moving so incredibly fast, and there are so many moving pieces that it’s hard to keep up with,” he said.
Fischetti compared finding the right inhibitor for a protein to discovering a perfectly sized and shaped Lego brick that would snap perfectly into place. “These viral proteins are like big sticky balls — we call them globular proteins,” he said. “But they have pockets or crevices inside of them where inhibitors might bind.”
By using the X-rays provided by the APS, scientists can gain an atomic-level view of the recesses of a viral protein and see which possible inhibitors — either pre-existing or yet to be developed — might reside best in the pockets of different proteins.
The difficulty with pre-existing inhibitors is that they tend to bind with only “micromolar” affinity, which would require extremely high doses that could cause complications. According to Fischetti, the research teams are looking for an inhibitor with nanomolar affinity, which could be administered as a drug at far lower doses, with far fewer or no side effects.
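The affinity numbers translate directly into dose. For simple one-to-one binding, the fraction of target protein occupied by an inhibitor at a given concentration depends on the dissociation constant (Kd), so a nanomolar binder needs roughly a thousand times less drug than a micromolar binder to occupy the same share of its target. A small illustrative calculation (the 90 percent occupancy target and the Kd values below are examples for scale, not figures from the research):

```python
# Illustrative only: occupancy model for simple 1:1 inhibitor binding.
# theta = [I] / (Kd + [I])  =>  [I] = theta / (1 - theta) * Kd

def required_concentration(kd_nm, occupancy=0.9):
    """Inhibitor concentration (nM) needed to reach the given
    fractional occupancy of the target, assuming 1:1 binding."""
    return occupancy / (1.0 - occupancy) * kd_nm

# A micromolar binder (Kd = 1000 nM) vs. a nanomolar binder (Kd = 1 nM),
# both driven to 90% target occupancy:
print(required_concentration(1000.0))  # ~9000 nM of drug needed
print(required_concentration(1.0))     # ~9 nM, a thousandfold lower dose
```

The thousandfold gap is why the teams keep searching rather than settling for an existing micromolar binder.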
“This situation makes clear the importance of science in solving critical problems facing our world,” said APS Director Stephen Streiffer. “X-ray light sources, including the APS, our sister DOE facilities, and the light sources around the world, plus the researchers who use them are fully engaged in tackling this dire threat.”
Computing the COVID-19 crisis
Researchers can accelerate a significant part of inhibitor development through the use of supercomputing. Just as light sources from around the world, including the Diamond Light Source in the United Kingdom, have banded together to solve SARS-CoV-2 protein structures, so too have the top supercomputers turned their focus to the challenge at hand.
As part of the COVID-19 High Performance Computing Consortium, recently announced by President Trump, researchers at Argonne are joining forces with researchers from government, academia, and industry in an effort that combines the power of 16 different supercomputing systems.
At Argonne, researchers using the Theta supercomputer at the Argonne Leadership Computing Facility — also a DOE Office of Science User Facility — have linked up with other supercomputers from around the country, including Oak Ridge National Laboratory’s Summit supercomputer, the Comet supercomputer at the University of California-San Diego, and the Stampede2 supercomputer at the Texas Advanced Computing Center. With their combined might, these supercomputers are powering simulations of how billions of different small molecules from drug libraries could interface and bind with different viral protein regions.
“When we’re looking at this virus, we should be aware that it’s not likely just a single protein we’re dealing with — we need to look at all the viral proteins as a whole,” said Arvind Ramanathan, a computational biologist in Argonne’s Data Science and Learning division. “By using machine learning and artificial intelligence methods to screen for drugs across multiple target proteins in the virus, we may have a better pathway to an antiviral drug.”
The databases of potential drug candidates available to researchers are truly immense — they include catalogs of small molecules that number in the hundreds of millions to billions. Running individual simulations of each and every drug candidate for each viral protein, even with the supercomputers running 24/7, would take many years — a window of time that scientists don’t have.
To zero in on the most likely candidates as efficiently as possible, computational biologists are using machine learning and artificial intelligence techniques to do a kind of educated filtration of possibilities. Ten billion configurations are quickly whittled down to 250 million poses that the models attempt to fully dock. These 250 million docking poses are then further refined to roughly six million positions that are fully configured for computationally intensive molecular dynamics simulations.
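In miniature, that funnel looks like the sketch below. The scoring functions are random stand-ins, not the actual Argonne models or docking codes; the point is the shape of the pipeline, in which a cheap surrogate score prunes the full library and a more expensive score refines the survivors, at ratios mirroring the reduction from ten billion to 250 million to six million:

```python
# Illustrative sketch, not the actual Argonne pipeline: the scoring
# functions below are deterministic random stand-ins for (1) a fast
# machine-learning surrogate model and (2) a slower docking calculation.

import random

def cheap_ml_score(candidate):
    random.seed(candidate)        # stand-in for an ML surrogate score
    return random.random()

def docking_score(candidate):
    random.seed(candidate * 31)   # a different, "slower" stand-in score
    return random.random()

def funnel(candidates, keep_first=250, keep_second=6):
    # Stage 1: the surrogate model prunes the whole library
    # (in the real effort, roughly 10 billion down to 250 million).
    survivors = sorted(candidates, key=cheap_ml_score, reverse=True)[:keep_first]
    # Stage 2: docking refines the survivors (250 million down to the
    # roughly 6 million poses passed on to molecular dynamics).
    return sorted(survivors, key=docking_score, reverse=True)[:keep_second]

hits = funnel(range(10_000))      # toy library of 10,000 "molecules"
print(len(hits))                  # 6 candidates survive both stages
```

Each stage only has to be accurate enough to avoid discarding good candidates; the expensive physics is saved for the short list.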
Ultimately, the goal of this computational modeling is to identify which inhibitor candidates can be fed back to scientists at the APS for attempts to co-crystallize them with viral proteins. “It’s an iterative process,” said Rick Stevens, associate laboratory director of Argonne’s Computing, Environment and Life Sciences directorate. “They feed structures to us, we feed our models to them — eventually we hope to find something that works well.”
Additionally, researchers are constructing epidemiological models to simulate the spread of COVID-19 through the population. These agent-based models incorporate, as they appear, the reports on the virus’s virulence and other properties published every day in the scientific literature.
The agent-based model that Argonne researchers have developed includes almost 3 million separate agents, each of whom can travel to any of 1.2 million different locations. The actions of each agent are determined by hourly schedules — like a trip to the gym, or going to school.
Currently, the Argonne team is developing a baseline simulation — in essence, to see what would happen to our communities if people carried on with business as usual. But the true goal is to model extensively the various interventions — or possible additional interventions — that decision-makers can implement in order to slow the virus’s spread.
“Our models simulate individuals in a city interacting with each other,” said Argonne computational scientist Jonathan Ozik, who helps to lead Argonne’s epidemiological modeling research. “If there’s a school closure, we see people who are supposed to go to school not go to school, and we can look at population level outcomes, such as how does the school closure affect how many people get exposed to the virus.”
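A toy version of such a model can be sketched as follows. Every number here — 200 agents, households of four, five schools, the 3 percent hourly transmission chance — is an illustrative stand-in, not a parameter of the Argonne model; the sketch only shows the mechanism: agents follow hourly schedules between shared locations, infection can pass between agents at the same location, and an intervention such as a school closure reroutes schedule entries.

```python
# Illustrative agent-based sketch, not the Argonne model: agents follow
# hourly schedules, and each susceptible agent sharing a location with
# an infected one has a fixed per-hour chance of becoming infected.

import random
from collections import defaultdict

N_AGENTS, HOURS, P_TRANSMIT = 200, 24 * 14, 0.03   # two simulated weeks

def make_schedule(agent_id):
    home = ("home", agent_id // 4)       # households of 4 agents
    school = ("school", agent_id % 5)    # 5 schools, mixed across households
    return {h: school if 8 <= h < 15 else home for h in range(24)}

def simulate(close_schools=False):
    random.seed(0)                        # fixed seed: comparable runs
    schedules = [make_schedule(i) for i in range(N_AGENTS)]
    infected = {0}                        # one seed case
    for t in range(HOURS):
        occupants = defaultdict(list)
        for i in range(N_AGENTS):
            loc = schedules[i][t % 24]
            if close_schools and loc[0] == "school":
                loc = ("home", i // 4)    # intervention: stay home instead
            occupants[loc].append(i)
        for people in occupants.values():
            if any(p in infected for p in people):
                for p in people:
                    if p not in infected and random.random() < P_TRANSMIT:
                        infected.add(p)
    return len(infected)

# Population-level outcome of the intervention: total infections
# with schools open versus closed.
print(simulate(), simulate(close_schools=True))
```

Even at this toy scale, the closure confines the outbreak to the seed case’s household, illustrating how a schedule-level change produces a population-level outcome.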
The advantage of having a computer model of an entire city is that it represents an in silico laboratory for decision-makers to see how different decisions might affect a population without actually having to implement them. “Knowing what decisions to make on a regional or national scale and when are crucial in this worldwide fight,” said Argonne distinguished fellow Charles (Chick) Macal, who also leads the research. “We’re developing a model that will help give information about what decisions will be most effective.”
The team also includes Argonne software engineer Nick Collier and computer scientist Justin Wozniak, who provide critical expertise in deploying the effort’s large-scale computational experiments on DOE leadership computing resources.
Securing our resource lifelines
COVID-19 is unlike any disaster our country has recently experienced. Hurricanes, earthquakes, and wildfires each have epicenters of destruction. Because this disaster is being felt so broadly, researchers are focusing more closely on the supply chain management needed to steer the country through this difficult time.
Argonne is home to the National Preparedness Analytics Center (NPAC), which is currently helping state emergency management agencies conduct rapid analyses around supply chain resilience in the context of pandemic planning.
Supply chains involve the ways in which we are able to acquire the necessities of life, from groceries to medication. “When you’re talking about supply chains, you have to talk scale,” said NPAC director Kyle Pfeiffer. “They’re globally interconnected, and they consist of nodes and links. For example, a supply node might be a distribution center, a link might be a roadway or transportation network, and a destination node might be a grocery store. Beyond the grocery store there are upstream dependencies, from farmers growing grain to someone baking bread.”
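Pfeiffer’s description maps naturally onto a directed graph. The sketch below is illustrative only — it is not GRID-M, whose internals are not described here — but it shows the idea: supply and destination nodes connected by links, with a reachability check revealing which destinations lose supply when a single link is disrupted.

```python
# Illustrative node-and-link sketch (not GRID-M): a supply chain as a
# directed graph, with a breadth-first reachability check to see which
# destination nodes lose supply when a link is cut.

from collections import defaultdict, deque

links = [
    ("farm", "bakery"),                 # upstream dependency: grain to bread
    ("bakery", "distribution_center"),  # supply node
    ("distribution_center", "grocery"), # destination node
    ("port", "distribution_center"),    # a second supply route
]

def reachable(links, source):
    """All nodes reachable from source, via breadth-first search."""
    graph = defaultdict(list)
    for a, b in links:
        graph[a].append(b)
    seen, queue = {source}, deque([source])
    while queue:
        for nxt in graph[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Baseline: the grocery store is supplied from the farm.
print("grocery" in reachable(links, "farm"))   # True

# Disrupt the link from the bakery into the distribution center.
cut = [l for l in links if l != ("bakery", "distribution_center")]
print("grocery" in reachable(cut, "farm"))     # False
```

This is the kind of question a planner asks at scale: which destination nodes still have at least one intact path from a supply node after a disruption.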
In 2015, NPAC developed a tool called the Grassroots Infrastructure Dependency Model (GRID-M), funded by the Federal Emergency Management Agency and deployed by about 10 state and local emergency management agencies. The tool won an R&D 100 Award in 2018.
GRID-M was originally designed for hazards that have what Pfeiffer called a “kinetic impact” on supply chains, such as fires or hurricanes. Two features of the pandemic may make it qualitatively different from previous disasters: an enduring loss of workers and the possible establishment of quarantine zones. Several states have reached out to the laboratory to see how a new version of GRID-M that incorporates these facets would work.
“Our national response to this disaster is different,” Pfeiffer said. “With pandemics, enduring worker attrition is not something that we can easily plan for. In addition, the duration of this disaster could be quite long, and we’re going to have incident support operations for some time.”
According to Pfeiffer, the new model allows planners to test hypotheses on the fly as they plan for potential exclusion, or quarantine, zones. In this way, they can make sure that critical private sector services remain available within those zones, and that some people can leave them when necessary to get to work or to deliver goods and services, without adding significant risk of community transmission of COVID-19.
With the introduction of quarantines and curfews in various parts of the country, the model may soon incorporate aspects related to credentialing — that is, seeing how state and local governments can allow certain people access to particular areas at particular times in order to perform necessary activities. “There are certain functions of life that will need to take place as best as we are able, and we need to make sure that they are performed as safely as possible,” he said.
Because state and local emergency managers can only control what is within their jurisdiction, much of the focus in the new model is on what emergency planners term the “last mile” — essentially, the final stops along the supply chain. Last mile analysis gives leaders the ability to effect the most change where they have influence.
Although state and local governments are doing their best to help support supply chains while still implementing recommended social distancing protocols, actually meeting the needs of the American population and preserving supply chains as much as possible will fall to the private sector. “Private sector supply chains are resilient and quick to market and assemble quickly after a disaster. Our national focus has been, and will continue to be, supporting the private sector to ensure essential goods and services are available to the public,” Pfeiffer said.
Despite the increasing threat posed by COVID-19, Pfeiffer expressed some reason for hope about the societal response. “I’m very optimistic in how we’re planning to support each other,” he said. “I’m very optimistic in the resilience of private sector supply chains. I’m very optimistic of decision-makers’ understanding how their actions have effects on the private sector.”
“But this is not a time to fall asleep at the wheel,” he added. “Everyone is simultaneously a responder and a survivor.”
Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation's first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America's scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy's Office of Science.
The U.S. Department of Energy's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science