COVID-19 High Performance Computing Consortium pivots to treatment research


Alongside the White House Office of Science and Technology Policy (OSTP), IBM announced in March that it would help coordinate an effort to provide hundreds of petaflops of compute to scientists researching the novel coronavirus. As part of the newly launched COVID-19 High Performance Computing (HPC) Consortium, IBM pledged to assist in evaluating proposals and provide access to resources for projects that “make the most immediate impact.”

Now, following a surge in COVID-19 cases around the world, including in the U.S., the organization’s members say the project has entered a new phase focused on delivering benefits for patients afflicted by the virus. As part of this shift, the Consortium intends to sharpen its focus on research that holds the potential to improve patient outcomes within a six-month time frame.

The Consortium’s members say the transition was partly motivated by the fact that a greater volume of COVID-19 data is now available, with more possibilities to help patients than when the group launched. Going forward, the Consortium will be particularly — though not exclusively — interested in projects focused on modeling responses to the virus using clinical datasets, validating vaccine response models from multiple clinical trials, evaluating combination therapies using repurposed molecules, and developing epidemiological algorithms driven by multi-modal datasets.

The Consortium — which has 43 members, including IBM, Amazon, Microsoft, Nvidia, Intel, Google, and the U.S. Department of Energy’s Oak Ridge National Laboratory — has already received more than 175 research proposals from researchers in over 15 countries. The group’s combined compute capacity has grown from 330 petaflops (330 quadrillion floating-point operations per second) in March to 600 petaflops today, spread across roughly 165,000 nodes, 6.8 million processor cores, and 50,000 graphics cards. That’s up from 437 petaflops in May, and from 136,000 nodes and 5 million processor cores as of June.
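For readers keeping track of the units, a quick sketch of the arithmetic behind those figures: a petaflop is 10^15 floating-point operations per second, and the growth since launch works out to roughly an 82% increase. (The figures below simply restate the numbers quoted above; nothing else is assumed.)

```python
# One petaflop = 10**15 floating-point operations per second.
PFLOP = 10**15

march_pflops = 330   # capacity at the Consortium's launch in March
today_pflops = 600   # capacity quoted today

# Aggregate throughput in raw FLOPS.
total_flops = today_pflops * PFLOP
print(f"{total_flops:.2e} FLOPS")  # 6.00e+17

# Growth in capacity since launch.
growth = (today_pflops - march_pflops) / march_pflops
print(f"{growth:.0%} increase since March")  # 82% increase since March
```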

Powerful computers allow researchers to perform vast numbers of calculations in epidemiology, bioinformatics, and molecular modeling, many of which would take months on traditional computing platforms (or years if done by hand). Moreover, because the computers are available in the cloud, they enable teams to collaborate from anywhere in the world. Insights generated by the experiments can help advance our understanding of key aspects of COVID-19, such as viral-human interaction, viral structure and function, small molecule design, drug repurposing, and patient trajectory and outcomes.

Approved academic and nonprofit research institutions gain free access to the Consortium’s compute resources. (Normally, a petaflop of computing power costs between $2 million and $3 million, according to IBM.)

The Consortium claims to have supported more than 91 research projects to date, including helping a team from Utah State University that simulated the dynamics of aerosols indoors and found that droplets from breathing linger in the air longer than previously hypothesized. Michigan State University researchers used the Consortium’s compute power to screen data from about 1,600 FDA-approved drugs to see if there were combinations that could help treat COVID-19. A study from India’s Novel Techsciences analyzed plant-derived natural compounds from 55 Indian medicinal plants to identify compounds with antiviral properties that could be used against eight SARS-CoV-2 proteins. And a pair of NASA researchers is working to define risk groups by performing genome analysis on COVID-19 patients who develop acute respiratory distress syndrome.

