COVID-19 High Performance Computing Consortium pivots to treatment research
November 16, 2020

by admin


Alongside the White House Office of Science and Technology Policy (OSTP), IBM announced in March that it would help coordinate an effort to provide hundreds of petaflops of compute to scientists researching the coronavirus. As part of the newly launched COVID-19 High Performance Computing (HPC) Consortium, IBM pledged to assist in evaluating proposals and to provide access to resources for projects that “make the most immediate impact.”

Now, following a surge in coronavirus cases around the world, including in the U.S., the organization’s members say the project has entered a new phase focused on delivering benefits for patients afflicted by the virus. Going forward, the Consortium intends to prioritize research with the potential to improve patient outcomes within a six-month timeframe.

The transition, the Consortium’s members say, was motivated partly by the fact that there is now a far greater volume of COVID-19 data available than when the Consortium launched, creating more possibilities to help patients. The Consortium will be particularly — though not exclusively — interested in projects focused on modeling patient responses to the virus using clinical datasets, validating vaccine response models from multiple clinical trials, evaluating combination therapies using repurposed molecules, and developing epidemiological algorithms driven by multi-modal datasets.

To date, the Consortium — which counts 43 members, including IBM, Amazon, Microsoft, Nvidia, Intel, Google, and the U.S. Department of Energy’s Oak Ridge National Laboratory — has received more than 175 research proposals from researchers in more than 15 countries. Its combined compute capacity has grown from 330 petaflops (330 quadrillion floating-point operations per second) in March to 600 petaflops today across roughly 165,000 nodes, 6.8 million processor cores, and 50,000 graphics cards. That’s up from 136,000 nodes and 5 million processor cores as of June; in May, compute capacity stood at 437 petaflops.

Powerful computers allow researchers to undertake high volumes of calculations in epidemiology, bioinformatics, and molecular modeling, many of which would take months on traditional computing platforms (or years if done by hand). Moreover, because the computers are available in the cloud, they enable teams to collaborate from anywhere in the world. Insights generated by the experiments can help advance our understanding of key aspects of COVID-19 such as viral-human interaction, viral structure and function, small molecule design, drug repurposing, and patient trajectory and outcomes.

Approved academic and nonprofit research institutions gain free access to the Consortium’s compute resources. Normally, a petaflop of computing power costs between $2 million and $3 million, according to IBM.

The Consortium claims to have supported more than 91 research projects to date, including a team from Utah State University that simulated the dynamics of aerosols indoors and found that droplets from breathing linger in the air longer than previously hypothesized. Michigan State University researchers used the Consortium’s compute power to screen roughly 1,600 FDA-approved drugs for combinations that could help treat COVID-19. A study from India’s Novel Techsciences analyzed plant-derived natural compounds from 55 Indian medicinal plants to identify compounds with antiviral properties that could act against eight SARS-CoV-2 proteins. And two NASA researchers are working to define risk groups by performing genome analysis on COVID-19 patients who develop acute respiratory distress syndrome.

