COVID-19 arrived and much of the world came to a standstill. The biopharma industry, however, worked at breakneck speed to deliver not just one, but three approved vaccines in less than a year. How did it get there? Years of prior research, unprecedented levels of funding, and international collaboration.
Oh, and one more thing: massive amounts of data and the power of artificial intelligence (AI).
AI played a huge role in the race for a COVID-19 vaccine, starting with studying the proteins that make up the virus. Initiatives like Folding@Home and Google’s DeepMind applied large-scale simulation and neural networks to predict the 3-D shape of the virus’s proteins from its genetic sequence. Using AI, scientists also identified components that are unlikely to mutate, which helps ensure that a vaccine remains effective over time.
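To make the “unlikely to mutate” idea concrete, here is a minimal, purely illustrative sketch: score how conserved each position is across a set of aligned viral protein sequences and flag the positions that rarely change. The sequences, threshold, and function name below are hypothetical stand-ins, not the methods Folding@Home or DeepMind actually used.

```python
# Minimal sketch: flag conserved positions across aligned viral protein sequences.
# The sequences and the 0.9 threshold are illustrative, not real surveillance data.
from collections import Counter

def conservation_scores(aligned_seqs):
    """Fraction of sequences sharing the most common residue at each position."""
    length = len(aligned_seqs[0])
    scores = []
    for i in range(length):
        column = [seq[i] for seq in aligned_seqs]
        most_common_count = Counter(column).most_common(1)[0][1]
        scores.append(most_common_count / len(aligned_seqs))
    return scores

# Toy alignment of four equal-length sequence fragments.
seqs = [
    "MFVFLVLLPLVSS",
    "MFVFLVLLPLVSS",
    "MFVFLVLQPLVSS",
    "MFVFLVLLPLVSA",
]

scores = conservation_scores(seqs)
conserved = [i for i, s in enumerate(scores) if s >= 0.9]
print("Highly conserved positions:", conserved)
```

Positions that stay conserved across many real sequences are attractive vaccine targets precisely because the virus has less room to mutate its way around them.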
Along with identifying vaccine targets, AI played a role in finding therapies to treat COVID-19. Screening thousands of existing drugs against viral proteins was simply too massive a task for human experts alone. BenevolentAI and Insilico Medicine, for example, used machine learning (ML) methods to identify potential candidates.
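As a rough illustration of what such a repurposing screen looks like in code, the sketch below ranks a library of existing compounds by a model-predicted interaction score against a viral protein target. The model, features, and compound names are synthetic placeholders; this is not how BenevolentAI or Insilico Medicine build their pipelines.

```python
# Illustrative sketch of a repurposing screen: rank existing compounds by a
# model-predicted score against a viral protein target. Everything here is
# synthetic placeholder data, not a real drug-discovery pipeline.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Pretend training data: compound feature vectors and measured binding affinities.
X_train = rng.random((200, 16))
y_train = rng.random(200)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Score a library of already-approved compounds (features are synthetic here).
library = {f"compound_{i}": rng.random(16) for i in range(1000)}
scores = {name: model.predict(feats.reshape(1, -1))[0] for name, feats in library.items()}

# Surface the top candidates for expert review.
top_candidates = sorted(scores, key=scores.get, reverse=True)[:10]
print(top_candidates)
```

The value is in the triage: the model narrows thousands of compounds down to a short list that human experts can actually evaluate.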
And as the virus mutates, scientists are using AI to track and predict changes in its genomic sequence to help them stay one step ahead.
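A toy example of that kind of tracking: count how often each mutation appears in sequences collected each week and watch which ones are gaining ground. The sample data below is invented for illustration.

```python
# Minimal sketch: track how often a mutation appears in sequences collected each
# week, to spot changes that are spreading. The samples are made-up stand-ins
# for curated genomic surveillance data.
from collections import defaultdict

# (week, list of observed mutations) per sequenced sample.
weekly_samples = [
    (1, ["D614G", "A222V"]),
    (1, ["D614G"]),
    (2, ["D614G", "N501Y"]),
    (2, ["D614G", "N501Y", "A222V"]),
    (3, ["D614G", "N501Y"]),
]

counts = defaultdict(lambda: defaultdict(int))
totals = defaultdict(int)
for week, mutations in weekly_samples:
    totals[week] += 1
    for m in mutations:
        counts[m][week] += 1

# Print the weekly frequency of each mutation; rising trends warrant attention.
for mutation, by_week in counts.items():
    trend = {week: round(by_week[week] / totals[week], 2) for week in sorted(totals)}
    print(mutation, trend)
```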
The successful application of AI requires three things:
- loads of data
- access to that data
- infrastructure that can support AI’s fast, iterative computation
Meet the Demands of Rapid Data Growth
Last year was a trial by fire for the life-sciences community, and the technical advances that proved useful in this race are becoming the new normal. As R&D becomes more patient-centric, the quest for personalized medicine will rely on more data from technologies such as imaging, high-resolution biomarker testing, and DNA sequencing. Teams distributed across research centers, pharmaceutical companies, and medical device companies will need to leverage petabytes of data.
But mining data for timely insights is easier said than done, and it starts with a solid IT foundation. For most life-sciences organizations, trying to keep up with rapid data growth has led to haphazard, often disconnected, data storage systems. Although public cloud adoption is high, most life-sciences companies have data distributed across on-premises, private cloud, and multicloud environments. These silos are often unavoidable due to regulatory requirements. As a result, it can take a long time to integrate data, and researchers end up spending more time engineering their data than drawing insights from it. Solutions that offer ease of data mobility and integration are paramount.
The move toward data-driven precision medicine not only demands seamless data access, but also superior analytics to enable faster therapeutics, diagnostics, and vaccine pipelines. Getting to market quickly requires AI-powered drug discovery, genomics and imaging analytics, and clinical-trial planning and execution based on learnings from historical data. Applications of AI and ML during the pandemic have shown that it can happen. But few organizations have infrastructure that can scale to meet the storage, networking, and compute demands of AI workloads.
Don’t Save Infrastructure for Last
Infrastructure may not be top of mind, but it definitely shouldn’t be an afterthought. Harnessing data effectively requires a holistic data management strategy that includes the bottom tier of underlying infrastructure to improve application performance. High-performing infrastructure allows your research teams to enjoy consistent access to vast amounts of data, scale storage capacity on demand, and run compute-intensive analytics without breaking anything. It also keeps your data backed up, protected, and secure. Plus, it enables IT leaders to seamlessly integrate their on-prem and cloud strategies without compromise.
In the race to fight disease, big data and AI might just be the panacea we need. But to be data-ready and harness the power of AI, you need a data infrastructure that allows you to do what you do best—innovate.