Two very different types of intelligence have been dominating the headlines these days: artificial intelligence (AI) and extraterrestrial intelligence (EI). One is practical; the other is existential. But despite their differences, both are, at their core, a search for answers. 

And, as it turns out, the answers could lie in analytics, data, and how that data is managed and stored—from deep-space images and physics experiments to electromagnetic radio waves and even sonified data we can hear. Let’s explore a few of the ways we’re attempting to unlock secrets of the universe with data.

Explaining the Big Bang with Big Data

Data has been used for decades in our attempts to determine the physics of our universe: the Big Bang Theory, antimatter, black holes, and other dimensions. 

CERN, the European Organization for Nuclear Research, is a leader in particle physics experiments probing conditions like those just after the Big Bang. CERN’s Large Hadron Collider (LHC), the world’s most powerful particle accelerator, just may be the key to understanding the origins of our universe. In the search for “new phenomena,” researchers combine complex simulations and models with unsupervised machine learning techniques, including neural networks trained for anomaly detection.

In a year when the LHC is running, more than one exabyte of data (the equivalent of 1,000 petabytes) is accessed (read or written). – CERN

In the process, the LHC creates immense amounts of data for analysis. In fact, particles collide in the accelerator approximately 1 billion times per second, generating about one petabyte of collision data per second.
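The scale of these figures is easier to appreciate with some back-of-envelope arithmetic. The sketch below uses only the numbers quoted in this article (1 billion collisions per second, ~1 PB/s of raw data); the derived values are illustrative, not official CERN statistics.

```python
# Back-of-envelope arithmetic using the LHC figures quoted above.
PB = 10**15          # bytes in a petabyte (decimal convention)
EB = 10**18          # bytes in an exabyte

collisions_per_sec = 1_000_000_000   # ~1 billion collisions per second
raw_rate_bytes = 1 * PB              # ~1 PB of raw collision data per second

# At ~1 PB/s, an exabyte of raw data accumulates in under 20 minutes,
# which is why aggressive real-time filtering is essential.
seconds_to_one_exabyte = EB / raw_rate_bytes
print(f"Seconds of raw collisions per exabyte: {seconds_to_one_exabyte:.0f}")

# Implied average data volume per individual collision.
avg_bytes_per_collision = raw_rate_bytes / collisions_per_sec
print(f"Average data per collision: ~{avg_bytes_per_collision / 1e6:.1f} MB")
```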

How do exabytes of data become insights into the origins of our universe? It starts with heavy data filtration in the CERN Data Centre. This massive data center is the heart of CERN’s operations, including scientific data management and the distributed computing infrastructure behind the LHC (and, more recently, CERN’s Quantum Computing Initiative). Then, filtered and reconstructed data is copied to data centers around the world and distributed across a massive computing grid so thousands of analysts can collaborate on the LHC’s experiments. 
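The idea behind that filtration step can be sketched in a few lines. This is a loose analogy, not CERN’s actual trigger software: the event model and energy threshold below are invented for illustration, but the principle is the same—discard the vast majority of events in real time and keep only the rare, interesting ones.

```python
import random

# Toy analogy to trigger-style filtering: keep only rare, "interesting"
# events (here, events above an invented energy threshold) so that only
# a small fraction of the raw stream survives for distribution.
random.seed(0)

def interesting(event_energy, threshold=4.5):
    """Keep only rare, high-energy events (illustrative criterion)."""
    return event_energy > threshold

# Simulate 100,000 events with exponentially distributed "energies."
events = [random.expovariate(1.0) for _ in range(100_000)]
kept = [e for e in events if interesting(e)]

print(f"Kept {len(kept)} of {len(events)} events "
      f"({100 * len(kept) / len(events):.2f}%)")
```

With this threshold, roughly 1% of events survive; the real LHC trigger chain is far more selective and sophisticated, but the shape of the pipeline is the same.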

This data will only grow as CERN’s technology improves. The LHC’s upgrade, the High-Luminosity LHC, is planned for 2027 and is expected to produce exabytes of data per year, leading the organization to begin preparing now for these and future data challenges. 

Using AI to Locate EI

Since launching in 2021, the James Webb Space Telescope (JWST) has given us an unprecedented look at our universe. But just last week, researchers shared JWST progress toward objectives more ambitious than stunning photos. 

As it searches for answers to our universe’s origins and furthest reaches, the JWST could also analyze atmospheres for biosignatures and technosignatures—in other words, signs of life and advanced civilizations. That means deep analysis of deep-space photos—and lots of data. (When it was launched, the JWST was expected to increase NASA’s data collection needs five-fold, leading the organization to turn to Pure Storage for its data infrastructure.)

When researchers turned the JWST’s capabilities back on our own planet to test long-range atmospheric analysis, the results were promising. First, the team degraded the image quality to simulate viewing Earth from light-years away. Then, they applied an advanced computer model mimicking the telescope’s sensors to see if Earth’s biosignatures—oxygen, methane, nitrogen dioxide, and more—were detectable. 

And they were! While these initial results are no guarantee, it’s an exciting possibility that we could soon analyze remote atmospheres 40 light-years away, no matter what they might contain. 
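The first step of that experiment—degrading an image until it resembles what a distant observer would see—can be illustrated with simple block-averaging. The tiny 4x4 “image” below is invented for the sketch; the researchers’ actual pipeline is far more elaborate.

```python
# Illustrative image degradation by block-averaging, loosely analogous to
# reducing Earth imagery toward the few pixels a telescope would capture
# from light-years away. The input "image" is invented for this sketch.

def downsample(image, factor):
    """Average non-overlapping factor x factor blocks of a 2D list."""
    height, width = len(image), len(image[0])
    out = []
    for i in range(0, height, factor):
        row = []
        for j in range(0, width, factor):
            block = [image[y][x]
                     for y in range(i, i + factor)
                     for x in range(j, j + factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

image = [[1, 1, 5, 5],
         [1, 1, 5, 5],
         [2, 2, 8, 8],
         [2, 2, 8, 8]]
print(downsample(image, 2))  # → [[1.0, 5.0], [2.0, 8.0]]
```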

Shining a Light on Dark Matter with GPS Data

Dark matter lives up to its name as one of the most elusive elements of the universe, but a proposed breakthrough—using existing GPS satellites and their onboard atomic clocks as a detector—could finally yield real answers. 

We already have a halo of atomic clocks aboard GPS satellites circling our planet, providing “hyper-accurate timing signals.” In theory, dark matter passing through could register as “glitches” that affect the clocks’ atomic processes, throwing them off by tiny amounts—something AI-powered anomaly detection could help identify.
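In its simplest form, that kind of anomaly detection amounts to flagging timing residuals that deviate far beyond normal noise. The sketch below is a toy: the synthetic residuals, injected glitch, and 5-sigma threshold are all invented, and real dark-matter searches use correlated analyses across the whole constellation rather than a single clock.

```python
import random
import statistics

# Toy anomaly detection on simulated atomic-clock timing residuals.
# All numbers here are invented for illustration.
random.seed(42)

# Nanosecond-scale residuals: mostly Gaussian noise, plus one injected
# "glitch" standing in for a hypothetical dark-matter transit.
residuals_ns = [random.gauss(0.0, 0.1) for _ in range(1000)]
residuals_ns[500] += 1.5  # injected anomaly at sample 500

# Flag samples more than 5 standard deviations from the mean.
mu = statistics.fmean(residuals_ns)
sigma = statistics.stdev(residuals_ns)
anomalies = [i for i, r in enumerate(residuals_ns)
             if abs(r - mu) > 5 * sigma]

print("Flagged samples:", anomalies)  # should include the glitch at index 500
```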

It remains to be seen, but tapping these existing satellites as a “huge detector,” combined with anomaly detection on their data, could give them a breakthrough purpose they were never originally intended for. Stay tuned.

Dusting for Digital Fingerprints in the Desert

In the remote Atacama Desert in Chile, where light pollution is at an absolute minimum, one of the most powerful astronomical instruments in the world, the Atacama Large Millimeter Array (ALMA), sees beyond Earth’s atmosphere. This giant radio telescope looks deep into the universe, capturing millimeter and submillimeter radio waves 24/7 and creating hundreds of terabytes of data a year as it gathers energy emitted from the furthest reaches of the “cold” universe. These “digital fingerprints” are key to unlocking the how and what of molecules that may explain our origins.

Thanks to ALMA data, the astronomical community has published more than 2,500 papers on galaxy formation, the formation and death of stars, and black holes. 

How AI and Pure Storage Can Help Accelerate Cosmic Discovery

“In the not-too-distant future, we’re going to need computing storage capabilities outside of Earth—on the Moon or Mars. We’re going to need capabilities like Pure can deliver.” –Ron Thompson, Chief Data Officer and Deputy Digital Transformation Officer, NASA

For a data storage company like us, it’s easy to get excited about the possibilities of data. It’s human curiosity and innovation in digital form. But like the universe, this data set is huge. Perhaps our best shot at finding answers in the expanse is with the speed, power, and scale of AI. 

That’s where Pure Storage comes in. Pure Storage is uniquely capable of powering interplanetary discoveries like these without putting the planet we live on at risk. With our highly efficient data storage platform, organizations don’t have to compromise performance for simplicity and sustainability. From data ingest to analysis, data lakes to applications, every step of the data lifecycle on Pure Storage is optimized for AI.

Keep an eye on the sky and contact us to learn more.