I recently read the 21st annual Internet Trends report, authored by Mary Meeker, a highly regarded venture capital partner at the firm Kleiner Perkins Caufield & Byers.
In the report, Meeker analyzes several technology trends, including trends in data growth from 1990 to 2016. She identifies three distinct waves of data consumption and suggests we are entering the early stages of the “third wave” of data.
Here’s how Meeker predicts this next wave of data will take effect:
“Next Big Wave: Leveraging this unlimited connectivity and storage to collect, aggregate, correlate and interpret all of this data to improve people’s lives and enable enterprises to operate more efficiently.”
I agree with this observation in general, but I believe the transition is far more profound than the report suggests. We are in the midst of a shift that goes well beyond merely improving lives and making enterprises operate more efficiently.
What I observe in the early stages of the third wave is a rapid transition toward using data to actively create the products and services that companies offer. Increasingly, we are approaching a time in which true innovation cannot exist without data and advanced data analytics. We are entering a phase in which products, services and insights are born digital, analyzed digitally and consumed digitally. Tools and techniques like machine learning, analytics and simulation are not just shaping how the internet evolves or how corporate dashboards are built. Advances in science and engineering have changed how we do things across industries as wide-ranging as automotive, home automation, shipping and finance, along with other fields that touch everyday human activities.
These techniques share some common characteristics. Perhaps the most important is that these data sets grow at a much faster pace than the transactional workloads of the past, for which corporate IT devoted significant resources to managing the business record. In this new landscape, the volume and velocity of data are increasing so fast that even Moore’s Law can’t keep up. This has led to new computational paradigms – within distributed systems, and within technology companies – which increasingly use more specialized processors to process and analyze data.
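To make the gap concrete, here is a small illustrative sketch (not from the report): if data volume doubles roughly every year while transistor density doubles roughly every two years, the compounding difference becomes enormous within a decade. The doubling periods below are common rules of thumb, assumed purely for illustration.

```python
def growth_factor(years, doubling_period_years):
    """Total multiplicative growth after `years`, given a doubling period."""
    return 2 ** (years / doubling_period_years)

# Assumed doubling periods (illustrative, not figures from the report):
# data volume doubles every ~1 year; transistor density every ~2 years.
decade = 10
data_growth = growth_factor(decade, 1.0)     # 2^10 = 1024x
compute_growth = growth_factor(decade, 2.0)  # 2^5  = 32x

gap = data_growth / compute_growth
print(f"Over {decade} years: data grows ~{data_growth:.0f}x, "
      f"compute ~{compute_growth:.0f}x, a ~{gap:.0f}x gap")
```

Under these assumed rates, data outgrows general-purpose compute by roughly 32x over a decade, which is the pressure driving specialized processors.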
Early entrants in this space include NVIDIA, whose GPUs are rapidly being deployed in machine learning environments, and Intel, whose Programmable Solutions Group is making similar inroads with FPGA-based processing technologies. There are also other emerging entrants, including Google with its proprietary Tensor Processing Unit (TPU), which Google uses to accelerate its in-house machine learning library, TensorFlow.
An often-overlooked side effect is that new storage paradigms are emerging – and, as Meeker highlights in the report, those paradigms are built to be both Big and Fast (and I’d add Simple to that list).
FlashBlade was built for this paradigm shift. We set out to build a product that is Big, Fast and Simple. Space and energy are scarce resources in the data center, so we designed a product that is incredibly efficient. We knew the world wouldn’t change overnight, so we paid close attention to compatibility with current technologies and existing data center architectures. But fundamentally, we had to build a storage system that works and scales really well under usage patterns that are large, random and unpredictable.
Let me offer this suggestion to expand Meeker’s prediction:
We are in the early stages of an IT transformation in which technology moves beyond driving productivity enhancements and becomes core to the creation of the business and its value. The third wave of data is driven by access to Fast and Big data, by applications that process sensor data and other digitally generated data, and by computing platforms operating at a scale significantly larger than Moore’s Law alone would have predicted.