Multicloud strategies are taking some heat.
For starters, expectations are high. Multicloud promises to solve legacy IT headaches of yesteryear, cut costs, and give apps more horsepower when they need it. You can pick and choose resources à la carte. Managed service providers (MSPs) will take tuning and tooling off your plate.
In reality, it’s delivering on that and more. According to a report by Flexera, 89% of surveyed organizations have adopted a multicloud strategy, and they expect cloud spend to grow by 29% in the next year.
But what the cloud gives, it can also take away—and rapid adoption can lead to chaos and complexity that companies are eager to solve. In blowing past cloud budgets (by an average of 13%, according to the Flexera study), IT organizations are allocating for maximum requirements, not actual requirements. What’s left over is waste, and it adds up—to the tune of over $14 billion last year alone. Respondents themselves estimated that 32% of cloud spend is wasted.
As a result, 59% of organizations in Flexera’s survey are prioritizing multicloud optimization for the sixth year in a row. (Right behind that initiative is—you guessed it—plans to increase cloud use and migrate even more workloads.)
If organizations are still struggling to get waste and spend under control, how do they rein it in before adding even more complexity?
Embrace Fluidity and Change in the Cloud, but Keep Your Eye on the Ball
There’s a slight problem with calling multicloud adoption a “journey.” It implies that once you get there, you’ve arrived—but it’s not one-and-done. To head off waste and keep budgets in check, expect change, and design an infrastructure that can support that flexibility.
If you’ve had a more piecemeal, de facto adoption of multicloud so far, take time to pause and do an audit. It can be helpful to assess:
- The nature of your workloads in the cloud. Workloads are most easily broken down into two types: static and dynamic. They’ll require different types of storage, services, and runtimes. Which ones actually need to run 24×7? Which can be safely stored in a data lake or solution that doesn’t require super low latency or real-time processing?
- Your overall capacity requirements. Leave a buffer on either end so you’re not in a pinch during a burst and aren’t leaving too much idle.
- The licenses and resources you’re paying for vs. actually using. You can revisit this as workloads change. And if you notice you’re acquiring new services faster than you can dial them in, you might be on the path to creating waste.
- The resources that have auto-scaling. Do you have the ability to spin them down when requirements change? This way when you move workloads, you’re not paying for services you no longer need.
- The apps and data that need to be mobile, shared, and freed from silos. It all comes down to being able to move, access, and share data. With better data mobility, it can be easier to move workloads between public and private environments. Watch for bottlenecks here as some legacy storage solutions haven’t quite caught up to the cloud’s pace of change.
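The utilization checks above can be sketched as a short script. This is a minimal, hypothetical example—it assumes you’ve already exported per-resource utilization figures from your providers, and the field names and thresholds are illustrative, not tied to any specific cloud’s billing export:

```python
# Minimal audit sketch: flag resources whose observed utilization falls
# outside a target buffer. Thresholds and field names are illustrative.

def audit_resources(resources, low=0.20, high=0.85):
    """Classify each resource as idle, healthy, or at capacity."""
    report = {"idle": [], "healthy": [], "at_capacity": []}
    for r in resources:
        util = r["avg_utilization"]  # fraction of provisioned capacity in use
        if util < low:
            report["idle"].append(r["name"])         # candidate to spin down
        elif util > high:
            report["at_capacity"].append(r["name"])  # candidate to scale up
        else:
            report["healthy"].append(r["name"])
    return report

# Hypothetical inventory pulled from provider usage reports
inventory = [
    {"name": "analytics-cluster", "avg_utilization": 0.12},
    {"name": "order-api", "avg_utilization": 0.55},
    {"name": "archive-store", "avg_utilization": 0.91},
]
print(audit_resources(inventory))
```

Even a rough pass like this surfaces the two budget killers the checklist targets: resources you’re paying for but barely using, and resources running so hot you’ll be forced into a rushed, expensive expansion.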
Bottom line: Regularly re-evaluate your multicloud strategy from a high level. Workloads will ebb and flow, and some may evolve to be better suited on premises. Here’s where interoperability can be a game-changer and why having the data mobility and stability to support that change is critical.
Opt for Flexible Consumption Models You Can Purchase Like Utilities vs. Fixed Subscriptions
Monthly subscriptions for services that aren’t being used at all, or to their maximum potential, are major overspend offenders. The easiest way to safeguard budgets when workloads are bound to fluctuate is with flexible consumption models that allow you to pay for only what you use.
As-a-service models are the ticket to solving one of IT decision-makers’ top concerns: accurate capacity planning.
Case in point: Evergreen//One™ uses a flexible consumption model for data storage, which allows you to treat storage like a utility. You pay for what you use, not a fixed amount of storage you may overuse or underuse. For multicloud environments, in particular, this has a major advantage. It reduces complexity between different providers, on premises and off, with one unified subscription to manage multiple cloud services at once.
Hybrid cloud has never been more streamlined.
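A toy comparison shows why metered billing curbs overspend when workloads fluctuate. All rates and usage figures below are made up for illustration—they are not actual Evergreen//One pricing:

```python
# Toy comparison: fixed-capacity subscription vs. pay-per-use billing
# for fluctuating monthly storage consumption. All numbers are
# illustrative, not real pricing.

FIXED_CAPACITY_TB = 100    # capacity reserved under a fixed plan
FIXED_RATE_PER_TB = 20.0   # $/TB/month, flat, paid whether used or not
USAGE_RATE_PER_TB = 25.0   # $/TB/month, metered (often higher per unit)

monthly_usage_tb = [40, 55, 70, 95, 60, 45]  # a fluctuating workload

fixed_cost = FIXED_CAPACITY_TB * FIXED_RATE_PER_TB * len(monthly_usage_tb)
metered_cost = sum(tb * USAGE_RATE_PER_TB for tb in monthly_usage_tb)

print(f"fixed:   ${fixed_cost:,.0f}")   # $12,000
print(f"metered: ${metered_cost:,.0f}") # $9,125
```

Even at a higher per-unit rate, metering wins whenever average utilization sits well below the reserved capacity—which is exactly the over-provisioning pattern the Flexera respondents reported.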
Use Advanced Analytics and AI-powered Monitoring for Better Visibility
The best way to get ahead of bloat is with transparency in your cloud environments—a lack of which is another top pain point in data centers. Here’s where as-a-service models like Evergreen//One can provide portals with metering and monitoring capabilities so you’re getting a clear view of your costs and usage.
Also, effective IT analytics and reporting can go further to help you accurately forecast and prevent waste. That workload running 24×7 and causing costs to skyrocket? You’ll know about it sooner with better analytics.
“Monitoring provides real-time insight into the impact of performance degradation on customers, but increased multicloud adoption presents monitoring challenges.” –2019 Gartner Market Guide for IT Infrastructure Monitoring Tools
Yet IT analytics can be challenging, especially if you’ve adopted more à la carte cloud services.
What’s the solution? Governance is a must, but it can be time-consuming when done manually. IT infrastructure monitoring (ITIM) tools, spend management tools, and cloud-native monitoring can all help you get closer to an optimized cloud. But again, you’re staring down the barrel of complexity.
Automation, artificial intelligence (AI), and API-based tools can make this significantly easier. AI can spot anomalies, predict trends in your capacity requirements, and anticipate waste before it makes its way to the balance sheet. Pure1®, our AI-based monitoring platform, can detect trends that indicate when you need more or less storage capacity and alert you to issues before they occur.
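To make the anomaly-spotting idea concrete, here’s a bare-bones version of the kind of check an AI-driven monitoring tool might run: flag readings that deviate sharply from the series mean. This is a generic z-score sketch with made-up numbers—not how Pure1 actually works:

```python
# Generic anomaly check: flag readings more than `threshold` standard
# deviations from the mean. A simplistic stand-in for the statistical
# models real monitoring platforms use.
import statistics

def find_anomalies(usage, threshold=2.0):
    """Return indices where usage deviates > threshold std devs from the mean."""
    mean = statistics.fmean(usage)
    stdev = statistics.pstdev(usage)
    if stdev == 0:
        return []  # perfectly flat series: nothing to flag
    return [i for i, u in enumerate(usage) if abs(u - mean) / stdev > threshold]

daily_gb = [410, 405, 398, 412, 407, 399, 1200, 403]  # day 6 spikes
print(find_anomalies(daily_gb))  # [6]
```

Real platforms layer forecasting and learned baselines on top of this, but the principle is the same: catch the runaway workload on day one, not on the invoice.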
Focus on App Portability, Data Mobility, and Interoperability
A big allure of the multicloud model is avoiding lock-in, but it turns out there are still trade-offs. Mainly, where and how you choose to store your data can lead to compromises that diminish your multicloud ROI. Ironically, better on-premises storage can be a great place to start.
Why? Cloud vendor lock-in and silos can come back into play at the data layer. Add in platform-specific APIs and a diverse multicloud environment can quickly resemble the legacy environment you came from. Storing duplicate data sets on multiple clouds sounds like a solution but can lead to compliance and governance complexities you don’t want.
Here’s where designing an infrastructure for seamless multicloud mobility is critical. A solution like Evergreen//One helps organizations reduce data gravity with a single pane of glass across providers, on premises and in the cloud. It can also remove the often cost-prohibitive task of moving data between clouds.
Consolidate Data’s Center of Gravity
Going hybrid with a mix of public- and private-cloud environments is one of the smartest investments an organization can make, and it doesn’t have to be a chaotic cost center. Getting a handle on analytics and treating services like utilities are key. But don’t underestimate your underlying technology.
Real clouds may defy gravity, but data doesn’t. A great cloud strategy has a primary center of gravity: a unified, modern data solution that consolidates the complexities that lead to waste.
With a platform like Pure, you get one unified subscription across public and on-premises clouds so you can do what you want when you want. That alone can help cut unnecessary costs and conquer waste—all without cutting critical, strategic resources that can drive your business forward.
If you do look at multicloud adoption as a journey, getting your data in order should be the first step. And Pure is here to help.
Learn more about how Pure can help you conquer the cloud divide.