Pure Storage just attended SAP TechEd 2017 in Las Vegas, and we came back full of excitement following all the announcements. We were also saddened to hear of the horrific tragedy that struck hours after we left. The Pure Storage family extends our warmest sympathies and prayers to the victims and to the families and friends impacted by this awful event.
The theme of this year’s TechEd was all about Data: Big Data, Data Tiering, Data Lifecycle, Data Volume Management, Databases, Data Hub, Data Integration, Data Centers, and Data Governance and Protection.
Before diving into data, let’s review the other announcements and plans coming out of TechEd:
This is something that has been in the works since the inception of SCP (previously HCP, HANA Cloud Platform) a few years ago. It pretty much guarantees that ABAP and SAP will always go hand-in-hand, and it cements SAP’s plan to detach as much customization and development as possible from the digital core. At the same time, it is a clear sign that custom development will continue to exist in the SAP ecosystem.
This goes hand in hand with the first announcement. Developers now have a set of libraries that enables them to connect to S/4HANA systems from SCP, retrieve data, and process the results. This brings SCP closer to the digital core and follows SAP’s cloud strategy of building more and more developer tools (APIs, web services, SDKs), as they have been doing with Hybris, Ariba, and Concur. The integration between all of these applications is slowly improving.
With the announcement of the beta availability of SCP on GCP, SAP continues its multi-datacenter strategy, acknowledging that customers are no longer tied to a single vendor and that their SAP data often has to coexist with other applications, whether for reasons of geography, more stringent data regulation, or the company’s own data center policies and staff. It is refreshing to see SAP remain committed to this strategy and continue to release both on-premises and cloud versions of its main products. New government regulations on data sovereignty threaten to complicate the delivery model that has made cloud computing attractive, presenting new concerns for companies with operations in multiple countries and making control over sensitive data more and more desirable. Additionally, SAP’s customers are very diverse: some require more control over their data, while others need to retain the IT knowledge and staff to do the work themselves. SAP’s strategy recognizes this diversity and applies it to the main star of the show in the next announcement.
Last but not least, Data. SAP has announced “Data Hub,” its new Big Data management tool intended to process only the data you actually need. With Data Hub, customers can connect to different data sources, whether data lakes, business warehouses (OLAP), or ERP (OLTP) systems, without having to pull everything into one new place. At first glance, Data Hub’s proposition might seem to be to always leave data in its original location until it is needed. However, I expect Data Hub’s use cases to open a new paradigm in data warehousing. For example, together with BW/4HANA and Pure Storage’s solutions, the possibilities for building a Big Data strategy are limitless, easy, and cost-efficient. With FlashBlade, SAP customers can take advantage of a data platform designed to consolidate multiple workloads, including a Spark environment, to create a single version of trustworthy data. Technologies such as HANA and Hadoop living in harmony on Pure Storage systems open up new possibilities for SAP customers, letting them simplify their environments as gathering more data becomes a priority. In upcoming blogs, we will examine concepts and use cases where Data Hub, BW/4HANA, and FlashBlade come together to create a big data warehouse solution.