Traditional storage companies are living in an uncertain world. For a long time, they’ve been part of the central IT equation of compute, network and storage — the holy trinity that shall not be divided — but the traditional world looks very different these days. The rise of container technologies such as Docker and container orchestration systems like Kubernetes has changed the way that companies operate their IT systems, while the shift to DevOps has changed the way that developers and IT personnel are deployed within organizations.
How do traditional storage vendors cope with this new landscape? NetApp is typical of this world. According to IDC, the company is third in the storage market, behind Dell and HPE. But unlike those two, NetApp has no major computing side to its business: it stands or falls on its storage.
The company’s 2015 acquisition of SolidFire and its line of solid-state drive appliances was really about building storage platforms for the next generation: users who are not overly bothered by infrastructure.
“The majority of storage companies are already exploring this path,” said Martin Cooper, senior director of next generation data center at NetApp. “A lot of our internal development teams use these technologies. Much of our expertise came about through the acquisition of SolidFire last year.”
Alex McDonald, who works in the NetApp Chief Technology Office and is also the vice-chair of storage industry association SNIA Europe, said that the storage industry would cope with containers in the same way that it coped with the move to virtualization.
“Containers are a lot easier to handle because they’re physically smaller: they’re easier to manage, less overhead,” he said.
But, he added, there was a particular problem when it came to handling container storage: by their very nature, containers are temporary. They had nothing to store, said McDonald; they were simply thrown away. Yet there are occasions when applications do require storage. “The big issue with containers is providing persistent storage available in an easy-to-consume form. To get over this, vendors have to make sure that the whole process is seamless.”
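McDonald’s point is the one Kubernetes addresses with volume mounts: the container stays disposable, while durable state lives behind a named claim. A minimal sketch of the developer-facing side, with the manifest built as a plain Python dict so its shape is easy to see (the names `app` and `app-data` are illustrative, not from the article):

```python
# Illustrative sketch: how a developer "consumes" persistent storage in
# Kubernetes. The container itself remains ephemeral; durable data sits
# behind a PersistentVolumeClaim referenced by name.

def pod_with_persistent_volume(claim_name: str) -> dict:
    """Build a minimal pod manifest that mounts an existing claim."""
    return {
        "apiVersion": "v1",
        "kind": "Pod",
        "metadata": {"name": "app"},
        "spec": {
            "containers": [{
                "name": "app",
                "image": "nginx",
                # The mount point inside the (ephemeral) container...
                "volumeMounts": [{"name": "data", "mountPath": "/var/lib/app"}],
            }],
            # ...is backed by a claim that outlives any one container.
            "volumes": [{
                "name": "data",
                "persistentVolumeClaim": {"claimName": claim_name},
            }],
        },
    }

manifest = pod_with_persistent_volume("app-data")
print(manifest["spec"]["volumes"][0]["persistentVolumeClaim"]["claimName"])  # app-data
```

From the developer’s point of view this is the “easy-to-consume form” McDonald describes: the pod names a claim, and where the bytes actually live is someone else’s problem.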
For NetApp, this has meant not just a change in the way that products are developed but a new mindset when it comes to selling. It is no longer selling to hardware specialists who have to configure devices but to developers who program them; that is a crucial difference, said Cooper. And it has entailed changes to the product set.
“We have made a lot of enhancements to the core NetApp product. These include integration with OpenStack and CloudStack as well as integration with VMware and KVM virtualization,” said Cooper.
But there are other changes too: the company has changed the way it sells the product. “We’re now offering multiple consumption choices,” said Cooper. “We have four basic ways to consume the product: cloud, a pre-engineered infrastructure, best of breed deployment and as white box tin. And customers have a choice in how to buy those models too: whether from us or a service provider.”
The company has taken some steps to embrace the new world, said Cooper. Last December, it launched Project Trident, an open source storage orchestration tool for Kubernetes environments. It lets users provision applications through Kubernetes and consume storage resources more easily: specifically, it enables the creation and deletion of storage objects such as iSCSI LUNs.
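The request that a provisioner like Trident would fulfil is itself an ordinary Kubernetes object. A hedged sketch of that request, again as a Python dict; the storage class name `trident-gold` is hypothetical, standing in for whatever class the cluster exposes:

```python
# Illustrative sketch of a Kubernetes PersistentVolumeClaim: the kind of
# storage request a dynamic provisioner would satisfy behind the scenes,
# for example by creating an iSCSI LUN. The class name "trident-gold"
# is hypothetical.

def persistent_volume_claim(name: str, size_gi: int, storage_class: str) -> dict:
    """Build a minimal PersistentVolumeClaim manifest."""
    return {
        "apiVersion": "v1",
        "kind": "PersistentVolumeClaim",
        "metadata": {"name": name},
        "spec": {
            "accessModes": ["ReadWriteOnce"],
            "storageClassName": storage_class,  # selects which backend provisions it
            "resources": {"requests": {"storage": f"{size_gi}Gi"}},
        },
    }

claim = persistent_volume_claim("app-data", 10, "trident-gold")
print(claim["spec"]["resources"]["requests"]["storage"])  # 10Gi
```

The design point is the one Cooper makes about selling to developers: the developer asks for “10Gi of the gold class” in the cluster’s own vocabulary, and the creation and deletion of the underlying storage object happens without anyone configuring an array by hand.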
As for the future, there are still challenges ahead for storage companies. Containerization is one thing to deal with, McDonald pointed out, but the rise of microservices opens a new area for exploration, and the issue of persistent storage is going to be a major part of it. “How do we deal with the challenge of microservices?” he asked.
For Cooper, one of the main ways the company is embracing the future is by making better use of data analytics. “There is a definite place for using IoT and machine data. There’s also going to be a definite move to object storage, and we’re going to be looking at more flexible deployment options.” The use of AI and metadata will greatly improve reporting, said Cooper, as there will be the ability to predict problems by analyzing what has happened.
Above all, the changes are going to be in the way that products are sold, as the industry moves away from the hardware refresh cycle. “We understand that the guys who work in this environment don’t want the slick sales calls,” said Cooper. Hence the launch of netapp.io, NetApp’s hub for developer-centric content, including code and blogs, and the introduction of a Slack channel where developers can share experiences.
The world has changed. “Look at the way that a CIO at an investment bank will buy. He’ll be looking to buy services for his DevOps guys rather than what he’s bought in the past: some applications will never make this leap, not if they require a particular type of processor or array,” he said. “These Kubernetes-like approaches are bleeding back into the core data center and the CIOs are just not buying what they bought the year before and the year before that.”
The New Stack is a wholly owned subsidiary of Insight Partners, an investor in the following companies mentioned in this article: Docker.