
Four Companies Set the Stage for Programmable Infrastructure

19 Apr 2016 12:54pm, by

As the world of application development has evolved, so too have the infrastructures that support the storage and network demands of today’s enterprises. So it is not surprising that there is increasing buzz around programmable infrastructure, an emerging approach in which the automated practices and tools of DevOps are extended into managing system operations as well.

At the Intel Cloud Day 2016 event last month, The New Stack founder Alex Williams and TNS reporter Scott M. Fulton III met with executives from companies working in the programmable infrastructure space to learn more about the role the practice could play in the data center. The interviews have been captured as a series of podcasts, embedded below.

Paul Turner, Cloudian

In his interview, Paul Turner, chief marketing officer for storage company Cloudian, highlighted the many changes software defined infrastructure has brought to the DevOps community. In particular, he noted the ability for DevOps teams to control their entire development ecosystem from within an individual application. Turner noted that Amazon has set an impressive lead in this area, establishing its Simple Storage Service (S3) as one of the leading cloud storage services available. Cloudian has implemented a fully compliant Amazon S3 interface that developers can build new applications against, or drop into their existing infrastructure with ease.

“The infrastructure of the future is going to be software defined.”

Cloudian’s goal is to bring the storage capabilities of S3 to a broader set of ecosystems. With the increased storage, computing and resource capacity today’s cloud services offer, distributed data infrastructure has evolved rapidly. At Cloudian, mathematical protection, parity bits, and RAID-like technology help to protect user data across many servers. Turner explained that by utilizing this approach to distributed data storage, users are protected against data loss even if multiple servers fail.
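The parity idea Turner describes can be illustrated in miniature: a single XOR parity block allows any one lost data shard to be rebuilt from the survivors. This is a simplified sketch of the principle; production systems use stronger erasure codes (such as Reed-Solomon) to survive multiple simultaneous failures, as Turner's claim implies.

```python
# Sketch of XOR parity, the basic RAID-like protection scheme:
# one parity shard lets any single lost shard be reconstructed.
from functools import reduce


def xor_blocks(blocks):
    """XOR equal-length byte blocks together."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)


def make_parity(shards):
    """Compute the parity shard for a list of equal-length data shards."""
    return xor_blocks(shards)


def recover(shards, parity, lost_index):
    """Rebuild the shard at lost_index from the surviving shards plus parity."""
    survivors = [s for i, s in enumerate(shards) if i != lost_index]
    return xor_blocks(survivors + [parity])
```

If shard 1 of `[b"AAAA", b"BBBB", b"CCCC"]` is lost, XORing the other two shards with the parity block yields `b"BBBB"` again; the data survives the failure of that server.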

Ultimately, “Our goal is agnostic storage,” Turner said. In contrast to the static legacy storage of previous years, cloud storage turns the entire interface into an open, network-based and web-based setup, Turner noted. “The infrastructure of the future is going to be software defined. You’ll have a networking layer, server-quality service layer, all stitched together by application. That is what DevOps guys are going to build,” he said.

Garry Olah, Coho Data

Garry Olah, Coho Data’s vice president of business development, knows that developers often expect storage to scale automatically, without having to think about it. However, he noted that sometimes this just doesn’t happen the way a developer might have planned. The missing key, he said, is adding intelligence to the storage platform itself. “Hyper-convergence is interesting, but there is more intelligence to be exploited in the storage layer,” said Olah.

“Our vision is to turn the data center into a strategic, functioning business,” Olah noted. Often, when enterprise IT teams purchase traditional storage, they get caught in a Catch-22: over-provisioning frequently takes place from the start. Olah explained that provisioning months in advance means that companies will often buy more storage than they need. “True scale out happens when you buy what you need when you need it. You should be able to throw in another 45TB, and it should be running without you knowing it’s there,” he said.

Olah went on to explain that the workloads of the future are likely to be centered around OpenStack. He also explored the future of public cloud and hybrid infrastructure setups. Overall, Olah noted these changes in infrastructure will force private cloud providers to perform more like the public cloud in addressing the challenges their customers face.

Darrell Jordan-Smith, Red Hat

When one hears “Red Hat,” one usually doesn’t think of the telecommunications market. However, that’s exactly the market Red Hat is going after, according to Darrell Jordan-Smith, Red Hat’s vice president of worldwide service provider sales.

“My real job at Red Hat is to help telecommunications companies transition from where they’ve been traditionally, in terms of a very proprietary-based environment, to an open platform, open systems, open source environment,” Jordan-Smith said.

By helping enterprise telco organizations migrate from proprietary solutions to an open source structure, Jordan-Smith hopes to shift the infrastructure dynamic surrounding the industry. “Traditionally, because of regulations, these companies have depended on proprietary platforms in order to deliver solutions in a very linear way. [The] Cloud changes that entire ecosystem and dynamic,” said Jordan-Smith.

Red Hat is currently partnering with Nokia, Cisco, Juniper and software defined networking specialists to provide integration at the networking layer for telecommunications firms and enterprises just starting to re-define their infrastructure. Implementing infrastructure orchestration helps telco companies better measure their quality of service, as high-availability load balancing, resource abstraction, and scalability all offer significant benefits.

Red Hat remains at its core an open source company. “Our role within AT&T, China Mobile, and Verizon is to coach around how they create a community around their projects,” said Jordan-Smith.

Tarkan Maner, Nexenta

Tarkan Maner, chairman and CEO at Nexenta, wants not only to help users store their data in whatever cloud suits them, but also to offer efficiency and affordability in an area long known for its high costs. “Intel provides [the] infrastructure for Nexenta to help developers make software work at a never before seen cost on commodity standard infrastructure,” said Maner.

Maner estimates that over 60 percent of IT costs are tied to data storage in some way, costs Nexenta hopes to help its customers cut. Whereas a terabyte of storage runs $19.99 in the public cloud, many businesses pay far more to add the same capacity. “Enterprises are paying $1,000 per TB for mafia storage systems, on a standard Seagate drive in a legacy hardware enclosure. These vendors created an industry and inhibited innovation,” Maner explained.

When it comes to programmable infrastructure, Maner explained that the key focus is as much on the ability to automate one’s workload as it is on providing customers with quality service. “Programmable infrastructure is about repeatability. How you deploy, test, and maintain systems, it’s a full 360-degree cycle. Companies want programmability and standardization, but they also want smart, educated, trained channel partners who learn and teach, that can help customers when something goes wrong,” Maner said.

Intel is a sponsor of The New Stack.

Feature Image: The Broad, Los Angeles, taken by Julien Moreau.
