Open Source and the Cloud Native Data Center
The number of open source components inside services and applications continues to grow exponentially, and this adoption is changing how software is created, deployed and managed. In 2016, applications had an average of 86 open source software components.
Today, the average number of components is 528, according to the 2021 Open Source Security and Risk Analysis report by Synopsys.
In this latest edition of The New Stack Makers podcast, we convened a panel to discuss the explosion in open source adoption and its effect on data center operations.
The guests were Mark Hinkle, co-founder and CEO of TriggerMesh; Shaun O’Meara, field chief technology officer of Mirantis; Jeremy Tanner, developer relations at Equinix; and Sophia Vargas, research analyst in the open source programs office at Google.
More Choices, More Complexity
The Tinkerbell open source project, initiated by Equinix and used to deploy and manage bare metal infrastructure, is a good example of DevOps teams’ interest in improving on-premises operations at data centers. With Tinkerbell, “open source developers are able to see how their projects run on bare metal, whether that be Arm, x86 or whatever new [processors] are becoming available,” Tanner said.
Open source infrastructure is increasingly API-driven, especially compared to over 25 years ago when Hinkle started his career, he said. In the past, he said, he saw many more servers than today: “Now we look at interfaces and that makes it easy and interesting, because we’re integrating and composing infrastructure, not building it in the same way.”
Open source adoption and the number of associated choices are also creating more complexity at the data center, Vargas said.
“Now that we have so many choices, we essentially have access to all of the bleeding-edge ideas of new tooling,” she said. “And part of this sort of discussion — of choosing an opinionated stack or choosing something that is a vetted solution that piles things together — is something that you can trust someone else to support for you.”
Moving Data Back on Premises
O’Meara said more organizations are beginning to take a more nuanced view of whether all of their data needs to live in the public cloud.
“More companies are starting to look at their strategy and wondering if the [public cloud] is the right way to go and is the only option,” he said. “We’re seeing more companies coming to talk to us, as an example, about ‘How do we balance out our usage of our on-premises infrastructure, and then consume public cloud when we need to?’
“People are trying to move back on-prem and are changing the way they view the data center. So instead of the data center being the physical box that they keep everything in, they are seeing the data center as a key off point for their workloads.”