AT&T Embraces Network Functions Virtualization and May Open Source its NFV Platform
AT&T released details Tuesday of a new cloud-based network functions virtualization (NFV) platform it is building out, one that will move the company away from old, physical infrastructure and toward a virtualized, IP-based network.
The telecommunications giant is also mulling the idea of releasing the NFV software it is developing, called ECOMP (Enhanced Control, Orchestration, Management, and Policy), as open source.
“Where we go from here depends on the feedback we get from the cloud and developer communities,” said AT&T Chief Strategy Officer John Donovan, speaking at the Open Network Summit 2016, an open source conference, sponsored by the Linux Foundation. “In times past, AT&T would have announced to you our plans and our path forward. Today, we want to know what you think about our vision and direction. We need to know that you’re willing to collaborate with us, to contribute your time, your effort, and your code to this initiative.”
The Lack of a Guarantee
AT&T wants a cloud-based platform for the orchestration of the high-volume processes required to run high-speed data and voice networks. To that end, a company white paper posted Tuesday spelled out the objectives for its future network:
“ECOMP enables the rapid onboarding of new services (created by AT&T or 3rd parties) desired by our customers and the reduction of OpEx and CapEx [operational expenditures and capital expenditures] through its metadata-driven service design and creation platform and its real-time operational management framework – a framework that provides real-time, policy-driven automation of management functions. The metadata-driven service design and creation capabilities enable services to be defined with minimal IT development required thus contributing to reductions in CapEx. The real-time OMF [operational management framework] provides significant automation of network management functions enabling the detection and correction of problems in an automated fashion contributing to reductions in OpEx.”
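The metadata-driven approach the white paper describes can be illustrated with a small sketch. To be clear, ECOMP's actual descriptor format and APIs are not public in this account, so every name below is hypothetical: the idea is simply that a service is defined as data, and a generic engine onboards it without service-specific code.

```python
# Hypothetical sketch of metadata-driven service onboarding, in the spirit
# of the white paper's description. None of these names come from ECOMP.

FIREWALL_SERVICE = {
    "name": "virtual-firewall",
    "vnfs": ["packet-filter", "logger"],  # virtual functions to deploy
    "policies": [
        # OMF-style rules: detect a problem, correct it automatically
        {"if": "cpu_load > 0.9", "then": "scale_out"},
        {"if": "instance_down", "then": "restart"},
    ],
}

def onboard(service: dict) -> list[str]:
    """Generic engine: turns a service definition (pure data) into a list
    of deployment and monitoring actions. Because no service-specific
    code is written, new services need "minimal IT development" -- the
    claimed source of the CapEx reduction."""
    actions = [f"deploy {vnf}" for vnf in service["vnfs"]]
    actions += [f"watch {p['if']} -> {p['then']}" for p in service["policies"]]
    return actions

print(onboard(FIREWALL_SERVICE))
```

The design point is the split: service designers edit metadata, while a single policy-driven runtime handles deployment and automated correction for every service.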
Any telecommunications company would benefit from a network that is as configurable, adaptable, and scalable as a Docker-enabled data center. But real-time telecommunications workloads cannot scale the same way as most application workloads. A telco has to scale not just the workloads and the traffic, but the virtual network components as well. Waiting for the data center virtualization market to devise a solution would not appear to be an option for AT&T, which virtualized some 5.7 percent of its network components last year, according to Donovan, yet plans to boost that percentage to 30 percent by the end of this year.
“We’re making a deliberate effort to expand our pool of suppliers beyond just the big traditional players,” the CSO told attendees, “to reduce that vendor lock-in. Now, we have no objection to working with any of these folks. But we want to bring more competition to this space in order that we speed up innovation, we lower costs, and very importantly, we reduce our cycle times.”
Donovan credited OpenStack, OPNFV, OpenDaylight, network virtualization platform OpenContrail, the Open Networking Lab, the Open Container Initiative, the Cloud Native Computing Foundation, and Facebook’s Open Compute Project with playing key roles in his company’s infrastructure transformation.
But he also made clear that the way the standards process and the open source process have worked in the past is no longer sufficient for the needs of AT&T.
I asked Donovan whether AT&T had a high watermark of sorts — for instance, a number of new competitors entering the open source NFV field, or a number of new contributors to the ECOMP feedback process — that would determine when the company would be ready to commit to releasing ECOMP to the open source community.
“What we think about is, every space needs to have someone who’s innovative and disruptive,” he said. “Every space has to have someone who has experience and scale. And then what we try to do is make sure that we don’t confuse the research-intensive activities from the non-research-intensive activities, and end up getting wired backwards.”
For “an innovative disruptor that’s small and moving quickly, putting them in a research-intensive environment is a prescription for production problems,” Donovan said.
AT&T uses a scoreboard to help it quantify its needs. There’s a risk evaluation process for each instance of assessing a customer’s needs, and determining whether the virtualization of a legacy function or its replacement with a new function would fulfill that assessment, Donovan said. He declined to specify use cases but said he may be able to provide more quantitative analysis of developers’ opinions of ECOMP later this year.
“I can’t tell you how much we operate as if our hair was on fire. So we know the problem, we’re solving it, and then eventually as we get together and say, how do we solve it in a multi-vendor environment, we’ll get to that,” he said. “But we’re not going to sit around and endlessly debate it now and have a theoretical problem solved a year from now. We need a practical solution.”
AT&T is building an integrated cloud environment that will let developers conduct their own integration tests in a “very advanced sandbox,” Donovan said. Integration testing is necessary to weigh the advantages of multiple alternatives.
“Standards is sometimes a dirty, muddy process. And open source is just a different version of a standards process. So this is an important summer, to see how some of that stuff plays out,” Donovan said.
Later that day, I asked AT&T Labs’ legendary distinguished architect Margaret Chiosi – the president of OPNFV, and a long-time advocate of open source in telecommunications – whether, by releasing something as important as ECOMP into open source, AT&T has weighed the risk of effectively becoming the research arm of Verizon, Orange, BT, and Telefonica.
“We’re opening up the kimono, if I can say that, to get feedback and see if the industry thinks it’s valuable,” Chiosi responded. “If it’s valuable, we are encouraging a broader ecosystem. If the ecosystem decides it’s not valuable enough for the industry, well, then, we’ll just keep doing what we’re doing.”
But if it is valuable, then open source should push the technology beyond AT&T’s boundaries, shouldn’t it?
“The whole value proposition of open source is that, when you open source something, you hope there is a broader community that will innovate on it,” she answered. “And then the industry as a whole, as well as AT&T as a whole, gets value from that.
“We do not have the brain trust of the whole industry. And we know what we’re innovating on is based on our silo, in some sense,” she said. “So by having a broader participation, if the industry thinks it’s worth it, then you hopefully have a broader brain trust that can come up with things we might not have thought of, that in the end, we all can operationalize.”
Docker is a sponsor of The New Stack.
Photos of John Donovan of AT&T, by Scott M. Fulton III.