Data Center Management: Self-Healing Shifts to Self-Optimization
Rule of thumb: Don’t ask survey questions about long-term intentions. It is hard enough knowing what will happen six months from now, let alone in six years. Yet, rules are made to be broken. We are not writing about Vertiv’s “Data Center 2025: Closer to the Edge” report because it accurately predicts the future. Instead, the responses from over 800 data center professionals provide a glimpse of what 2025 may look like, examine how buzzwords like “self-healing” have fared, and illuminate new trends like “edge computing.”
Survey respondents have decided that data center management and control is moving toward a self-optimizing future. When the longitudinal study was first conducted in 2014, 43% indicated “self-healing” would be a key to the industry in 2025. Fast-forward five years, and fewer share that opinion. While both terms refer to automation, there are differences. Self-healing systems detect and resolve problems automatically. In contrast, self-optimization improves performance or reduces costs by, for example, routing a workload to a different cloud provider. Perhaps hands-on experience with vendor-promoted, self-healing technology has created skepticism. As in the burgeoning AIOps market, it is more realistic to optimize performance incrementally using automation tools that may or may not rely on artificial intelligence.
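The distinction above can be illustrated with a minimal Python sketch. Everything here is hypothetical (the `Provider` type, `heal`, and `pick_provider` are illustrative names, not any vendor's API): self-healing reacts to a detected failure, while self-optimization picks the cheapest routing option that still meets a latency target.

```python
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    cost_per_hour: float   # USD, hypothetical pricing
    latency_ms: float      # observed p99 latency, hypothetical

def heal(service_healthy: bool, restart) -> bool:
    """Self-healing: detect a failure and resolve it automatically."""
    if not service_healthy:
        restart()      # e.g., restart a container or fail over
        return True    # an action was taken
    return False       # nothing to fix

def pick_provider(providers, max_latency_ms: float) -> Provider:
    """Self-optimization: among providers meeting the latency SLO,
    route the workload to the cheapest one."""
    eligible = [p for p in providers if p.latency_ms <= max_latency_ms]
    return min(eligible, key=lambda p: p.cost_per_hour)

providers = [
    Provider("cloud-a", cost_per_hour=0.90, latency_ms=40),
    Provider("cloud-b", cost_per_hour=0.55, latency_ms=85),
    Provider("cloud-c", cost_per_hour=0.30, latency_ms=220),
]
best = pick_provider(providers, max_latency_ms=100)
print(best.name)  # cloud-b: cheapest provider within the 100 ms SLO
```

The point of the sketch is that self-healing is a binary, reactive fix, while self-optimization is a continuous trade-off between cost and performance, which is why the latter lends itself to incremental gains.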
There was also a shift to more realistic expectations regarding IT utilization, with significantly fewer data center professionals expecting average utilization at the network core to be extremely high or low. Five years ago, server virtualization was driving increased utilization rates. Now, on-demand cloud computing and containerization are catalysts for higher utilization. These advances have made variability in resource utilization less of an issue. In fact, there are countervailing forces such as edge computing and an emphasis on performance that are making redundant resources desirable in some cases. In addition, OpEx’s more prominent place in IT budgets means people are generally not as fixated on maximizing their CapEx technology investments.
According to Vertiv’s survey, high bandwidth will be the primary data requirement for edge applications in 2025. Many, but not all, edge use cases have high-performance requirements. For this reason, architects may be more willing to err on the side of caution and overbuild IT infrastructure. For other perspectives on the use cases most likely to need edge computing, we recommend looking at MobiledgeX’s Navigator. This interactive tool lets you adjust how four different factors will impact the relative uptake of different use cases.
A Final Aside on Methodology
A combination of observed data points, end-user surveys, and expert interviews is the best way to understand and analyze technology trends. When combined with an understanding of economics and market cyclicality, this approach lets industry analysts effectively guide the decisions of both product managers and investors. But modesty is probably the most important predictor of a forecaster’s success. In fact, multiple quantitative studies of pundits and public policy experts have found that the most accurate prognosticators are the ones who regularly say they don’t know what the future will bring. The best forecasters temper their predictions and are willing to quickly change their minds. Unfortunately, as Daniel Drezner explains in “The Ideas Industry,” there are too many incentives for “thought leaders” to proselytize their vision of the next world-changing idea.
Feature image via Pixabay.