Technology /

Virtual Instruments, a Service Developers Use to Correlate App Development and Infrastructure

3 Aug 2015 9:26pm

An interview with Donnie Berkholz, director of Development, DevOps, and IT Ops at 451 Research, was added to the end of this story less than a day after it was published.

VirtualWisdom, from Virtual Instruments, uses a combination of agentless hardware and software “probes” to collect data about what’s happening inside virtual machines, network switches, physical servers and storage systems. It correlates the data in real time, then runs an analytics engine on top that delivers “context-aware” analytics: specific advice about scenarios that are common problems in the data center.
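As a rough illustration of what correlating probe data “in real time” can mean (a hypothetical sketch, not Virtual Instruments’ actual implementation), samples from separate probes can be aligned on a shared time window, so that a storage-latency spike can be viewed next to the VM and switch metrics from the same instant:

```python
from collections import defaultdict

def correlate(samples, window_secs=5):
    """Group probe samples into time buckets so metrics from different
    layers (VM, switch, storage) can be viewed side by side.

    samples: iterable of (timestamp_secs, probe_name, metric, value)
    Returns {bucket_start: {probe_name: {metric: value}}}.
    """
    buckets = defaultdict(lambda: defaultdict(dict))
    for ts, probe, metric, value in samples:
        bucket = int(ts // window_secs) * window_secs
        buckets[bucket][probe][metric] = value
    return buckets

# Example: a storage latency spike lines up with a burst of VM IO
# and a hot switch port in the same 5-second bucket (t=100).
samples = [
    (100.2, "vm-probe", "iops", 1800),
    (101.7, "san-probe", "latency_ms", 42),
    (102.1, "switch-probe", "port_util_pct", 91),
]
aligned = correlate(samples)
```

The probe names and metrics here are invented for the example; the point is only that cross-layer correlation reduces to aligning independently collected samples on a common clock.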


Now the company has added support for Microsoft Hyper-V and IBM PowerVM. It previously was limited to VMware’s vSphere, said Barry Cooks, senior vice president of products, engineering and support.

The data center is more complex than ever. Virtual Instruments, which competes with the likes of Cisco and Brocade, uses software to make recommendations, helping ops teams manage the complexities, especially as distributed infrastructures become more widely used by enterprise operations. For example, as part of its new release, the company is adding a new analytic called the VM coordinator that can look at each VM in the cluster and make specific recommendations, Cooks said.

VirtualWisdom “doesn’t just tell you something is wrong, but instead tells you how to fix it,” Simon Robinson, research vice president at 451 Research, said in the release announcement.

Former Symantec Corp. Chief John W. Thompson heads the San Jose, California-based company, founded in 2008. Its customers span a range of industry verticals, including NASA, AT&T, Siemens, Microsoft and VMware.

“It not only says move VM A from host X to host Y, but if you do that, we expect to see the following improvement in CPU utilization, your memory utilization and IO utilization — across the entire cluster,” Cooks said. This capability will be available only for vSphere in this release, but will be coming for Hyper-V and PowerVM. That’s because its data scientists need to “train” the algorithm with real-world data, which they’re still in the process of doing, Cooks said.
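The shape of the recommendation Cooks describes can be sketched like this (purely illustrative; VirtualWisdom’s projections come from a trained model, while this naive version just re-sums per-host CPU before and after the move):

```python
def project_move(hosts, vm_cpu, src, dst):
    """Project per-host CPU utilization if a VM consuming vm_cpu
    cores moves from host `src` to host `dst`.

    hosts: {name: {"used": float, "capacity": float}} in CPU cores.
    Returns (before, after): dicts of {host: utilization percent}.
    A real model would also weigh memory, IO and contention effects.
    """
    before = {n: 100 * h["used"] / h["capacity"] for n, h in hosts.items()}
    moved = {n: dict(h) for n, h in hosts.items()}
    moved[src]["used"] -= vm_cpu
    moved[dst]["used"] += vm_cpu
    after = {n: 100 * h["used"] / h["capacity"] for n, h in moved.items()}
    return before, after

# Hypothetical two-host cluster: moving an 8-core VM off the hot host
# evens out utilization (87.5%/31.25% becomes 62.5%/56.25%).
hosts = {"host-x": {"used": 28, "capacity": 32},
         "host-y": {"used": 10, "capacity": 32}}
before, after = project_move(hosts, vm_cpu=8, src="host-x", dst="host-y")
```

The interesting output of such a projection is the change in per-host imbalance, since a naive move leaves the cluster-wide total unchanged.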

“This is more of our move into predictive analytics to drive better future behaviors for workloads,” he said.

Developers use the service in app deployment, he said, to better understand what the application actually does to the infrastructure on which it resides. The data can be used in test and development to validate the behaviors developers want to see in production beforehand. Database teams can also use the data to tell whether a change they’re about to roll out is going to cause a change in I/O pattern.

“It can catch mistakes in rollout where they were dropping indices and causing table scans. Our depth of collection of data is such that we understand read/write patterns, random versus sequential patterns, IO size changes — all those things we capture and can report back on or even alarm on. It allows the application teams to have a much higher degree of confidence in what they’re about to roll out. If it’s something they do roll out, it allows them to react much more quickly to what can be very complicated multi-tier applications in the data center,” he said.
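The dropped-index scenario Cooks mentions can be made concrete with a toy detector (an assumed sketch, not the product’s algorithm): classify I/O as random versus sequential, then alarm when the mix shifts sharply against the pre-rollout baseline.

```python
def sequential_ratio(offsets):
    """Fraction of requests whose offset immediately follows the
    previous one -- a crude random-vs-sequential signal."""
    if len(offsets) < 2:
        return 1.0
    seq = sum(1 for a, b in zip(offsets, offsets[1:]) if b == a + 1)
    return seq / (len(offsets) - 1)

def io_pattern_alarm(baseline_offsets, current_offsets, threshold=0.3):
    """Alarm when the sequential ratio drifts more than `threshold`
    from the pre-rollout baseline -- e.g. a dropped index turning
    indexed lookups into full table scans (long sequential reads)."""
    drift = abs(sequential_ratio(current_offsets)
                - sequential_ratio(baseline_offsets))
    return drift > threshold

# Baseline: scattered (indexed) reads. After rollout: one long scan.
baseline = [40, 7, 93, 12, 65, 3]
current = list(range(100, 120))
fired = io_pattern_alarm(baseline, current)  # True: reads went fully sequential
```

Real block-level telemetry would account for request sizes and queue depth as well, but the principle is the same: compare observed access patterns against the workload’s established baseline.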

While it uses an appliance for hardware- and software-based data collection, it has opened a beta on a cloud model to ease on-boarding of new customers. It also has betas under way for data performance for network-attached storage (NAS) and Fibre Channel over Ethernet (FCoE) storage protocols.

VMs are not going away anytime soon, so supporting them remains an enormous problem and market opportunity for vendors in the space, according to Donnie Berkholz, director of Development, DevOps, and IT Ops at 451 Research.

“While Docker has grown incredibly in popularity, some of the earliest adopters in production have begun to realize it’s not the solution for all problems,” he said.

“What’s happening now is that use cases are slowly sieving themselves out to either Docker containers or VMs, and we’ll have a much better sense over the next year of when to use what.”

A huge number of enterprises are still not using Docker at all. The most recent data at 451 Research shows that roughly 20 percent have containers in proof of concept or production, leaving 4 out of 5 that don’t.

“That’s a significant group to tap into, particularly for vendors that are able to guide them along the full adoption path, whether it ends up at VMs or containers,” he said.

Docker, IBM and VMware are sponsors of The New Stack.

Feature image: “Coordinated Effort” by Garry Knight is licensed under CC BY 2.0.


