How HPE’s ‘The Machine’ Project Could Modernize the Enterprise Data Center

25 Jul 2016 8:42am

In their early days, data centers ran perhaps 1,000 machines at a time. As the amount of data generated by applications and processes has grown by leaps and bounds, so too has the data center itself.

“Next generation data centers are taking today’s architecture and converting it into what people call warehouse-scale computing. Data centers used to be a place where machines were; now we’re beginning to think about the data center as the computer,” said Sharad Singhal, director of HPE Labs for machine applications and software.

In this episode of The New Stack Makers podcast, embedded below, we spoke with Singhal about HPE’s project to build a super-server called “The Machine,” which the company hopes will revolutionize the enterprise data center.

With the introduction of the Internet of Things, data is now being created all over the world at a scale far beyond anything the first computer scientists, such as Alan Turing, ever worked with, Singhal noted. The Machine project is built on the premise that the fundamental architecture of today’s computer must change. Rather than being limited by memory attached to a central processor at the core of the system, The Machine moves memory to the center of the architecture. In this design, The Machine’s memory is also persistent.

“Once I put data in there, I can read the data, do computation as I need it, and that compute memory is available if I need it for the next computation. Inside The Machine, any compute element can reach any memory element directly, without doing I/O semantics. At that scale I can build a true computer,” Singhal explained.
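To make that contrast concrete, here is a minimal C sketch of the general idea Singhal describes: persistent memory mapped straight into a program’s address space, so the CPU reaches it with ordinary loads and stores rather than read()/write() I/O calls. This is an illustration of the memory-mapped approach in general, not The Machine’s actual API; the path /mnt/pmem/data and the region size are hypothetical stand-ins for a persistent-memory region.

/* Sketch: accessing a persistent-memory region with load/store
 * semantics instead of I/O semantics. Assumes a hypothetical
 * persistent-memory-backed file at /mnt/pmem/data. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

#define REGION_SIZE (64 * 1024)   /* illustrative region size */

int main(void)
{
    int fd = open("/mnt/pmem/data", O_RDWR);   /* hypothetical pmem file */
    if (fd < 0) { perror("open"); return 1; }

    /* Map the persistent region directly into the address space.
     * From here on, no read()/write() calls are involved: the CPU
     * reaches the data with plain loads and stores. */
    char *pmem = mmap(NULL, REGION_SIZE, PROT_READ | PROT_WRITE,
                      MAP_SHARED, fd, 0);
    if (pmem == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    /* An ordinary store: the data lands in persistent memory and is
     * still there for the next computation, as the quote describes. */
    strcpy(pmem, "result of computation #1");

    /* An ordinary load reads it back -- same address, no I/O path. */
    printf("%s\n", pmem);

    munmap(pmem, REGION_SIZE);
    close(fd);
    return 0;
}

The design point the sketch highlights is the one Singhal makes: when memory sits at the center and every compute element can address it directly, persisting a result and consuming it in the next computation are the same operation as using it in the current one.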

HPE is a sponsor of The New Stack.
