HPE and Memory-Driven Computing: Has the future arrived?
Mon, 17th Oct 2016

Imagine a computer with hundreds of petabytes of fast memory that remembers everything about your history, helps inform real-time situational decisions, and enables you to predict, prevent, and respond to whatever the future brings.

That computer just might be here, thanks to Hewlett Packard Labs.

Memory-Driven Computing is the result of the first chapter of HPE's research into memory-driven computer systems.

The company showcased the concept at HPE Discover in Las Vegas last June. The idea is to redesign the fundamental architecture of computing and break away from the traditional computer model.

Paolo Faraboschi, HPE Fellow and vice president at Hewlett Packard Labs, says this will have a ‘dramatic impact’ in overcoming the limitations of today's technologies.

“For the past 60 years, we have been using the same computer systems and the only thing changing has been the massive amounts of data growing exponentially from our online world,” he says.

Faraboschi says that by 2020, 30 billion connected devices will be generating unprecedented amounts of data that our legacy systems cannot keep up with.

“Clearly we are being swamped with data. Back in 2007/2008 we started talking about the data deluge problem because of the number of devices per person, and because of IoT the number of devices around our environment is exploding,” says Faraboschi.

“At the same time, scaling of the memory technologies that are at the foundation of computing today will significantly slow down. We will need transformational changes to the way in which we collect, process, store, and analyse that data,” he adds.

HPE claims its Memory-Driven Computing is a new kind of computer that allows you to do things 'you can't even conceive today'.

Faraboschi, who has been with HPE for over 20 years, says the motivations behind the Memory-Driven Computing project were twofold.

“First, the shift in the workload and magnitude of data that we have seen the IT industry having to process. Then, on the technology side, a big shift in the technology progression that we are starting to see around server technologies.”

So what exactly is it that HPE is doing?

Memory-Driven Computing enables massive data sets and radically simplified programming, with hundreds of petabytes of persistent memory. The project aims to analyse, interact with, and derive insights from all of that data at real-time speeds.

Memory-Driven Computing puts the data first

Memory-Driven Computing collapses memory and storage into one vast pool of memory called universal memory. To connect the memory and processing power, HPE uses an advanced photonic fabric. Using light instead of electricity is key to rapidly accessing any part of the massive memory pool while using much less energy.
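The article stays at the concept level, but the programming model that a single pool of byte-addressable memory implies is concrete: applications operate on data structures in place, rather than serialising them to a separate storage tier. Below is a minimal sketch of that idea in C, using standard POSIX memory mapping as a stand-in for a persistent-memory pool. The mount path /mnt/pmem and the record layout are hypothetical; on real persistent memory the mapping would typically go through a DAX-capable file system, with cache flushes rather than msync alone providing durability.

    /* Sketch: treating "storage" as directly addressable memory.
     * A POSIX memory-mapped file stands in for a persistent-memory
     * pool. The path and record layout are illustrative only. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <unistd.h>

    struct record {            /* hypothetical application data */
        long   id;
        double score;
        char   note[48];
    };

    int main(void)
    {
        const char *path = "/mnt/pmem/records.dat"; /* assumed pmem-backed mount */
        int fd = open(path, O_RDWR | O_CREAT, 0644);
        if (fd < 0) { perror("open"); return 1; }
        if (ftruncate(fd, sizeof(struct record)) < 0) { perror("ftruncate"); return 1; }

        /* Map the "storage" straight into the address space. */
        struct record *r = mmap(NULL, sizeof(struct record),
                                PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
        if (r == MAP_FAILED) { perror("mmap"); return 1; }

        /* Update the data structure in place: no serialisation,
         * no per-record read/write system calls. */
        r->id = 42;
        r->score = 0.97;
        snprintf(r->note, sizeof r->note, "updated in place");

        /* Make the update durable (on real pmem, a cache flush). */
        msync(r, sizeof(struct record), MS_SYNC);

        munmap(r, sizeof(struct record));
        close(fd);
        return 0;
    }

The point of the sketch is the absence of a storage stack: there is no read/write loop and no on-disk format, just loads and stores to what is simultaneously memory and storage. HPE's claim is that the same model, scaled out over a photonic fabric, extends to pools hundreds of petabytes wide shared by many processors.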

“With Memory-Driven Computing, we believe we can broaden and impact technical innovations and develop new ways to extract knowledge and insights from large, complex collections of digital data with unprecedented scale and speed, allowing us to collectively help solve some of the world's most pressing technical, economic, and social challenges,” Faraboschi says.

Memory-Driven Computing is Hewlett Packard Labs' biggest project. A large team of researchers at Hewlett Packard Labs, and across Hewlett Packard Enterprise, is working to bring the concept to life.

“Hewlett Packard Labs is committed to revolutionising the computer from the ground up, enabling computers of all sizes to take a quantum leap in performance and efficiency,” the company says.

“It's all about turning your big data into secure, actionable intelligence, using less energy and lowering costs.”