HP Enterprise Chief Architect: Computing’s New Dawn Beyond the Twilight of Moore’s Law

No doubt you’ve heard of Moore’s Law. The premise put forth by Gordon Moore holds that the number of transistors that can be packed onto an integrated circuit doubles at a regular interval, originally every year. That’s held true over decades, and the doubling period has since stretched to about a year and a half.

The trend is slowing inexorably, which means the performance of those processors will no longer double on that 18-month cadence. The chips, already running at full load, so to speak, will tap out and top out.
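To make that cadence concrete, here is a rough, illustrative sketch; the 18-month doubling period and the one-billion-transistor baseline are assumptions for the example, not figures from HPE or the interview. Compounding at that rate multiplies transistor counts by roughly 100x per decade, and the multiplier shrinks quickly as the cadence stretches.

```python
# Illustrative only: project transistor counts under an assumed doubling period.
# The 18-month cadence and the starting count are assumptions, not HPE data.

def projected_transistors(start_count: float, years: float,
                          doubling_period_years: float = 1.5) -> float:
    """Return the transistor count after `years`, doubling every `doubling_period_years`."""
    return start_count * 2 ** (years / doubling_period_years)

if __name__ == "__main__":
    start = 1.0e9  # assume a 1-billion-transistor chip as a baseline
    for years in (1.5, 5, 10):
        print(f"After {years:>4} years: {projected_transistors(start, years):.2e} transistors")
    # A decade at an 18-month cadence is 2**(10/1.5), roughly a 100x increase;
    # stretch the cadence to 3 years and the same decade yields only about 10x.
```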

So, in a post-Moore’s Law world: What lies beyond? Cloud computing has its limits, as it must grapple with the intricacies and minutiae of thousands of simultaneous processes that mark a firm’s operations. There’s a yawning chasm at times between the promise and performance of cloud computing.

In an interview with PYMNTS’ Karen Webster, Kirk Bresniker, chief architect of Hewlett Packard Labs with HPE, said a gap is widening between processing power and the needs of a data-driven world. The tailwind provided by Moore’s Law is diminishing, so hybrid cloud computing becomes especially important.

“We are in the twilight of that Moore’s law scaling,” he said. Now the stage is set for HPE to examine which technologies can continue that scaling. “And every year we create as much information as mankind has ever recorded — from cave paintings to now.”

The demands of new systems that until recently were only figments of imagination, from autonomous vehicles to 5G communications, mean there are only microseconds available for computers, companies, data-driven analytics and sometimes humans to make decisions.

The key, then, is to move data collection, analysis and decision-making away from central data centers and toward the edge, where activity happens: the point where sensors and consumers interact.

Applying artificial intelligence across a variety of use cases means businesses can scale efficiently, without necessarily running hardware and components at maximum load even as data demands increase.
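As a loose illustration of that pattern, the minimal sketch below (all names, values and thresholds are hypothetical) analyzes a window of raw sensor readings where they are produced and forwards only a compact summary, rather than shipping every byte back to a central data center.

```python
# A minimal sketch of edge-side processing, assuming a hypothetical sensor feed:
# analyze readings where they are produced and transmit only a small summary.

from dataclasses import dataclass
from statistics import mean
from typing import Iterable

@dataclass
class Summary:
    count: int
    average: float
    anomalies: int  # readings above an assumed operating threshold

def summarize_at_edge(readings: Iterable[float], threshold: float = 90.0) -> Summary:
    """Reduce a window of raw sensor readings to the few values worth transmitting."""
    values = list(readings)
    return Summary(
        count=len(values),
        average=mean(values) if values else 0.0,
        anomalies=sum(1 for v in values if v > threshold),
    )

if __name__ == "__main__":
    raw_window = [72.0, 75.5, 91.2, 68.9, 95.0]  # hypothetical sensor values
    print(summarize_at_edge(raw_window))  # only this summary would leave the edge
```

The design choice is the point: the raw readings stay local, and only the decision-relevant aggregate travels upstream.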

Bresniker noted that the confluence of innovation and real-world application takes time. The gestation period can be several years, he said.

“There has to be some opportunity that is either underserved or unarticulated that your ingenuity and that opportunity can conceive. And then you do need that support — you need that business leadership that understands … your opportunity — [coupled with] finite resources [and] in finite time, with rewards that outweigh risk.”

In an age where memory is scarce and computing is ever cheaper, Bresniker said he spins that reality on its head to get companies to see computing and analytics in a new light. Namely: What if those constraints were in place forever? How would companies reimagine their architecture, and what would they get out of it?

Where companies once spent money on IT simply to save money and reduce costs, IT, operating across deep learning structures, can now be harnessed to create economic gains.

“The IT team can take a seat at the big table and say I am not here as an expense center; I am here as a profit center because the data is becoming so valuable,” he told Webster. The data is more valuable than the process. And many companies, he said, throw away valuable data.

“Almost all of the data that the enterprise creates can be turned into economic activity by running it through an information lifecycle,” he explained, if only it were moved and analyzed beyond the data center.

Bresniker posited that over the next decade, three out of every four bytes created by the enterprise may never reside in a data center.

Against that backdrop, HPE has launched several initiatives aimed at boosting deep learning at the enterprise level and improving the way data is used. Among them is OneSphere.

HPE OneSphere, now generally available, is a Software-as-a-Service-based “hybrid” solution focused on managing both a company’s on-premises cloud and its public cloud resources.

The hybrid nature of the offering allows companies to build and deploy virtual machines, shortening application development time, and to analyze power consumption and cost as systems run, all through what HPE terms a “unified view.”

As one use case, he offered the example of the German Center for Neurodegenerative Diseases, which harnessed such technology in a study of Alzheimer’s disease spanning 30,000 patients and 30 years of data and found that genomic sequence analysis could be shortened from 23 minutes to 13 seconds per analysis.
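For context, that works out to roughly a 106x speedup per analysis; the quick back-of-the-envelope calculation below uses only the figures cited above.

```python
# Back-of-the-envelope: the per-analysis speedup implied by the cited figures.
before_seconds = 23 * 60   # 23 minutes per analysis
after_seconds = 13         # 13 seconds per analysis
print(f"Speedup: ~{before_seconds / after_seconds:.0f}x")  # ~106x
```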

“What this does is change the way we do research,” he told Webster. “If I gave you something that was 10,000 times faster [for analyzing data], would you change your business process to match, or would you change the business you’re in?”

“You can either be one of these hyper-competitive enterprises that are driven by real-time data analytics or machine learning, analyzing relentlessly every aspect of your business either to improve productivity [or] to improve customer experience,” he told Webster, “or you’ll be desperately wondering how to compete against them.”