
News: Analysis

It’s time to dump Moore’s Law to advance computing, researcher says

An end to Moore’s Law will prompt chipmakers to think outside the box, reveals Agam Shah

Dumping Moore’s Law is perhaps the best thing that could happen to computers, as it will hasten the move away from an aging computer architecture that is holding back hardware innovation.

That’s the view of prominent scientist R. Stanley Williams, a senior fellow at Hewlett Packard Labs. Williams played a key role in HP’s creation of the memristor in 2008.

Moore’s Law is an observation made by Intel co-founder Gordon Moore in 1965 that has helped make devices smaller and faster. It predicts that the density of transistors will double every 18 to 24 months, while the cost of making chips falls.

Every year, computers and mobile devices that are significantly faster can be bought for the same amount of money, thanks in part to guidance from Moore’s Law. The observation has helped drive up device performance on a predictable basis while keeping costs down.
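The doubling prediction above is easy to sketch as arithmetic. A minimal illustration follows; the starting transistor count (Intel’s 4004 from 1971, roughly 2,300 transistors) and the two-year doubling period are illustrative assumptions, not figures from the article:

```python
def transistors_after(years, start=2_300, doubling_period_years=2.0):
    """Project a transistor count forward under Moore's Law:
    the count doubles once per doubling period."""
    return start * 2 ** (years / doubling_period_years)

# With a two-year doubling period, 20 years gives ten doublings:
# 2,300 * 2**10 = 2,355,200 transistors.
print(round(transistors_after(20)))
```

Compounding is what makes the observation so powerful: ten doublings multiply the count by more than a thousand, which is why each chip generation can deliver large, predictable gains at similar cost.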

But the predictions tied to Moore’s Law are reaching their limits as it becomes harder to make chips at smaller geometries. That’s a challenge facing all top chipmakers, including Intel, which is changing the way it interprets Moore’s Law as it tries to cling to it for dear life.

Williams is the latest to join a growing cadre of scientists who predict Moore’s Law is dying. The end of Moore’s Law “could be the best thing that has happened to computing in decades,” Williams wrote in a research paper published in the latest issue of the IEEE journal Computing in Science & Engineering.

The end of Moore’s Law will bring creativity to chip and computer design and help engineers and researchers think outside the box, Williams said. The law has bottled up innovation in computer design, he hinted.

So what’s next? Williams predicted there will be computers with a series of chips and accelerators patched together, much like the early forms of superfast computers. Computing could also be memory driven, with a much faster bus delivering speedier computing and throughput.

The idea of a memory-driven computer plays to the strengths of HPE, which has built The Machine along those lines. The initial version of The Machine has persistent memory that can be used as both DRAM and flash storage, but it could eventually be based on memristors, an intelligent form of memory and storage that can track data patterns.

Memory-driven computing could also break down the processor-centric architecture that currently dominates the computer market.

In the longer term, neuromorphic chips designed around the way the brain works could drive computing. HPE is developing a chip designed to mimic a human brain, and similar chips are being developed by IBM, Qualcomm, and universities in the US and Europe.

“Although our understanding of brains today is limited, we know enough now to design and build circuits that can accelerate certain computational tasks,” Williams wrote.

Applications such as machine learning highlight the need for new types of processors. IBM has benchmarked its neuromorphic chip, called TrueNorth, as faster and more power-efficient than conventional chips used for deep learning, such as GPUs.

Williams suggested ASICs (application-specific integrated circuits) and FPGAs (field-programmable gate arrays) could play a role in driving computing beyond Moore’s Law. These technologies will use superfast interconnects such as Gen-Z, which was introduced last year and will be supported by major chipmakers and server makers, including Dell and Hewlett Packard Enterprise.

Quantum computers are also emerging as a way to replace today’s PCs and servers, but they are still decades away from running everyday applications.

Photography: Peter Sayer

July 2017 www.pcadvisor.co.uk/news
