Parallel Processing: Rethinking the Computer

Back in 1965, Intel cofounder Gordon Moore predicted that the semiconductor industry could double the number of transistors on a chip every 12 months (he later amended it to 24 months) for about the same cost. And for half a century, Moore's Law has held true, making computers cheaper, faster, and more powerful. For almost as long, experts have been warning that Moore's Law would eventually run smack into the laws of physics, bringing everyone's giddy ride to an end. It hasn't happened yet. Justin Rattner, the chief technology officer at Intel, insists the company can keep doubling the number of transistors on a processor through several more generations of chips over the next decade.

The trouble isn't capacity; it's speed. A few years ago microprocessor clock speeds plateaued at around 3 GHz. Push them much faster and they overheat and start to melt. To solve that problem, the industry began making chips that do several tasks at once, instead of doing a single thing faster and faster. These days we're seeing dual-core and quad-core chips—in essence, processors with two or four tiny computer engines on a single chip. Within a decade we will likely see chips with 100 cores, maybe even more, Rattner says.

But that raises a new problem: how to put those tiny side-by-side computer engines to good use. The operating systems aren't set up for it. Neither are the programming languages and development tools. Neither, in fact, are the programmers themselves, who have all grown up writing software to run on a single engine—serially, that is, not in parallel. "For 50 years we've done things one way, and now we're changing to a different model," says Craig Mundie, chief research and strategy officer at Microsoft, which as the biggest maker of operating systems and programming tools is leading the drive to solve the puzzle. It's the biggest single change Microsoft has ever faced, Mundie says.
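The serial-versus-parallel shift described above can be sketched in a few lines. This is a minimal illustration in Python's standard library, not Microsoft's tooling: the same work expressed the traditional way, one task at a time on one engine, and then spread across cores with a process pool. The task itself (a sum of squares) is just a hypothetical stand-in for any CPU-bound job.

```python
from concurrent.futures import ProcessPoolExecutor

def work(n):
    # Stand-in for a CPU-bound task: sum of squares below n.
    return sum(i * i for i in range(n))

def serial(jobs):
    # The model programmers grew up with: one task after another.
    return [work(n) for n in jobs]

def parallel(jobs):
    # The multicore model: the same tasks, spread across engines.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(work, jobs))

if __name__ == "__main__":
    jobs = [100_000] * 4
    # Both models must produce identical results; only the
    # scheduling of the work differs.
    assert serial(jobs) == parallel(jobs)
```

The catch the article describes is visible even here: the parallel version is only safe because each task is independent. Once tasks share data, the programmer inherits the coordination problems that have long made supercomputer programming so difficult.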

Parallel computing has been around for a long time. But it's mostly been confined to high-end supercomputers. Writing programs for them is incredibly difficult and time-consuming. The challenge now is to make it possible—and cheap—for ordinary programmers to write programs that run in parallel. Mundie predicts big things when (he doesn't say if) Microsoft works it all out. After all, the human brain is itself a massively parallel computer; writing programs that can operate in parallel is the key to making computers that seem more like us and less like machines. "In a sense we are trying to build a crude approximation of what nature does in your brain," says Mundie. "Parallelism is the only way to get there."