If not for Ted Hoff’s curiosity, we’d all be using typewriters to text our BFFs. OK, not quite. But it’s hard to overstate how much Hoff’s invention changed the world, even if he downplays the impulse that led to the first mass-produced microprocessor. “It was just me being nosy,” he says. “That’s part of my nature.”
His inquisitiveness did more than pave the way for teenagers to SMS all day long. The tiny 4004 microprocessor—conceived by Hoff and designed by co-workers at Intel Corp.—was essentially a computer reduced to the size of a small fingernail. Released in 1971, it would lead to the microcomputer, the home computer, and, ultimately, the personal computer. It also catapulted Intel—then a three-year-old memory-chip maker—into the majors. Last week, Hoff and his partners were notified that their work on the 4004 earned them the National Medal of Technology and Innovation—the highest honor for technological achievement that can be awarded by a U.S. president. But success, even the kind that changes the world, is not without its challenges. It was innovations like the 4004 that made Intel king of PCs. These days, with the advent of smart phones and similar devices, the PC's profit margins aren't what they used to be. That leaves Intel looking to redefine itself once again, in an era when getting its chips into every new kind of device is key.
Now 72, Hoff—whose first name is Marcian, per family tradition—was Intel’s manager of applications research in April 1969 when a Japanese manufacturer hired the company to make an affordable desktop electronic calculator. Hoff was the visiting group’s liaison, assigned to help its members transfer their design to Intel’s engineers. “But I was curious about what they were doing, [and] I was surprised at how complex their design was,” says Hoff, whose experience was with computers, not calculators.
Busicom, the Japanese customer, envisioned a dozen different integrated circuits, each controlling a different function and some packed with as many as 5,000 transistors. Hoff figured that designing each chip could take as long as six months, especially since fewer than a handful of Intel employees had the necessary skills. When he suggested to his boss, Intel cofounder Robert Noyce, that he could simplify the design, Noyce encouraged him.
Hoff applied what he’d learned studying bigger computers. In the end, his design required just four chips, built around a central processing unit (CPU) that, like a larger computer, could be programmed to perform many different functions, from scanning the keyboard to running the printer. “[The new chip] provided enormous flexibility,” says Hoff. When Busicom executives saw the design, they jettisoned their own plans. “They really liked it,” he recalls.
So did Hoff’s bosses at Intel, although the invention saddled them with a dilemma. Their microprocessor would cost $60. The customers for Intel’s main product, memory chips, made room-size computers that they leased out for $2,500 a month. “We had to be clear with our customers that we were not competing with those kinds of computers, which offered much higher performance than our microprocessors,” says Hoff. Fortunately, that was absolutely true—for a while, anyway.
Intel began selling the 4004 in November 1971. “No one was sure what to do with it,” says Leslie Berlin, project historian for the Silicon Valley Archives at Stanford. “Computers were something that sat in rooms, but here was one you could hold in your hand. It was just such a different way of seeing things.” Pretty soon, customers were using them to collect data from oil wells and gas pumps; one dairy farmer redefined the concept of cow chips, using microprocessors to track herd members’ movements. Subsequent generations of the chip—with catchy names like the 8008 and the 8080—were “pretty good sellers,” Hoff says.
In 1981, when IBM chose Intel’s 8088 microprocessor for its PC, “the industry shifted into high gear, and Intel’s fate was sealed,” says Tim Bajarin, president of Creative Strategies, a technology consulting firm. By the mid-1980s, the company had abandoned memory chips—its original business, as outlined in the one-page business plan the two cofounders used to raise $2.5 million in 1968. (Noyce died in 1990; chairman emeritus Gordon E. Moore, the creator of Moore’s law, which holds that the number of transistors that can fit on an integrated circuit doubles every two years, is retired.) Intel’s next big chip, the 486, was introduced in 1989 and was at least 100 times faster than Hoff’s original creation. The 1993 Pentium processor—the company began naming its chips after a court declined to allow it to trademark the “386” name—was eight generations removed from the 4004 and 1,500 times faster. Today, Intel’s Core microprocessor for PCs has 560 million transistors; Hoff’s 4004 held just 2,300.
The company is now looking to position itself for what president and CEO Paul Otellini calls "pervasive computing," a coming era in which every consumer device has a microprocessor inside it. "Our core capability as a company is creating the world's best silicon technology and then deploying it broadly and affordably into one market after another," he says. To stay on top, Intel will have to keep pushing into new sectors. Its chips have yet to appear in the smart phone market, but the chipmaker is on track to ship its first smart phone chips in 2011. It has also spent more than $10 billion on acquisitions in the past year, most of it to buy McAfee, the maker of computer-security software. "With a chip getting into all kinds of devices, security will be a key component," says Bajarin. "The acquisitions are all strategic, enhancing their ability to sell chips that can be completely embedded in cars or radios or machinery."
In some ways, this is just what Hoff imagined 40 years ago, when he thought about how his invention would be used in elevators, traffic signals and other "hidden places," as he calls them. One place is awfully close to his heart; Hoff is on his third pacemaker. "I'm an example of embedded control," he says. "I've got a microprocessor running me."