As a teenager, his biographers tell us, Henry Ford loved to fix pocket watches. He would open up the backs and poke around inside that marvelous universe of balance wheels, springs, gears and ratchets, then put them all together again to witness the miracle of their motion synchronized to the spinning of the Earth itself. Mass-produced watches were one of the highest achievements of 19th-century ingenuity. In the 1700s, the English inventor John Harrison had labored for a lifetime to make a single clock accurate enough to keep time on an ocean voyage; Ford, before he devoted himself to the motorcar, calculated he could build watches by the millions at a cost of about 30 cents each--although he doubted that many people would ever need to own one.
If he were alive today, Ford would find the world awash in clocks and watches, silently blinking away the minutes of the night from every coffee maker and microwave, but the exercise of taking them apart would be pretty unrewarding. He would see (even on an analog clock, the kind with hands and a dial) only a battery clip and its two wires disappearing into an opaque plastic box, a few flimsy plastic gears and a small electric motor. The heart of the thing is a sliver of quartz crystal oscillating, invisibly, on a circuit board. Not much there to tinker with, unless you happen to have an oscilloscope handy. But why bother? The assembly-line techniques Ford himself pioneered have made most mass-produced consumer goods cheaper to buy new than to disassemble and repair. And in any case there wouldn't be much point in knowing how to fix a modern electric clock, even a cheap one, since a teenager could grow old waiting for it to break.
Surely no single device could symbolize the technological progress of this astonishing century, which has given us the electron microscope, interplanetary rocket, oral contraceptive, nondairy creamer, sneaker and jukebox. Many boys would probably agree that the profusion of timepieces in our society is a less impressive achievement than the brassiere, invented in 1913. But today, on the threshold of a new century, the battery-run clock stands for a profound shift in society's relation to technology: the triumph of the black box. Machines surround us, but they have lost their essential machineness, the quality that inhered in an old manual typewriter like the grit that accumulated in the holes of the o's and had to be laboriously dug out with a bent paper clip. It was the property that came from being built of discrete parts that bore a tangible, visible relation to one another, like the levers and springs that moved the type bar when you struck the typewriter key. Or the whirling, clicking cams inside an old dial telephone, the points and rotors of a pre-electronic automobile-ignition system or the system of cables, levers and valves that adjusted the fuel-air mix inside a carburetor. They have all been replaced with silent electronic devices whose structure is visible only under a microscope, and whose workings, at bottom, can be apprehended only with a grounding in quantum mechanics. A black box: you apply an input (power from a battery, say) and obtain a specified output (the time of day), but the intervening processes are a mystery, except to the engineer who designed it.
What a century it has been for machines! At its outset, they burned coal and moved at the ponderous pace of the steam engine, but within three years they were throbbing with the explosive energy of gasoline and taking to the skies. A few decades after Einstein derived the formula for converting mass into energy, the forces that bind together the very atoms of the universe were being put to work to light our cities and broadcast "American Bandstand." Robot spaceships have been to Mars and beyond the solar system. Computers have joined in a worldwide network, exchanging data (their term for "knowledge") at speeds far beyond human comprehension. Heart-lung machines have allowed people to come back to life after their hearts have stopped. Machines have evolved in orders of magnitude of complexity, from the thousand or so pieces needed to make a steam locomotive to the uncounted (by NASA, anyway) millions of parts that go into a space shuttle. Their progress illustrates the three cardinal principles of post-modern technology: miniaturization, digitization and synthesis.
Miniaturization begins with the transistor, which was invented in 1947. Even in its earliest incarnations it was smaller than the smallest vacuum tube it was meant to replace, and--not needing a filament--it consumed far less power and generated virtually no waste heat. And there was almost no limit to how small it could be made, once engineers learned how to etch electronic circuits, almost atom by atom, onto substrates of silicon. As late as the 1950s, the standard kitchen-table radio (AM only) had five vacuum tubes and a few dozen minor parts--resistors, capacitors and coils--hand-wired and individually soldered onto a chassis about the size of a hardcover book. Today that circuitry fits into a matchbook, while a microprocessor smaller than a postage stamp can carry the equivalent of 7.5 million transistors. The limiting factor in making most electronic appliances smaller is not the size of the electronic components but the human interface. There's no point in reducing the size of a laptop much further, pending the evolution of humans with smaller fingers. Today the impulse for computer miniaturization is no longer portability but the speed of computation itself: reducing the time it takes an electronic signal, traveling at a substantial fraction of the speed of light, to get from one part of the circuit to another.
But miniaturization by itself would have given us only the Walkman, without a new intellectual framework to make use of all those transistors. That was digitization, the conceptual revolution that imposed the binary numeric system on all the unruly knowledge of the world. The earliest computers were intended as giant calculating machines. They employed binary numbers--the system of counting that uses only 0s and 1s--because they correspond to the two unambiguous states of an electrical switch: on or off. As long ago as the 1930s, the British mathematician Alan Turing observed that, on this principle, a machine could be built to solve virtually any problem--a proposition conclusively demonstrated this year when one finally beat the world champion at chess.
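The counting system Turing's machines rested on is simple enough to sketch in a few lines of modern code. Here, purely as an illustration (the function name is invented for the purpose), is how a number is reduced to the 0s and 1s of switch states, in Python:

```python
# Any whole number can be written with only 0s and 1s -- the two
# states of an electrical switch (off and on).
def to_binary(n):
    """Repeatedly divide by 2, keeping the remainders."""
    digits = ""
    while n > 0:
        digits = str(n % 2) + digits   # the remainder is the next binary digit
        n //= 2
    return digits or "0"

print(to_binary(19))   # -> "10011"
```

Nineteen, for instance, comes out as 10011: one 16, no 8s or 4s, one 2 and one 1 -- five switches, three of them thrown on.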
From giant calculators it was a short leap to assigning digital values to the letters of the alphabet and other logical symbols. "Knowledge" was thereby transformed into "data"--something that could be electronically stored, transmitted and manipulated, so that today any teenage boy sitting at home with a computer tied in to the Internet can instantaneously find the phone number of every Kimberly in the Salt Lake City phone book. And it wasn't just written information that could be treated this way, but visual images and sound as well. From the days of Edison, music had been recorded in analog format--that is, as a continuously varying signal, a physical representation of a sound wave. But music also can be digitized--transformed into a stream of binary digits that can be electronically reassembled into a wave, with (in theory) far less distortion and random noise than an analog recording. Ergo: the Discman.
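What "assigning digital values to letters" and digitizing a sound wave amount to can be sketched in a few lines of Python; the message and the 8,000-samples-a-second rate below are illustrative choices, not anything the history itself specifies:

```python
import math

# Letters become numbers: each character has an agreed-on code
# (here, the standard ASCII values), and the numbers round-trip
# back into text without loss.
message = "DATA"
codes = [ord(ch) for ch in message]
print(codes)                              # [68, 65, 84, 65]
print("".join(chr(c) for c in codes))     # prints DATA

# Sound becomes numbers the same way: measure ("sample") the height
# of the wave thousands of times a second and keep each measurement.
RATE = 8000                               # samples per second (an assumption)
samples = [round(100 * math.sin(2 * math.pi * 440 * t / RATE))
           for t in range(8)]             # the start of a 440 Hz tone
print(samples)
```

Once everything is a list of numbers, storing, copying and transmitting it without degradation is the same problem whether the numbers began life as a phone book or a symphony.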
And can you imagine all those computers with cases of tortoise shell or mahogany? The 20th-century science of organic synthesis began in 1907, in the Yonkers, N.Y., garage of a chemist named Leo Baekeland. Before that, the raw materials of manufacturing were the elemental metals and their alloys (bronze, brass and steel), stone, ceramics and glass. Plus, of course, the fabulous profusion of substances that nature creates out of the same elements (primarily carbon, nitrogen, hydrogen and oxygen) that make up man himself: wood, leather, cotton, wool, rubber, amber, shellac. In fact, shellac (a resin secreted by an Asian insect, the lac bug) was what Baekeland had set out to duplicate, not for its decorative uses but as electrical insulation. But what he actually produced--a liquid resin that hardened into a tough, infinitely moldable solid he named Bakelite--was far more valuable. By the hundreds of millions, Bakelite telephones, radio receivers, lamps, ashtrays, pens, cameras and cocktail shakers poured from the factories in the following decades.
The same elements turn out to combine with each other, and with chlorine and sulfur, in an almost limitless number of ways, producing today's menagerie of hard and soft plastics, fabrics and films. High-impact polystyrene! Acrylonitrile/butadiene/styrene! High-density polyethylene, used for detergent bottles, and its faintly disreputable sister molecule, low-density polyethylene, used in dry cleaners' bags! And, from other laboratories, all the pesticides and fungicides and herbicides of modern agriculture (and warfare), bristling with polysyllabic menace. Two great paradigm shifts marked the century's progress in synthetic chemistry. First, as Baekeland realized, science was no longer limited to trying to make ersatz silk or amber, but could improve on the products of nature. What wouldn't a deer, say, give for a Kevlar hide? The other was that with increased understanding of the geometry of organic compounds, it was no longer necessary for chemists to muddle around like alchemists, hoping to chance on the right combination of ingredients, heat, pressure and catalysts to produce something useful; instead, they can figure out what they want--a synthetic hormone, say--and work backward from the shape of a desired molecule. The ultimate achievement in that realm will be atomic-scale machines that can assemble molecules directly, element by element--but that's a job for the next century.
Will the next century be up to it? Or will the great burst of technology begin to slow? Taking the long view of things, arguably the most productive years for industrial innovation were from roughly 1875 to 1925, the era that saw the introduction of the telephone, automobile, airplane, electric light, phonograph, motion picture and radio. Only the personal computer and certain medical advances, among the inventions of the last half century, have had a comparable impact on people's daily lives. And, to be fair, the scientists and engineers of the 20th century were unusually blessed with world wars. From antibiotics to Plexiglas to radar--and its popcorn-popping offspring the microwave oven--military necessity has been the mother of some of the most significant inventions of the century, even leaving aside weapons themselves. That kind of luck no one wants to wish on the next generation.
But where, one might ask, will the Henry Fords of tomorrow find inspiration, poking around in the barren innards of VCRs? They'll whack away at keyboards, in the kind of second-order technological virtuosity that society increasingly rewards. Ford wouldn't even know where to begin to fix a Ford today, nor would anyone else unequipped with a $15,000 computerized diagnostic system. Society gained cleaner air, greater fuel efficiency and reliability with the invention of electronic fuel injection, but it lost the hard-won knowledge of how to start a car on a cold morning by propping open the butterfly valve. If we can see farther, it's not only because, like Newton, we're standing on the shoulders of giants; we've also got orbiting radio telescopes, infrared cameras and radar imaging. That's the technological heritage our century bequeaths to the next. Let's see what they can add to it.