DEBUNKING THE DIGITAL DIVIDE

It may turn out that the "digital divide"--one of the most fashionable political slogans of recent years--is largely fiction. As you will recall, the argument went well beyond the unsurprising notion that the rich would own more computers than the poor. The disturbing part of the theory was that society was dividing itself into groups of technology "haves" and "have-nots" and that this segregation would, in turn, worsen already large economic inequalities. It's this argument that's either untrue or wildly exaggerated.

We should always have been suspicious. After all, computers have spread quickly precisely because they've become cheaper to buy and easier to use. Falling prices and skill requirements suggested that the digital divide would shrink spontaneously--and so it has.

The Census Bureau's latest survey of computer use reports narrowing gaps among different income and ethnic groups. In 1997 only 37 percent of people in families with incomes from $15,000 to $24,999 used computers at home or at work. By September 2001, that proportion was 47 percent. Over the same period, usage among families with incomes exceeding $75,000 rose more modestly, from 81 percent to 88 percent. Among all racial and ethnic groups, computer use is rising. Here are the rates for 2001, with the 1997 rates in parentheses: Asian-Americans, 71 percent (58 percent); whites, 70 percent (58 percent); blacks, 56 percent (44 percent); Hispanics, 49 percent (38 percent).

The new figures confirm common sense: many computer skills aren't especially high-tech or demanding. Point-and-click technology allows computers to be adapted to many business and home uses without requiring people to become computer experts. Just as you can drive a car without being a mechanic, you can use a computer without being a software engineer.

Now, a new study further discredits the digital divide. The study, by economists David Card of the University of California, Berkeley, and John DiNardo of the University of Michigan, challenges the notion that computers have significantly worsened wage inequality. The logic of how this supposedly happens is straightforward. Computers (the logic holds) raise the demand for high-skilled workers, increasing their wages. Meanwhile, computerization--by automating many routine tasks--reduces the demand for low-skilled workers and, thereby, their wages. The gap between the two widens. Economic inequality increases.

Superficially, wage statistics support the theory. Consider the ratio between workers near the top of the wage distribution (at the 90th percentile) and those near the bottom (at the 10th percentile). In 1999, the first earned $26.05 an hour and the second, $6.05 an hour, reports the Economic Policy Institute in Washington. The ratio of the two--workers at the top compared to workers at the bottom--was 4.3 to 1. By contrast, the ratio in 1980 was only 3.7 to 1. Computerization increased; so did the wage gap. Case closed.
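The arithmetic is straightforward: $26.05 divided by $6.05 comes to about 4.3, meaning a worker at the bottom earned roughly 23 cents for every dollar earned at the top. At 1980's ratio of 3.7 to 1, the comparable figure was about 27 cents.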

But wait, say Card and DiNardo. The trouble with blaming computers is that the worsening of inequality occurred primarily in the early 1980s. In 1986, the ratio of high-paid to low-paid workers was also 4.3 to 1--the same as in 1999. If a shifting demand for skills were driving the wage gap, it should have kept widening as computer use grew. Indeed, Card and DiNardo find much detailed evidence that contradicts the theory. They conclude that computerization doesn't explain "the rise in U.S. wage inequality in the last quarter of the 20th century."

Of course, not all economists accept this brushoff. To Lawrence Katz of Harvard, the spread of computers does promote wage inequality. But few economists have ever believed that new technology is the only influence on inequality, he argues; it can be overwhelmed by other forces. For poor workers, he contends, the economic boom of the 1990s offset the depressing effect of computers on their wages: "Firms were searching high and low for new workers--and they bid up the wages of the unskilled."

Either way, the popular perception of computers' impact on wages is hugely overblown. Lots of other influences count for as much or more. The worsening of wage inequality in the early 1980s, for example, almost certainly reflected the deep 1981-82 recession and the fall in inflation. Companies found it harder to raise prices. To survive, they concluded that they had to hold down the wages of their least skilled, least mobile and youngest workers. High joblessness allowed them to do so: in 1982, unemployment averaged 9.7 percent.

As a slogan, the "digital divide" brilliantly united a concern for the poor with a faith in technology. It also suggested an agenda: put computers in schools; connect classrooms to the Internet. Well, the agenda has been largely realized. By 2000, public schools had roughly one computer for every four students. Almost all schools were connected to the Internet, as were about three quarters of classrooms. Some students pick up computer skills at school that they might otherwise miss: among 10- to 17-year-old students from homes with less than $15,000 of income, about half use computers only at school, reports the Census Bureau.

But whether education and students' life prospects have improved is a harder question. As yet, computers haven't produced broad gains in test scores. As for today's computer skills, they may not be terribly important, in part because technology constantly changes. Frank Levy, an economist at the Massachusetts Institute of Technology who studies how computers alter work, emphasizes the importance of basic reading and reasoning abilities. Often, new computer skills can be taught in a few weeks. But people have to be able to read manuals and follow instructions.

The "digital divide" suggested a simple solution (computers) for a complex problem (poverty). With more computer access, the poor could escape their lot. But computers never were the source of anyone's poverty and, as for escaping, what people do for themselves matters more than what technology can do for them.