
Digital divide isn't what we thought

It may turn out that the "digital divide," one of the most fashionable of recent political slogans, is largely fiction. As you will recall, the argument went well beyond the unsurprising notion that the rich would own more computers than the poor. The disturbing part of the theory was that society was dividing itself into groups of technology "haves" and "have-nots," and that this segregation would, in turn, worsen already large economic inequalities. It's this argument that's either untrue or wildly exaggerated.

We should always have been suspicious. After all, computers have spread quickly precisely because they've become cheaper to buy and easier to use. Falling prices and shrinking skill requirements suggested that the digital divide would spontaneously narrow, and so it has.

The Census Bureau's latest survey of computer use reports narrowing gaps among different income and ethnic groups. In 1997 only 37 percent of people in families with incomes from $15,000 to $24,999 used computers at home or at work. By September 2001, it was 47 percent. Over the same period, usage among families with incomes exceeding $75,000 rose more modestly, from 81 percent to 88 percent. Among racial and ethnic groups, computer use is rising. Here are the numbers for 2001 compared with 1997: Asian-Americans, 71 percent (58 percent in 1997); whites, 70 percent (58 percent); blacks, 56 percent (44 percent); Hispanics, 49 percent (38 percent).

The new figures confirm common sense: many computer skills aren't especially high-tech or demanding. Now a new study by two economists further undermines the digital-divide thesis. David Card of the University of California at Berkeley and John DiNardo of the University of Michigan challenge the notion that computers have significantly worsened wage inequality.

The logic of how this supposedly happens is straightforward. Computers raise the demand for high-skilled workers, increasing their wages. Meanwhile, computerization reduces the demand for low-skilled workers and, thereby, their wages. The gap between the two widens.

Superficially, wage statistics support the theory. In 1999, workers at the 90th percentile of the wage distribution earned $26.05 an hour, while workers at the 10th percentile earned $6.05 an hour, reports the Economic Policy Institute. The ratio of the two (workers near the top compared with workers near the bottom) was 4.3 to 1. By contrast, the ratio in 1980 was only 3.7 to 1. Computerization increased; so did the wage gap. Case closed.

But wait, say Card and DiNardo. The trouble with blaming computers is that the worsening of inequality occurred primarily in the early 1980s. In 1986, the ratio of high- to low-paid workers was also 4.3, the same as in 1999. With computer use growing, the wage gap should have continued to expand if it were being driven by a shifting demand for skills. Card and DiNardo conclude that computerization doesn't explain "the rise in U.S. wage inequality in the last quarter of the 20th century."

Of course, not all economists accept this brushoff. To Lawrence Katz of Harvard, computers do promote wage inequality. But few economists have ever believed that new technology is the only influence on inequality, he argues. It can be overwhelmed by other forces. He contends that the economic boom of the 1990s offset the depressing effect of computers on poor workers' wages.

Either way, the popular perception of computers' impact on wages is overblown. Lots of other influences count for as much, or more. The worsening of wage inequality in the early 1980s, for example, almost certainly reflected the deep 1981-82 recession and lower inflation. Companies found it harder to raise prices. They concluded that they had to hold down the wages of their least valuable workers.

As a slogan, the "digital divide" brilliantly united a concern for the poor with a faith in technology. It also suggested an agenda: put computers in schools; connect classrooms to the Internet. Well, the agenda has been largely realized. By 2000, public schools had roughly one computer for every four students.

But whether education and students' life prospects have improved is a harder question. As yet, computers haven't produced broad gains in test scores. As for today's computer skills, they may not be terribly important, in part because technology constantly changes. Often, new computer skills can be taught in a few weeks. But basic reading and reasoning skills remain critical.

The "digital divide" suggested a simple solution (computers) for a complex problem (poverty). But what people do for themselves matters more than what technology can do for them.

Robert J. Samuelson is a columnist for Newsweek.

Washington Post Writers Group
