Facing The Human Factor

Last week the lines went dead. Again. This is not a recording; it's just gotten so darned repetitive that it seems that way. Since the 1990 software meltdown that sent the AT&T system into paralysis, several smaller glitches have plagued the giant company. This time around, long-distance traffic in and through New York was held up, as well as communications for air-traffic controllers; flights were delayed and canceled across the nation.

While AT&T's earlier system woes could be blamed on software flaws, last week's crash was a low-tech series of blunders. Bungle #1: an attempt to switch a New York facility to diesel power (at the request of the local utility) failed. Bungle #2: no one noticed that the equipment was being powered by backup batteries alone, despite a series of alarms. After six hours the batteries didn't have enough juice to make the switch back to city power. Not even the Energizer bunny could have done more.

As the mess hit the papers, AT&T officials were quick to reach out and finger someone: rank-and-file employees who didn't notice the alarms. But top brass later admitted that management decisions had compounded the problem. No employee was stationed in the engine room to watch the diesels, and supervisors had sent technicians away to a class to learn a new alarm system. Backup alarms were either not functioning, disabled or ignored. "It was employee error," says Bob Swanson, secretary-treasurer of the Communications Workers of America Local 1150, "but the employee was management." On Friday AT&T chairman Robert Allen released a letter to employees that said, "Maybe we understand today better than we did when the week began how deeply people are counting on us." He announced that an apology would be printed as an advertisement in major newspapers on Monday.

Ma Bell's competitors had already run some ads of their own. Capitalizing on the foul-up, MCI and U.S. Sprint published full-page newspaper ads explaining how AT&T customers can use their services. Even Nynex, which provides New York's local phone service, sidled away from its former parent with ads that asked, "When is a phone company not the phone company?"

For some, the case underscored the dangers of high technology. But some high-tech problems have low-tech roots: the original computer "bug" was an actual arthropod that wriggled its way into a computer relay. And the highest tech can be brought low by dumb, and sometimes dangerous, mistakes. In 1987 the Nuclear Regulatory Commission discovered that workers in the control room at Philadelphia Electric Co.'s Peach Bottom plant often snoozed on the job.

The world is an imperfect place: trains don't run on time, and Madonna is still a star. In a more optimistic time, we thought problems caused by human error could be cut by taking humans out of the loop. But "other problems arise" with automated systems, says Jeff Johnson, chairperson of Computer Professionals for Social Responsibility. "You have to decide what sorts of problems you're willing to put up with." Paul Saffo, a fellow at the Menlo Park, Calif., Institute for the Future, says the best answer is "to design for human error," with many fail-safe mechanisms. AT&T, famous for reliability, had circumvented its backups, and it might have tarnished its own reputation for some time to come.