
Cutting-edge computers don't always cut it

The dark side of the corporate information revolution is coming into view.

Companies across the country are seeing their cutting-edge computer systems fail to live up to expectations, or fail altogether.

In 1996, Pacific Gas & Electric Co. began spending tens of millions of dollars on a system developed by International Business Machines Corp. that would handle customer billing and many other tasks. But deregulation hit the California utility industry much faster than PG&E had expected. By the beginning of 1998, customers would be permitted to choose their energy suppliers. Suddenly, PG&E had the vastly more complicated responsibility of keeping track of fast-changing prices and multiple suppliers of energy.
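
To see why the burden grows, consider billing in miniature. Under regulation, a bill is essentially one meter reading times one tariff; under deregulation, every usage record must be matched to the right supplier and to that supplier's price at that moment. A minimal sketch, assuming hypothetical monthly usage records and per-supplier rates (none of this reflects PG&E's or IBM's actual design):

```python
# Hypothetical illustration of why deregulated billing is harder.

# Before deregulation: one supplier, one rate.
def bill_regulated(kwh_used: float, rate: float) -> float:
    return kwh_used * rate

# After deregulation: a customer may buy from several suppliers,
# each with prices that change over time, so the bill must join
# usage records against a per-supplier price history.
def bill_deregulated(usage_records, rates):
    # usage_records: list of (period, supplier, kwh)
    # rates: dict mapping (supplier, period) -> price per kWh
    total = 0.0
    for period, supplier, kwh in usage_records:
        total += kwh * rates[(supplier, period)]
    return total

usage = [("1998-01", "SupplierA", 300.0), ("1998-01", "SupplierB", 150.0)]
prices = {("SupplierA", "1998-01"): 0.10, ("SupplierB", "1998-01"): 0.12}
print(bill_deregulated(usage, prices))  # 48.0
```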

Although massive, the new IBM-based system couldn't handle the additional burden quickly enough.

"We were heading toward disaster," says Damien P. Brooks, a senior project manager with the giant utility. And so PG&E scrapped the fancy IBM system and went back to the drawing board. Today there is a new four-year project under way, but this time PG&E has kept its 30-year-old first-generation computer system, which it is upgrading and replacing only gradually. (An IBM spokesman referred all questions to PG&E.)

Megaprojects like PG&E's ill-fated collaboration with IBM are disappointing their sponsors at a dizzying rate. They often fall victim to a deadly combination: a fast-changing business landscape, an increasingly complex array of software, and technology boosters' can-do mentality, even when they can't.

The waste is staggering. A 1996 survey of 360 companies by the research firm Standish Group International Inc. in Dennis, Mass., found that 42 percent of corporate information-technology projects were abandoned before completion. U.S. companies spend about $250-billion annually on computer technology.

The bigger the projects are, the more frequently and expensively they tend to fail.

"People get seduced by the technological imperative: Because we can, we do," says Robert Charette, a Springfield, Va., consultant who advises large companies on ways to reduce computer-project risks. "But the backlash is beginning. Senior executives are starting to say no" to vast technology overhauls.

Part of the backlash comes from a new generation of computer-savvy senior executives. They think that overall, computerization has saved money and boosted productivity. They just refuse to spend millions of dollars on bells and whistles that add complexity to systems already prone to breakdown.

Tellingly, the new system under construction at Pacific Gas won't include the latest in point-and-click features for the utility's 1,000 customer-service representatives. Instead, the company will keep old-fashioned keyboard strokes and menus: 1970s-era technology that is reliable and surprisingly swift.
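
The appeal of the older style is straightforward: every function sits one keystroke away, with nothing waiting on graphics or a mouse. A generic sketch of a keystroke-and-menu loop, with invented menu options rather than PG&E's actual screens:

```python
# A bare-bones keystroke-and-menu loop in the 1970s terminal style.
# Hypothetical options; the point is that every action is one key away.
MENU = {
    "1": "Look up account",
    "2": "Start service",
    "3": "Stop service",
    "q": "Quit",
}

def main():
    while True:
        for key, label in MENU.items():
            print(f"  {key}. {label}")
        choice = input("> ").strip().lower()
        if choice == "q":
            break
        print(f"[{MENU.get(choice, 'Unknown option')}]")

if __name__ == "__main__":
    main()
```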

(This week, a relatively small affiliate of PG&E that serves only business customers retained IBM to develop a separate new billing system, which is expected to be simpler than the abandoned project.)

What's emerging here is a search for a better balance between manpower and computer power, often resulting in some "de-automation" or "de-engineering" of an organization's information systems. The idea is to get the appropriate amount of technology needed to do the job, and no more. "I could build a car with 24 cylinders, but if six or eight will do, why do I need 24 cylinders?" says Paul Knauss, a vice president at Chrysler Financial Corp., the carmaker's financing arm.

This search for the right balance of high tech and human touch may ultimately prove a huge blessing for business organizations. After all, when high tech works, the payoffs can be dazzling. Wal-Mart Stores Inc.'s point-of-sale scanning innovations, which let the company meticulously track inventory, yielded huge competitive advantages, as did American Airlines' Sabre reservation system, which brought mountains of data to travel agents' PCs.

The flip side is that when companies mismanage technology, or the machines themselves fall short, the losses are huge, too. Struggling Oxford Health Plans Inc. has blamed many of its financial problems on gigantic computer snafus. In even more extreme cases, a failed system has helped sink a company, as executives of FoxMeyer Drug Co. discovered in 1996, when the company's new $65-million computer project failed. New software couldn't handle the huge daily volume of orders from pharmacies, among other problems. The breakdown played a significant role in the Carrollton, Texas, drug-distribution company's decision to seek bankruptcy-court protection, company executives have said. FoxMeyer was later acquired by a competitor.

Today, with the understanding that technology infrastructure can break a company, many senior executives are disenchanted. About 50 percent of all technology projects fail to meet chief executives' expectations, according to a survey this year of 376 CEOs by the consulting firm CSC Index in Cambridge, Mass., and the American Management Association.

The blame game is in overdrive. More companies are suing their consultants, and vice versa. More CEOs are firing their chief information officers. Little or none of this has to do with the so-called Year 2000 problem, the anticipated plant shutdowns and data disasters that may occur at the millennium because many computers can't distinguish between the year 2000 and the year 1900.
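
The mechanics of that problem are simple: a system that stores the year as two digits subtracts correctly within a century and nonsensically across one. A minimal sketch, using a hypothetical two-digit date field rather than any particular system's code:

```python
# Y2K in miniature: the year is stored as two digits, as many
# mainframe-era systems did to save storage.
def years_elapsed(start_yy: int, current_yy: int) -> int:
    # Naive two-digit subtraction: fine until the century rolls over.
    return current_yy - start_yy

# An account opened in 1985, checked in 1999: correct.
print(years_elapsed(85, 99))  # 14

# The same account checked in 2000 ("00"): the system computes
# 0 - 85 = -85, as if the year 2000 were 1900.
print(years_elapsed(85, 0))   # -85
```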

Meantime, more senior executives are rebelling against constant upgrades of software, which is ever more complex and prone to breakdown. "If our products failed as often as Windows 95, we would have been out of business long ago," says Howard Selland, president of Aeroquip Corp., a Maumee, Ohio, automotive supplier. (A spokesman for Microsoft Corp., Redmond, Wash., says that when the company hears "of any issue or complaint about our products, we work directly with that customer to resolve it as quickly and effectively as possible.")

Selland rose through the technology ranks at Aeroquip and confesses to being a "gadget freak." Yet he issues a blistering attack on the reliability and hidden costs of today's software. He recently calculated the full cost of upgrading his company's 50-person research lab in Ann Arbor, Mich., to Microsoft Corp.'s Windows 95 from Windows 3.1. The bill was $20,000 a person, or a total of $1-million. Such is the seductive pull of technology, however, that Aeroquip just kept rolling out the new stuff.

"Embarrassingly," Selland told a group of technology managers at a recent University of Michigan seminar, "we thereafter converted everyone to Windows 95, and not one single piece of paper justifying the expense circulated through our normal approval chain." He says most office workers at his company use only a small fraction of the computer power, hardware or software, now on their desks.

Today, though, he is resisting the upgrade to Office 97. "You think we need the 10-million lines of code in Office 97?" he asks, adding that he will also resist the upgrade to Windows 98 software when it becomes available.

The current turmoil brings back memories of the robot craze that swept through parts of industrial America in the 1980s. The most infamous example was at General Motors Corp., which in 1985 opened a highly automated factory in the Hamtramck section of Detroit. Multimillion-dollar robots erroneously put Buick fenders on Cadillac cars. Spray-painting robots began spraying each other. Only when the robot population was reduced did productivity improve.

Advocating actual de-automation is rare, but it's already a boutique business in some consulting firms. CSC Index gained fame in the early 1990s for its re-engineering practice. Today, it has a practice that might be labeled de-engineering.

On a recent morning at a resort near Phoenix, Robert Suh, a managing director of CSC, was holding forth on the dangers of overkill in information technology. In the audience were a dozen senior executives of such large companies as AT&T Corp. and Bethlehem Steel Corp. As he ran through his flip charts, Suh compared the high-tech "screen pops" that customer call centers have installed to signify a call from a highly valued customer to the simpler solution of offering those special customers a different phone number. One of the diciest things about high tech, Suh continued, is "the long lead time before you know you've failed."
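
Suh's contrast can be made concrete. A screen pop depends on real-time telephony integration that looks up each caller; a separate phone number makes the caller's status obvious from the digits dialed, with no lookup at all. A hypothetical sketch of the two approaches, with invented numbers and account lists:

```python
# Two ways to flag a high-value caller, sketched hypothetically.
VIP_CALLERS = {"555-0101", "555-0199"}

# High-tech: computer-telephony integration looks up every caller
# and "pops" the matching record onto the agent's screen.
def screen_pop(caller_id: str) -> str:
    if caller_id in VIP_CALLERS:
        return "POP: priority customer record on agent screen"
    return "POP: standard customer record"

# Low-tech: valued customers simply dial a different number.
# The dialed number alone identifies them; no lookup required.
def route_by_number(dialed_number: str) -> str:
    return "priority queue" if dialed_number == "1-800-VIP-LINE" else "standard queue"

print(screen_pop("555-0101"))             # priority record pops up
print(route_by_number("1-800-VIP-LINE"))  # priority queue
```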

Another lesson that PG&E's Brooks has taken to heart is that it's better for a company to develop its computer plans in-house, rather than go to big-name consultants who often want to try the very latest and most risky approaches. Pacific Gas is using a team of 300 company employees in its $200-million, four-year effort to rebuild the company's aged computer system. Small consulting firms will help, but only in supporting roles.

The PG&E overhaul illustrates the scope and importance of information technology. The utility's computer system keeps track of starts and stoppages of service for more than 3-million customers. It signals power outages, measures usage, generates 230,000 bills a day, and calculates company revenue.

The renovation of this huge apparatus, designed to be completed in 2001, is one of the most essential tasks under way at the company. Does Brooks feel his career is on the line? "I don't feel that," he says quietly. "But I guess it's there."
