Racing the Waves

Seismologists try to catch quake tremors quickly enough to save lives

On June 28, 1992, a violent earthquake shattered a peaceful Sunday morning in Southern California, jerking millions of residents awake at 4:57 a.m. Centered in the Mojave Desert, some 300 kilometers east of Los Angeles, the magnitude 7.3 quake shook the region hard enough to knock out power in scattered pockets across the southern quarter of the state.

Like many of his neighbors, Los Angeles resident Ken Niles turned on the television news for details about the earthquake. "As I was watching, the camera started to shake and the newscasters voiced alarm and started to scramble." Vibrations from a strong aftershock were rattling the television studio, but they had not yet reached Niles' home in the western part of the city. He yelled to his wife upstairs and then had enough time to dash across the room to grab a fragile clock before the floor started bucking.

Those few seconds of warning came courtesy of simple physics. The most damaging seismic waves ripple through Earth's crust at 3.7 kilometers per second. That's quick compared to an airplane but downright poky compared to a television signal, which moves 80,000 times faster.

Seismologists in Southern California plan to exploit that difference in speed. Over the next few years, they will test a system that provides a warning within the first few seconds of an earthquake. Although the initial goal will not be to notify the public, the prototype network could save lives and property indirectly, proponents say.

"You could broadcast a signal giving a few seconds' warning that strong ground shaking is on its way. School children could jump under their desks. Brain surgeons could pull back their knives. Workers dealing with toxic materials could possibly hit shutoff valves," says John R. Filson, chief of the earthquake hazards program at the U.S. Geological Survey (USGS) in Reston, Va.

The pilot early warning system will be the final element in an ambitious, 5-year project called TriNet, started last year by USGS, the California Division of Mines and Geology, and the California Institute of Technology in Pasadena. The $20-million program will wire the Southern California landscape with advanced seismic sensors, all sending their measurements into a central computer facility that quickly analyzes and disseminates the information. The initial purpose is to arm officials with vital earthquake data minutes after a disaster, providing statistics that previously would have taken hours or even months to obtain.

Even before the early warning system comes online, TriNet will offer vital information, say emergency managers. "We feel this is very important. It will definitely aid us in how we carry out our response to earthquakes," says Edward J. Bortugno, chief geologist at the California Office of Emergency Services in Oakland, which coordinates relief efforts following earthquakes.

Within minutes of a sizable tremor, TriNet generates maps of shaking intensity. These maps enable emergency crews to quickly identify the sites that have suffered the strongest tremors, where buildings are most likely to have crumbled, says Bortugno. "Having maps of shaking will help direct our response. Search and rescue, for instance, in collapsed buildings is very dicey, and it's something that has to occur awfully quickly if you're going to save a life. When you lose an hour or two, you end up with more dead people in those buildings.
This is a way possibly to get at [trapped people] quicker."

The TriNet initiative represents a dramatic shift in focus for California seismologists, who once viewed earthquake prediction as their ultimate goal. The 1970s saw several successful predictions of large and small earthquakes around the world, notably the alert in 1975 that saved thousands of lives in Haicheng, China. Hopes fizzled, however, amid some startling failures and a growing realization of the magnitude of the task.

"We recognized that prediction really isn't an attainable goal," says Lucile M. Jones, a seismologist with the USGS in Pasadena and a developer of the TriNet project. "We don't want to predict every earthquake, because we have 20 a day. What we need to do is predict which of those earthquakes will grow into major events." This more difficult goal may be unreachable, she adds.

After the Northridge earthquake struck Los Angeles on Jan. 17, 1994, seismologists began to concentrate on more attainable public safety objectives, such as improving the response to earthquakes once they've struck. Jones and her colleagues knew that there was ample room for improvement.

In the early 1990s, USGS and Caltech had established a pager system for automatically disseminating quake information to key organizations within minutes. During the Northridge disaster, however, the central computer mistook the flood of incoming data for faulty communications signals. The system froze for half an hour, refusing to provide the quake's location.

Even after the quake data went out, emergency officials had little to go on. The pager system provided a quake's location and magnitude but no details on which areas suffered the worst shaking. Rescue efforts and media attention centered on the epicentral region in the San Fernando Valley, but they ignored other, more distant sites rattled hard by the quake, says Bortugno. "It was quite some time before we knew of damage south of the epicentral area, in south central Los Angeles, San Pedro, and Santa Monica."

The information blackout spread even farther following the 1989 Loma Prieta earthquake, which slammed the San Francisco area during the World Series at 5:04 p.m. on Oct. 17. Television crews that had gathered to cover the baseball game quickly captured images of the spectacular damage in San Francisco. Initial relief efforts were concentrated there, and officials heard nothing about the problems plaguing other towns and cities, says Bortugno. "It took long into the night before we knew of damage in Santa Cruz," where part of a shopping mall collapsed and killed six people.

TriNet designers hope to avoid similar breakdowns in communication by setting up sensors at 670 sites around Southern California. Unlike older networks, which included many seismometers that recorded and transmitted analog information, TriNet will use only digital equipment, thus reducing the chances of a communication malfunction like the one during Northridge, says Egill Hauksson, a seismologist at Caltech and TriNet leader.

As its cornerstone, the system will have 250 sites sending information continuously, making possible rapid processing of the measurements. TriNet organizers have designed a hardened communication system -- one with special telephone lines, radios, and microwave transmitters -- so that most sensors can continue transmitting data even during a strong earthquake, says Hauksson. Within 3 to 5 minutes, the central computer will produce a map showing the intensity of shaking around the region.
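The idea behind such a map can be sketched with a toy calculation: combine the peak shaking recorded at scattered stations into an estimate for every point on a grid, trusting nearby stations more than distant ones. The sketch below is not TriNet's software; the station locations, readings, and the simple inverse-distance weighting are assumptions chosen only to illustrate the step from scattered measurements to a map.

```python
import numpy as np

# Hypothetical station list: (x in km, y in km, peak ground acceleration in %g).
stations = np.array([
    [  0.0,   0.0, 32.0],   # station near the epicenter records strong shaking
    [ 40.0,  10.0, 12.0],
    [ 25.0, -35.0,  8.0],
    [-30.0,  20.0,  5.0],
])

def shaking_map(stations, half_width_km=100.0, step_km=10.0, power=2.0):
    """Estimate shaking on a square grid by inverse-distance weighting of station readings."""
    xs = np.arange(-half_width_km, half_width_km + step_km, step_km)
    ys = np.arange(-half_width_km, half_width_km + step_km, step_km)
    grid = np.zeros((len(ys), len(xs)))
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            dist = np.hypot(stations[:, 0] - x, stations[:, 1] - y)
            dist = np.maximum(dist, 0.001)        # avoid division by zero at a station
            weights = 1.0 / dist**power           # nearby stations count for more
            grid[i, j] = np.sum(weights * stations[:, 2]) / np.sum(weights)
    return xs, ys, grid

xs, ys, grid = shaking_map(stations)
print(f"Strongest estimated shaking on the grid: {grid.max():.1f} %g")
```

An operational system would fold in much more, such as the effects of local geology, but the basic task of estimating shaking everywhere from measurements at a limited number of sites is the same.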
These so-called shakemaps are already available on the Internet for earthquakes in Southern California above magnitude 4.0. To avoid potential traffic jams on the Internet during a disaster, critical organizations such as the Office of Emergency Services will receive the shakemaps through a direct computer link to Caltech.

In addition to speeding up rescue operations, rapid measurements of shaking provide other benefits, says James D. Goltz, manager of earthquake programs at Caltech. The state can quickly estimate the number of displaced persons needing shelter and assess financial damages. These estimates, in turn, can expedite the process leading to a presidential disaster declaration, possibly making federal funds available days earlier.

Utilities and industries can use the almost instant information to determine which facilities may require repairs and which probably escaped unharmed. In the past, some large companies set up strong-motion sensors at their facilities to obtain this information, but it took months or even years to process such recordings, making them useless for directing rapid postquake repairs.

Almost 3 dozen large electric transformers broke in the aftermath of a strong Los Angeles earthquake in 1971. "We had several spectacular fires from 6 hours to 6 months after the event," says Ron Tognazzini of the Los Angeles Department of Water and Power. With information on shaking from TriNet, he says, "we could make decisions to depower certain transformers and inspect their insides."

In another unsettling example, the 1971 earthquake damaged an earthen dam in Los Angeles. The top of the dam collapsed to within a hand's breadth of the water level in the reservoir, just shy of unleashing a major flood. There were no measurements of how hard this area had shaken in the earthquake, making it difficult to improve dam designs for future tremors. "In 1971, we had only guesses. Now we will know exactly what they were subjected to," says Tognazzini.

In the same way, engineers could use TriNet data to evaluate damage to buildings and judge the best way to retrofit structures. "When an earthquake happens and a building falls down, we need to know what the ground motions at that site were," says Jones. In the past, though, there were not enough sensors in urban sections of Los Angeles to provide that sort of information.

The new seismic system owes its existence as much to technological innovations as to past catastrophes. Improvements in seismic sensors and computers have enabled researchers to design a much speedier, hardier system than anything available before. However, it took a major earthquake to bring that system to life.

After the Northridge disaster, the Federal Emergency Management Agency (FEMA) granted close to $1 billion in aid to mitigate damage from future earthquakes in Southern California. As part of this package, the state received some $50 million of discretionary funding with few strings attached. The Office of Emergency Services decided to use this money to pay for much of the TriNet system.

Although TriNet proponents have advertised its multiple uses, state officials were sold on one feature in particular: the pilot early warning system. At this point in the project, administrators are only beginning to explore how to set up the warning system. In certain earthquakes -- a magnitude 8 on the southern San Andreas fault, for example -- TriNet could provide tens of seconds of warning to downtown Los Angeles and other regions far from the fault line.
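The arithmetic behind those warning windows follows from the wave speed cited earlier: the most damaging waves travel at roughly 3.7 kilometers per second, while an electronic alert arrives almost instantly. A minimal sketch is below; the distances and the few seconds allotted for detecting the quake and issuing the alert are illustrative assumptions, not TriNet specifications.

```python
# Back-of-the-envelope warning times. The 3.7 km/s wave speed comes from the
# article; the distances and the alert delay below are illustrative assumptions.
WAVE_SPEED_KM_S = 3.7       # speed of the most damaging seismic waves
ALERT_DELAY_S = 5.0         # assumed time to detect the quake and broadcast an alert

def warning_time_s(distance_km):
    """Seconds of warning at a site a given distance from the rupture."""
    travel_time = distance_km / WAVE_SPEED_KM_S
    return max(travel_time - ALERT_DELAY_S, 0.0)

for label, km in [("rupture on a distant stretch of the San Andreas", 150.0),
                  ("quake directly beneath the city", 20.0)]:
    print(f"{label}: ~{warning_time_s(km):.0f} s of warning at {km:.0f} km")
```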
For earthquakes centered below the city, however, the best warning may come only seconds before the damaging waves arrive, or even after the initial, less damaging jitters have begun. Emergency managers will try over the next several years to determine how best to use such alerts. "It will take some careful thought," says Bortugno. "There are lots of questions. Should this be like an emergency broadcast system warning? What should it be? Who would get it?"

For some industries, the benefits are obvious. Power companies, for example, suffer damage during earthquakes when overhead lines slap together and melt, shorting out the electrical grid, says Tognazzini. It would take only one-twentieth of a second to de-energize lines -- an act that would prevent this type of power outage.

Turning out the lights on 1.3 million customers has costly implications, however. "You have to weigh the possibilities of throwing out millions of dollars of transactions, which are going across the wires and happening in computers," says Tognazzini. He hasn't yet tried calculating the impact of a false alarm. "We're just scared to death of what the cost would be to some of our larger customers." During the pilot phase of TriNet, the power department may try the early cutoff with only a tiny percentage of households and then evaluate it.

Emergency managers and seismologists express doubt that the public would benefit directly from just a few seconds of warning. "If the populace is not educated on what to do, it could cause panic. The thing is, the whole earthquake is over in 2 minutes. It's not the time to get in your car and drive to pick up your kids at school," says Filson.

The alerts may be most useful to certain groups that can be trained in how to respond. Workers in high-rises, which sway during quakes, could move to the center of those buildings to avoid being tossed through windows. Emergency teams could take fire trucks and ambulances out of their garages.

In general, people may expect more from a warning system than it can deliver. In a survey conducted a decade ago in California, employees of small, medium, and large businesses said they would want at least a half-minute warning, an unlikely span for any but distant quakes. They also wanted people, rather than automated systems, to decide how to respond to warnings, a step that would eat up many critical seconds.

Nonetheless, citizens may demand access to even the short warnings that will come out of TriNet. Already, some Los Angeles residents harbor the false belief that government scientists are withholding knowledge about the times and dates of future quakes, says Jones. By limiting access to quake warnings, TriNet could feed public distrust of academic and government authorities.

To move beyond the pilot network and develop a true early warning system, emergency managers would have to invest in a network far more expensive than TriNet. Southern California has so many earthquake-generating faults that an effective system would require roughly four times as many seismic stations as TriNet is slated to have, says Jones.

The TriNet project does have its skeptics. "Just because we paid for it doesn't mean we endorse it. It's too soon to tell what the utility of the system will be for emergency managers," says Stuart Nishenko of FEMA in Washington, D.C. Some recall the computer problems encountered during the Northridge quake and wonder whether the new system will perform as advertised.

The critical issue for potential users is reliability, says Stephanie H.
Masaki-Schatz, manager for corporate safety and emergency planning at the ARCO oil company in Los Angeles. "It's the confidence that there will be redundancy, so when an event does happen, the data will be readily available and accurate."

The TriNet team plans to run the new computer system through mock earthquakes to evaluate its performance under pressure. The ultimate test will come the next time seismic waves race through Southern California, striking down buildings like so many children's toys. When that will occur, seismologists sadly admit, nobody knows.