Quantitative Seismology, Second Edition

Keiiti Aki Observatoire Volcanologique du Piton de la Fournaise
Paul G. Richards Lamont-Doherty Earth Observatory of Columbia University


Preface

In 1975 I received a surprising letter from Kei Aki, beginning “I wonder if you would be interested in coauthoring a text book on theoretical seismology with me…” 

We had both taught advanced seismology courses at our respective institutions, the Massachusetts Institute of Technology and Columbia University.  But his were informed by many years as a leading researcher. He had worked on everything from the practical study of noise to theoretical frameworks for interpreting free oscillation signals.  I was in my fourth year as an assistant professor, knew nothing about vast areas of seismology, and had focused on some details in source theory and wave propagation which he knew a great deal more about than I did.  But I said yes to his invitation, and thus began a wonderful four-year period of being forced to learn about seismology well enough to write explanations of the underlying theory. 

At that time, in the mid to late 1970s, the concept of quantifying seismic sources with a moment tensor had just begun to take hold.  Film chips of analog data from the Worldwide Standardized Seismographic Network were the best source of seismograms for most geophysicists in academia, but their narrow band and limited dynamic range were problematic (the seismograms, not the geophysicists).  Broadband instruments and digital methods of recording were only beginning to show their potential. 

Kei sent me 300 pages of his teaching notes.  We quickly drafted a sequence of chapter titles and began writing.  In 1978 we sent our first draft to the publisher.  It was promptly rejected as being three times longer than planned, and unmarketable.  I was devastated.  But Kei calmly responded with the suggestion that we could do a little re-organization and offer the material as two volumes.  The original publishers agreed, and after two more years of editing and figure preparation the first edition appeared in February 1980. 

The IRIS Consortium and the Federation of Digital Seismographic Networks emerged in the 1980s to meet growing needs for high-quality broadband seismic data.  Global, national, regional and local networks of broadband seismometers have since been deployed at thousands of locations, and quantitative seismology is conducted today on a scale that could hardly be imagined in the 1970s and 1980s.  Every generation of seismologists correctly knows that it is working at new levels of excellence.  As always, the rationale for supporting seismology is multi-faceted: to study the Earth's internal structure, to conduct research in the physics of earthquakes, to quantify and mitigate earthquake hazard, and to monitor explosions, both to evaluate the weapons development programs of a potential adversary and to support initiatives in nuclear arms control.

These different applications of seismology are illustrated by our own careers.  In 1984 Kei Aki moved from MIT to the University of Southern California, where, as the founding Science Director of the Southern California Earthquake Center, he promoted the integration of scientific information about earthquakes and its transfer to the public.  At the Center, for example, input from earthquake geologists was used together with the fault model of quantitative seismology to generate output useful for earthquake engineers.  In this work, the concept of seismic moment was central to unifying information from plate tectonics, geology, geodesy, and historical and instrumental seismology.  The integrated information was transferred to the public in the form of probabilistic estimates of earthquake hazards.  The Center is still alive and well, long after Kei left to pursue on-site prediction of volcanic eruptions using seismic signals from an active volcano (Réunion) in the Indian Ocean.

In the mid-1980s, I also changed my interests to applied aspects of seismology and began work on practical problems of monitoring compliance with nuclear test ban treaties.  At first the main issue was estimating the size of the largest underground nuclear explosions, in the context of assessing compliance with the 150-kiloton limit of the Threshold Test Ban Treaty.  Later the focus shifted to a series of technical issues in the detection, location and identification of small explosions, in the context of verification of the Comprehensive Nuclear-Test-Ban Treaty.  This latter treaty became a reality in 1996 and is now associated with an International Data Centre in Vienna and an International Monitoring System currently being built at hundreds of new sites around the world.  In early 1996, Xiaodong Song and I, working at Columbia's Lamont-Doherty Earth Observatory, discovered small changes in the travel time with which seismic waves traverse the Earth's inner core, evidence that we interpreted as due to motion of the inner core with respect to the rest of the solid Earth.

These developments in understanding earthquake hazard, explosion monitoring, and Earth's internal structure and processes directly show that people do seismology for utterly different reasons.  The common thread is the interpretation of seismograms.  The quality of data and ease of data access have greatly improved since 1980, but the fundamentals of seismogram interpretation are little changed.  Progress in applications of seismology relies upon sophisticated methods of analysis, often incorporated into software that students must learn to use soon after beginning graduate school.  The purpose of this book is to provide students and other researchers with the underlying theory essential to understanding these methods, their pitfalls, and their possibilities for improvement.

We received numerous requests to keep the 1980 edition in print, and it would have been easy to accept invitations simply to republish.  But I decided in late 1994 to rewrite rather than republish, because the emergence of new methods for detecting and recording seismic motions meant that much of the instrumentation chapter would have to be completely reworked, and rewriting could accommodate new problems, up-to-date references, and thousands of small changes as well as major revision of some sections.  The new publisher, University Science Books, working with Windfall Software, enabled this second edition with modern methods of design and typesetting.

Dropped from the first edition are chapters on inverse theory, methods of data analysis, and seismic wave propagation in media with general heterogeneity.  (Note that whole books have been published since 1980 on these subjects.)  Parts of our discarded chapters have been reworked into the chapters that remain.  Numerous sections elsewhere are brought up to date (for example, an explanation of the centroid moment tensor).  The revised and rewritten material emphasizes basic methods that have turned out to be most important in practice.

To facilitate commentary on the second edition, and to provide supplementary material as it may accumulate in future years, a website will be maintained at http://www.LDEO.columbia.edu/~richards/Aki_Richards.html 

Books like this are more than scaled-up versions of research papers: teams of people have to work together for years to turn concepts into reality.  I thank Jane Ellis, my editor at University Science Books, for encouragement, tact, patience, help and stamina since we began this project in 1994.  The help of the first edition publisher, W. H. Freeman and Co., in allowing us to use original figures where possible, is gratefully acknowledged.  I thank Paul Anagnostopoulos of Windfall Software, who introduced me to ZzTeX and solved electronic design and typesetting problems on this second edition for seven years; Kathy Falato and Violeta Thomsa, who took care of my office at Lamont; and Kathy Falato, Elizabeth Jackson, Mary Ellen Oliver, and Gillian Richards for entering text and equations to recreate something like the original edition electronically, thus giving me an entity that could be revised. (How else could piles of notes for revision be merged with a text generated in the 1970s with IBM Selectrics?)

I received support during the rewriting from Los Alamos National Laboratory in 1997, and from several federal agencies back at Lamont:  the Air Force Office of Scientific Research, the Department of Energy, the Defense Threat Reduction Agency, and the National Science Foundation. 

Many people helped with comments on the first edition, with suggestions for new material, critical reading, supplying references and figures, and checking the new problems.  It is a pleasure here to acknowledge such contributions to the second edition from Duncan Agnew, Joe Andrews, Yehuda Ben-Zion, Phil Cummins, Steve Day, Tony Dahlen, Wen-xuan Du, Göran Ekström, Karen Fischer, Steve Grand, John Granville, David Harkrider, Klaus Jacob, Bruce Julian, Richard Katz, Vitaly Khalturin, Debi Kilb, Won-Young Kim (who selected the broadband seismogram shown in red on the cover, and the filtered versions with all their different character as shown also in Figure 12.1), Boris Kostrov, Anyi Li, Wenyi Li, Gerhard Müller, Jeffrey Park, Mike Ritzwoller, Peter Shearer, Jinghua Shi, Bob Smith, Stan Whitcomb, Rüdi Widmer, Bob Woodward, and Jian Zhang.

Jody Richards has stayed with all this, and with me, since the very beginning.  I owe her more than thanks.  And now that it's done, maybe there is more time to dance together.  I hope so.

Paul G. Richards
Palisades, New York, June 2002