ObsPy Tutorial

Seismo-Live: http://seismo-live.org

##### Authors:

In [ ]:
%matplotlib inline
import matplotlib.pyplot as plt
plt.style.use('ggplot')
plt.rcParams['figure.figsize'] = 12, 8


ObsPy has clients to directly fetch data via...

• FDSN webservices (IRIS, Geofon/GFZ, USGS, NCEDC, SeisComp3 instances, ...)
• Earthworm
• NERIES/NERA/seismicportal.eu
• NEIC
• SeisHub (local seismological database)

This introduction shows how to use the FDSN web service client. The FDSN web service definition has become the standard web service implemented by many data centers worldwide. Clients for the other protocols work similarly to the FDSN client.

#### Waveform Data

In [ ]:
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")
t = UTCDateTime("2011-03-11T05:46:23")  # Tohoku
st = client.get_waveforms("II", "PFO", "*", "LHZ",
                          t + 10 * 60, t + 30 * 60)
print(st)
st.plot()

• again, waveform data is returned as a Stream object
• for all custom processing workflows it does not matter if the data originates from a local file or from a web service

The FDSN client can also be used to request event metadata:

In [ ]:
t = UTCDateTime("2011-03-11T05:46:23")  # Tohoku
catalog = client.get_events(starttime=t - 100, endtime=t + 24 * 3600,
                            minmagnitude=7)
print(catalog)
catalog.plot();


Requests can have a wide range of constraints (see ObsPy Documentation):

• time range
• geographical (lonlat-box, circular by distance)
• depth range
• magnitude range, type
• contributing agency

Finally, the FDSN client can be used to request station metadata. Stations can be looked up using a wide range of constraints (see ObsPy documentation):

• network/station code
• time range of operation
• geographical (lonlat-box, circular by distance)
In [ ]:
event = catalog[0]
origin = event.origins[0]

# Münster
lon = 7.63
lat = 51.96

inventory = client.get_stations(longitude=lon, latitude=lat,
                                maxradius=2.5,  # search radius in degrees (example value)
                                level="station")
print(inventory)


The level=... keyword is used to specify the level of detail in the requested inventory

• "network": only return information on networks matching the criteria
• "station": return information on all matching stations
• "channel": return information on available channels in all stations networks matching the criteria
• "response": include instrument response for all matching channels (large result data size!)
In [ ]:
inventory = client.get_stations(network="OE", station="DAVA",
                                level="station")
print(inventory)

In [ ]:
inventory = client.get_stations(network="OE", station="DAVA",
                                level="channel")
print(inventory)


For waveform requests that include instrument correction, the appropriate instrument response information can be attached to the waveforms automatically.
(Of course, for work on large datasets the better choice is to download all station metadata once up front, avoiding the repeated internal web service requests.)

In [ ]:
t = UTCDateTime("2011-03-11T05:46:23")  # Tohoku
st = client.get_waveforms("II", "PFO", "*", "LHZ",
                          t + 10 * 60, t + 30 * 60, attach_response=True)
st.plot()

st.remove_response()
st.plot()


All data requested using the FDSN client can be directly saved to a file using the filename="..." option. The data is then stored exactly as served by the data center, i.e. without being parsed and re-written by ObsPy.

In [ ]:
client.get_events(starttime=t - 100, endtime=t + 24 * 3600, minmagnitude=7,
                  filename="/tmp/requested_events.xml")
client.get_stations(network="OE", station="DAVA", level="station",
                    filename="/tmp/requested_stations.xml")
client.get_waveforms("II", "PFO", "*", "LHZ", t + 10 * 60, t + 30 * 60,
                     filename="/tmp/requested_waveforms.mseed")
!ls -lrt /tmp/requested*


#### FDSN Client Exercise

Use the FDSN client to assemble a waveform dataset for one event.

• search for a large earthquake (e.g. by depth or in a region of your choice; use the option limit=5 to keep network traffic down)
In [ ]:


In [ ]:


• search for stations to look at waveforms for the event. The stations should:
• be operational at the time of the event
• have a vertical 1 Hz stream ("LHZ", so as not to overload the network)
• lie in a narrow epicentral distance window around the event (e.g. 90-91 degrees)
• adjust your search so that only a small number of stations (e.g. 3-6) match the criteria
In [ ]:


In [ ]:


• for each of these stations, download waveform data for the event, e.g. from a couple of minutes before to half an hour after the origin time
• put all data together in one stream (put the get_waveforms() call in a try/except/pass block to silently skip stations that actually have no data available)
• print the stream info and plot the raw data
In [ ]:


In [ ]:


• correct the instrument response for all stations and plot the corrected data
In [ ]:


In [ ]:



If you have time, assemble and plot another similar dataset (e.g. stations at a certain distance from a large event as before, or Transportable Array data for a large event, etc.).