The MUSIC (MUltiple SIgnal Classification) algorithm is a generic and well-known signal processing algorithm used for the detection of multiple waves within an array of sensors [Schmidt, 1986]. MUSIC was first applied to SuperDARN by Samson et al. [1990] and Bristow et al. [1994] for estimating the characteristics of Medium Scale Traveling Ionospheric Disturbances (MSTIDs) detected by SuperDARN radars. These papers inspired this implementation of the MUSIC algorithm.
While this code has been written with SuperDARN detected MSTIDs in mind, it may be useful for working with other wave-like perturbations moving across the field of view of some geophysical instrument.
This notebook demonstrates the use of the DaViTPy MUSIC module for MSTID parameter estimation.
The DaViTPy MUSIC module is used by creating a musicArray object, which includes built-in methods for keeping track of all steps in the processing algorithm. As the data is processed, all preceding variants of the data are saved within the object and a history is created. This makes it easy to see what has been done to the data at any step along the way.
Written by N.A. Frissell, 4 November 2013.
References
Bristow, W. A., R. A. Greenwald, and J. C. Samson (1994), Identification of high-latitude acoustic gravity wave sources using the Goose Bay HF Radar, J. Geophys. Res., 99(A1), 319–331, doi:10.1029/93JA01470.
Samson, J. C., R. A. Greenwald, J. M. Ruohoniemi, A. Frey, and K. B. Baker (1990), Goose Bay radar observations of Earth-reflected, atmospheric gravity waves in the high-latitude ionosphere, J. Geophys. Res., 95(A6), 7693–7709, doi:10.1029/JA095iA06p07693.
Schmidt, R. O. (1986), Multiple emitter location and signal parameter estimation, IEEE Trans. Antennas Propag., 34(3), 276–280, doi:10.1109/TAP.1986.1143830.
#Import the modules we need.
%pylab inline
import datetime
from matplotlib import pyplot as plt
import numpy as np
from davitpy import pydarn
Populating the interactive namespace from numpy and matplotlib
#Choose the radar and time of the data that we want to look at.
#Then, connect to the data source using the standard DaViTPy
#pydarn.sdio.radDataOpen() routine.
rad='wal'
sTime = datetime.datetime(2011,5,9,8)
eTime = datetime.datetime(2011,5,9,19)
myPtr = pydarn.sdio.radDataOpen(sTime,rad,eTime=eTime)
# Now create the data object.
# By creating this object, data is taken from the SuperDARN database and rearranged into a numpy.array with
# the shape (Nr Times, Nr Beams, Nr Gates). Missing data is stored as np.nan.
# By default, this command only loads in data where the ground scatter flag is True.
# fovModel='GS' indicates that the field-of-view coordinates should be calculated using the ground-scatter
# mapping formula. This is standard when looking at MSTIDs in ground scatter. See Bristow et al. [1994] for details.
dataObj = pydarn.proc.music.musicArray(myPtr,fovModel='GS')
# We can list all of the data sets stored in this object. Right now there is only one data set, but
# as we go along, each step of processing will create a new data set.
# Each data set is simply stored as an attribute of the dataObj, but the names of all of the data sets can be
# accessed through the get_data_sets() method.
dataObj.get_data_sets()
['DS000_originalFit']
# Information about the data set is saved in its metadata dictionary. Some of the information in the metadata
# dictionary is used to control certain plotting and processing variables in this module.
dataObj.DS000_originalFit.printMetadata()
channel: None code: [u'wal', u'i'] coords: geo cp: None dType: dmap dataSetName: DS000_originalFit eTime: 2011-05-09 19:00:00 elevation: None fType: fitex gscat: 1 model: GS name: Wallops Island param: p_l sTime: 2011-05-09 08:00:00 serial: 0 stid: 32
# The music module also keeps track of each step of processing using the data set's history attribute.
dataObj.DS000_originalFit.printHistory()
2016-07-08 23:46:19.878155 [DS000_originalFit] Original Fit Data
# The 'active' attribute of dataObj is a reference to the most recently used data set of dataObj. It is also
# the default data set used by all DaViTPy MUSIC processing and plotting routines. You can see this command
# produces the same result as the cell above.
dataObj.active.printHistory()
2016-07-08 23:46:19.878155 [DS000_originalFit] Original Fit Data
# We can create an RTI plot and a fan plot from the data we loaded.
# You can see that both range gate and geographic latitude are given on the y-axis, and the solar terminator
# is also shaded in. The terminator does not go below approximately range gate 10. This is because the ground
# scatter mapping formula is not defined for close-in range gates.
fig = pydarn.plotting.musicRTI(dataObj)
# We can also make a fan plot.
plotTime = datetime.datetime(2011,5,9,14)
fig = pydarn.plotting.musicFan(dataObj,time=plotTime)
# We want to focus on just the data that contains the MSTID we are looking for, so we can apply some limits.
# proc.music.defineLimits() does not actually modify the data; it just marks the data by putting an entry
# into the dataObj.active.metadata dictionary. The limits will be applied later. Right now, you can see
# what data will be eliminated by making a new RTI plot.
pydarn.proc.music.defineLimits(dataObj,gateLimits=[30,45])
fig = pydarn.plotting.musicRTI(dataObj)
# We also want to restrict the amount of time we process. Before actually running the MUSIC algorithm, we
# will be filtering the data with a FIR filter, which consumes data at the beginning and end of the time series.
# We can calculate exactly how much time that will be if we know some of the filter characteristics and
# the time resolution of the data. This can then be used to give us new start and end times.
# For now, let's say we are going to use a filter with 101 taps and a dataset with 120 s resolution.
numtaps = 101
timeres = 120
#Let's also say that we are interested in the MSTID feature between 1400 and 1600 UT.
sTime_of_interest = datetime.datetime(2011,5,9,14)
eTime_of_interest = datetime.datetime(2011,5,9,16)
#Now calculate the new start and end times...
new_times = pydarn.proc.music.filterTimes(sTime_of_interest, eTime_of_interest, timeres, numtaps)
pydarn.proc.music.defineLimits(dataObj,timeLimits=new_times)
fig = pydarn.plotting.musicRTI(dataObj)
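The amount of padding needed can be estimated by hand: a linear-phase FIR filter with N taps has a group delay of (N-1)/2 samples, so that many samples at each end of the series are contaminated. A minimal sketch of the calculation (the helper name is hypothetical; davitpy's filterTimes may differ in detail, e.g. in how it rounds to actual sample times):

```python
import datetime

def filter_edge_times(sTime, eTime, timeres, numtaps):
    # A linear-phase FIR filter with numtaps taps delays the signal by
    # (numtaps - 1) / 2 samples, so that much data is lost at each end.
    edge = datetime.timedelta(seconds=timeres * (numtaps - 1) // 2)
    return (sTime - edge, eTime + edge)

new_times = filter_edge_times(datetime.datetime(2011, 5, 9, 14),
                              datetime.datetime(2011, 5, 9, 16),
                              timeres=120, numtaps=101)
# → (2011-05-09 12:20, 2011-05-09 17:40), i.e. 100 minutes of padding per side
```

With 101 taps at 120 s resolution this is 50 samples, or 100 minutes, on each side, which is why the time limits extend well beyond the 1400–1600 UT window of interest.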
# Now we apply the limits and replot once more.
# Note that many of the processing routines will automatically call applyLimits() before they
# run the processing algorithm.
dataObj.active.applyLimits()
fig = pydarn.plotting.musicRTI(dataObj)
# Note that a new data set was created when we applied the limits.
dataObj.get_data_sets()
['DS000_originalFit', 'DS001_limitsApplied']
# Also note that the history of the new object was updated.
dataObj.active.printHistory()
2016-07-08 23:46:19.878155 [DS000_originalFit] Original Fit Data 2016-07-08 23:46:28.622054 [DS001_limitsApplied] Limits Applied: gate: 30,45; range [km]: 705,1065; time: 2011-05-09/12:19,2011-05-09/17:41
# Before we feed the data into the MUSIC algorithm, we don't want it to have any gaps in time or space.
# Let's start by interpolating in space along the beams.
pydarn.proc.music.beamInterpolation(dataObj)
fig = plt.figure(figsize=(20,10))
ax = fig.add_subplot(121)
pydarn.plotting.musicPlot.musicFan(dataObj,plotZeros=True,dataSet='originalFit',axis=ax,time=plotTime)
ax = fig.add_subplot(122)
pydarn.plotting.musicPlot.musicFan(dataObj,plotZeros=True,axis=ax,time=plotTime)
<davitpy.pydarn.plotting.musicPlot.musicFan at 0x7fe5623f0790>
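Conceptually, the beam interpolation is a one-dimensional linear interpolation along each beam, filling NaN cells from their valid neighbors. A self-contained sketch of the idea (not davitpy's actual routine):

```python
import numpy as np

def interpolate_gaps(profile):
    # Linearly interpolate NaN gaps in a 1-D profile (e.g. one beam's
    # power values along range gates) using the valid samples.
    x = np.arange(profile.size)
    good = ~np.isnan(profile)
    filled = profile.copy()
    filled[~good] = np.interp(x[~good], x[good], profile[good])
    return filled

interpolate_gaps(np.array([10.0, np.nan, np.nan, 25.0]))
# → array([10., 15., 20., 25.])
```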
# We also want to interpolate in time. timeres=120 [seconds] was set in an earlier cell.
pydarn.proc.music.timeInterpolation(dataObj,timeRes=timeres)
pydarn.plotting.musicPlot.timeSeriesMultiPlot(dataObj,dataSet='timeInterpolated',dataSet2='beamInterpolated')
# We also need to calculate a local Cartesian grid for each cell we are going to use.
pydarn.proc.music.determineRelativePosition(dataObj)
#The black cell marks the center of the array.
fig = pydarn.plotting.plotRelativeRanges(dataObj,time=plotTime)
# Now filter the data.
# numtaps=101 was set in an above cell. The cutoff_frequencies are in Hz.
filt = pydarn.proc.music.filter(dataObj, numtaps=numtaps, cutoff_low=0.0003, cutoff_high=0.0012)
# At this point, the data has been filtered and saved to a new data set in dataObj.
# Before we look at the data, let's look at the transfer function and impulse response of the filter.
fig = filt.plotTransferFunction(xmax=0.004)
fig = filt.plotImpulseResponse()
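The shape of these curves follows from the filter design. Assuming a windowed-sinc design (the filter history below reports a Blackman window), the band-pass taps can be sketched as the difference of two ideal low-pass kernels, tapered by the window; this is an illustrative construction, not davitpy's internal code:

```python
import numpy as np

def bandpass_fir(numtaps, f_low, f_high, fs):
    # Windowed-sinc band-pass design: subtract two ideal low-pass kernels
    # and taper with a Blackman window.
    n = np.arange(numtaps) - (numtaps - 1) / 2.0
    lowpass = lambda fc: 2.0 * fc / fs * np.sinc(2.0 * fc / fs * n)
    return (lowpass(f_high) - lowpass(f_low)) * np.blackman(numtaps)

taps = bandpass_fir(101, 0.0003, 0.0012, fs=1.0 / 120.0)
# The taps are symmetric, giving linear phase and a (101 - 1) / 2 sample
# group delay -- the data loss accounted for earlier with filterTimes().
```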
# Let's look at the filtered RTI plot. We should set autoScale to True since the magnitudes will be much
# lower than the original data.
fig = pydarn.plotting.musicRTI(dataObj,autoScale=True)
# You can see that the filter already marked off the data that you shouldn't use.
# Just applyLimits() and replot.
dataObj.active.applyLimits()
fig = pydarn.plotting.musicRTI(dataObj,autoScale=True)
# Look at a fan plot.
fig = pydarn.plotting.musicFan(dataObj,time=plotTime,autoScale=True)
# We have now run all of the processing needed to feed the data into the spectral analysis and MUSIC.
# Let's print the history just to recap what we have done.
dataObj.active.printHistory()
2016-07-08 23:46:19.878155 [DS000_originalFit] Original Fit Data 2016-07-08 23:46:28.622054 [DS001_limitsApplied] Limits Applied: gate: 30,45; range [km]: 705,1065; time: 2011-05-09/12:19,2011-05-09/17:41 2016-07-08 23:46:30.040799 [DS002_beamInterpolated] Beam Linear Interpolation 2016-07-08 23:46:31.773818 [DS003_timeInterpolated] Time Linear Interpolation 2016-07-08 23:46:33.882456 [DS004_filtered] Filter: blackman, Nyquist: 0.00416666666667 Hz, Cuttoff: [0.0003, 0.0012] Hz, Numtaps: 101 2016-07-08 23:46:35.684305 [DS005_limitsApplied] Limits Applied: time: 2011-05-09/14:01,2011-05-09/16:01
# First thing to do is to calculate the FFT of every cell...
pydarn.proc.music.calculateFFT(dataObj)
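For intuition, here is what the per-cell FFT does to a single cell containing a 30-minute wave sampled at the 120 s resolution used above (synthetic data, not radar output):

```python
import numpy as np

timeres = 120.0
t = np.arange(0.0, 7200.0, timeres)          # two hours of samples
cell = np.cos(2.0 * np.pi * t / 1800.0)      # a 30-minute-period wave
spec = np.fft.fft(cell)
freqs = np.fft.fftfreq(t.size, d=timeres)    # frequency bins [Hz]
peak_freq = abs(freqs[np.argmax(np.abs(spec))])
# peak_freq = 1/1800 Hz ≈ 0.000556 Hz, inside the 0.0003-0.0012 Hz passband
```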
# We can look at the spectrum of select cells...
pydarn.plotting.spectrumMultiPlot(dataObj,xlim=(-0.0025,0.0025))
pydarn.plotting.musicPlot.spectrumMultiPlot(dataObj,plotType='magnitude',xlim=(0,0.0025))
pydarn.plotting.musicPlot.spectrumMultiPlot(dataObj,plotType='phase',xlim=(0,0.0025))
# We can also plot the full spectrum. Here, every FFT bin contains 16 slices,
# showing the data for each of the 16 radar beams from left to right.
# Range gates are shown on the y-axis.
pydarn.plotting.plotFullSpectrum(dataObj,xlim=(0,0.00175))
{'cbar_label': 'ABS(Spectral Density)', 'cbar_pcoll': <matplotlib.collections.PolyCollection at 0x7fe561a6f750>}
# Calculate the cross-spectral matrix Dlm.
pydarn.proc.music.calculateDlm(dataObj)
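Structurally, Dlm is built by cross-multiplying the spectra of every pair of cells (l, m), which yields a Hermitian matrix. A toy construction at a single analysis frequency, with made-up spectral values:

```python
import numpy as np

# Hypothetical complex spectral values of three cells at one frequency.
X = np.array([1.0 + 1.0j, 2.0 - 1.0j, 0.5j])
Dlm = np.outer(X, np.conj(X))     # Dlm[l, m] = X[l] * conj(X[m])
# Dlm is Hermitian; its diagonal holds the (real) power in each cell.
```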
# We can look at the Dlm matrix. This usually isn't necessary for routine processing, but
# it is good to know where the numbers are going...
pydarn.plotting.plotDlm(dataObj)
# Now, we finally detect the horizontal wavenumbers.
pydarn.proc.music.calculateKarr(dataObj)
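Under the hood, MUSIC eigendecomposes the cross-spectral matrix, takes the eigenvectors spanning the noise subspace, and scans trial wavevectors: a steering vector nearly orthogonal to the noise subspace produces a sharp peak. A minimal self-contained sketch with one synthetic plane wave (the positions and wavevector are invented; this is not davitpy's implementation):

```python
import numpy as np

rng = np.random.default_rng(42)
r = rng.uniform(-500.0, 500.0, size=(16, 2))   # cell positions [km]
k_true = np.array([0.024, -0.018])             # true wavevector [rad/km]
X = np.exp(1j * r @ k_true)                    # one plane wave across cells
Dlm = np.outer(X, np.conj(X))                  # rank-1 cross-spectral matrix

w, v = np.linalg.eigh(Dlm)                     # eigenvalues in ascending order
En = v[:, :-1]                                 # noise subspace (all but largest)

def music_power(k):
    a = np.exp(1j * r @ k) / np.sqrt(len(r))   # unit-norm steering vector
    # MUSIC peaks where the steering vector is (nearly) orthogonal to the
    # noise subspace; the small constant guards against division by zero.
    return 1.0 / (np.linalg.norm(En.conj().T @ a) ** 2 + 1e-12)

# Evaluating music_power over a (kx, ky) grid produces a map like plotKarr's,
# with its maximum at k_true.
```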
# We can then plot the final result.
pydarn.plotting.musicPlot.plotKarr(dataObj)
# Again, we can see all of the history that has gone into producing this plot.
dataObj.active.printHistory()
2016-07-08 23:46:19.878155 [DS000_originalFit] Original Fit Data 2016-07-08 23:46:28.622054 [DS001_limitsApplied] Limits Applied: gate: 30,45; range [km]: 705,1065; time: 2011-05-09/12:19,2011-05-09/17:41 2016-07-08 23:46:30.040799 [DS002_beamInterpolated] Beam Linear Interpolation 2016-07-08 23:46:31.773818 [DS003_timeInterpolated] Time Linear Interpolation 2016-07-08 23:46:33.882456 [DS004_filtered] Filter: blackman, Nyquist: 0.00416666666667 Hz, Cuttoff: [0.0003, 0.0012] Hz, Numtaps: 101 2016-07-08 23:46:35.684305 [DS005_limitsApplied] Limits Applied: time: 2011-05-09/14:01,2011-05-09/16:01 2016-07-08 23:46:37.458393 [DS005_limitsApplied] Calculated FFT 2016-07-08 23:46:40.103729 [DS005_limitsApplied] Calculated Cross-Spectral Matrix Dlm 2016-07-08 23:46:52.125843 [DS005_limitsApplied] Calculated kArr
# We can use the image processing library in scikit-image to automatically find the peaks representing signals.
pydarn.proc.music.detectSignals(dataObj)
pydarn.plotting.musicPlot.plotKarr(dataObj)
#All of the parameters connected with these signals are located in the sigDetect attribute.
dataObj.active.sigDetect.info
[{'area': 125.0, 'azm': 180.0, 'freq': 0.00055555555555555545, 'k': 0.0010000000000000009, 'kx': 0.0, 'ky': -0.0010000000000000009, 'labelInx': 1, 'lambda': 6283.1853071795804, 'lambda_x': inf, 'lambda_y': -6283.1853071795804, 'max': 0.82138318, 'maxpos': (50, 49), 'order': 2, 'period': 1800.0000000000005, 'vel': 3490.658503988655}, {'area': 415.0, 'azm': 126.86989764584402, 'freq': 0.00055555555555555545, 'k': 0.029999999999999999, 'kx': 0.024, 'ky': -0.017999999999999999, 'labelInx': 2, 'lambda': 209.43951023931956, 'lambda_x': 261.79938779914943, 'lambda_y': -349.0658503988659, 'max': 0.95573765, 'maxpos': (74, 32), 'order': 1, 'period': 1800.0000000000005, 'vel': 116.35528346628863}, {'area': 57.0, 'azm': 74.054604099077139, 'freq': 0.00055555555555555545, 'k': 0.021840329667841555, 'kx': 0.020999999999999998, 'ky': 0.0060000000000000053, 'labelInx': 3, 'lambda': 287.68729239610161, 'lambda_x': 299.19930034188508, 'lambda_y': 1047.1975511965968, 'max': 0.70907271, 'maxpos': (71, 56), 'order': 3, 'period': 1800.0000000000005, 'vel': 159.82627355338974}, {'area': 54.0, 'azm': 72.349875780069894, 'freq': 0.00055555555555555545, 'k': 0.02308679276123039, 'kx': 0.021999999999999999, 'ky': 0.0069999999999999958, 'labelInx': 4, 'lambda': 272.15496635508975, 'lambda_x': 285.59933214452667, 'lambda_y': 897.59790102565569, 'max': 0.69325429, 'maxpos': (72, 57), 'order': 4, 'period': 1800.0000000000005, 'vel': 151.19720353060541}]
# It is possible to manually add a signal peak in.
# All appropriate associated wave parameters will be calculated.
pydarn.proc.music.add_signal(0.013,-0.015,dataObj)
pydarn.plotting.musicPlot.plotKarr(dataObj)
dataObj.active.sigDetect.info
[{'area': 415.0, 'azm': 126.86989764584402, 'freq': 0.00055555555555555545, 'k': 0.029999999999999999, 'kx': 0.024, 'ky': -0.017999999999999999, 'labelInx': 2, 'lambda': 209.43951023931956, 'lambda_x': 261.79938779914943, 'lambda_y': -349.0658503988659, 'max': 0.95573765, 'maxpos': (74, 32), 'order': 1, 'period': 1800.0000000000005, 'vel': 116.35528346628863}, {'area': 125.0, 'azm': 180.0, 'freq': 0.00055555555555555545, 'k': 0.0010000000000000009, 'kx': 0.0, 'ky': -0.0010000000000000009, 'labelInx': 1, 'lambda': 6283.1853071795804, 'lambda_x': inf, 'lambda_y': -6283.1853071795804, 'max': 0.82138318, 'maxpos': (50, 49), 'order': 2, 'period': 1800.0000000000005, 'vel': 3490.658503988655}, {'area': -1, 'azm': 139.08561677997488, 'freq': 0.00055555555555555545, 'k': 0.019849433241279212, 'kx': 0.013000000000000001, 'ky': -0.015000000000000003, 'labelInx': -1, 'lambda': 316.5423027853999, 'lambda_x': 483.32194670612199, 'lambda_y': -418.87902047863901, 'max': 0.71915525, 'maxpos': (63, 35), 'order': 3, 'period': 1800.0000000000005, 'true_max': (0.0056578903+0j), 'vel': 175.85683488077768}, {'area': 57.0, 'azm': 74.054604099077139, 'freq': 0.00055555555555555545, 'k': 0.021840329667841555, 'kx': 0.020999999999999998, 'ky': 0.0060000000000000053, 'labelInx': 3, 'lambda': 287.68729239610161, 'lambda_x': 299.19930034188508, 'lambda_y': 1047.1975511965968, 'max': 0.70907271, 'maxpos': (71, 56), 'order': 4, 'period': 1800.0000000000005, 'vel': 159.82627355338974}, {'area': 54.0, 'azm': 72.349875780069894, 'freq': 0.00055555555555555545, 'k': 0.02308679276123039, 'kx': 0.021999999999999999, 'ky': 0.0069999999999999958, 'labelInx': 4, 'lambda': 272.15496635508975, 'lambda_x': 285.59933214452667, 'lambda_y': 897.59790102565569, 'max': 0.69325429, 'maxpos': (72, 57), 'order': 5, 'period': 1800.0000000000005, 'vel': 151.19720353060541}]
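The derived quantities in each sigDetect entry follow directly from (kx, ky) and the dominant spectral frequency. Recomputing them for the manually added peak reproduces the values above (relations inferred from the output shown; they are consistent with every entry):

```python
import numpy as np

kx, ky = 0.013, -0.015                 # the manually added peak [rad/km]
freq = 1.0 / 1800.0                    # dominant spectral frequency [Hz]

k = np.hypot(kx, ky)                   # horizontal wavenumber magnitude
azm = np.degrees(np.arctan2(kx, ky))   # propagation azimuth, east of north
wavelength = 2.0 * np.pi / k           # horizontal wavelength [km]
period = 1.0 / freq                    # wave period [s]
vel = 1000.0 * wavelength / period     # horizontal phase speed [m/s]
# → k ≈ 0.0198 rad/km, azm ≈ 139.1°, lambda ≈ 316.5 km, vel ≈ 175.9 m/s
```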
#Signals can also be removed from the list of detected signals.
pydarn.proc.music.del_signal(4,dataObj)
pydarn.plotting.musicPlot.plotKarr(dataObj)
# To get some insight on how the signal processing algorithm works, we can plot the detected groups.
# You can adjust the threshold and neighborhood keywords on pydarn.proc.music.detectSignals() to tweak
# the autodetection.
pydarn.plotting.plotKarrDetected(dataObj)