smousavi05 / eqtransformer

EQTransformer, a python package for earthquake signal detection and phase picking using AI.

Home Page: https://rebrand.ly/EQT-documentations

License: MIT License

Python 79.49% Jupyter Notebook 20.51%
earthquakes detection phase-picking deep-learning neural-network lstm-neural-networks attention-mechanism transformer multi-task-learning global


eqtransformer's People

Contributors

aaaapril4, computerscienceiscool, dependabot[bot], dylanmikesell, ecastillot, filefolder, lchuang, maihao14, ostueker, smousavi05, speedshi, squirrelknight, tjnewton, wangliang1989


eqtransformer's Issues

Bug when loading the model

============================================================================
Running EqTransformer  0.1.59
 *** Loading the model ...
[1]    70227 killed     python makejson.py

The makejson.py script is below:

import os
import sys
from EQTransformer.utils.hdf5_maker import stationListFromMseed  # build the station list
from EQTransformer.utils.hdf5_maker import preprocessor          # convert the data
from EQTransformer.core.predictor import predictor               # detect signals

mon = '201612'

model = '/Users/liang/work/eqt/EQTransformer/ModelsAndSampleData/EqT_model.h5'
locations = {
    "BBSX": [30.32824, 103.04621,  840.5], "BCPC": [30.94832, 103.31643, 1347.0],
    "BFSC": [30.59908, 103.18747, 1350.0], "BHGC": [30.12775, 103.16326,  776.0],
    "BHLC": [30.51558, 103.42745,  570.0], "BHPC": [30.55102, 102.94482, 1651.0],
    "BJFC": [30.01914, 103.02261,  666.0], "BJHC": [30.60857, 103.58865,  540.0],
    "BLHC": [29.71326, 103.24694,  646.0], "BLSC": [30.28700, 103.29210,  615.0],
    "BLXC": [30.17515, 102.92073,  700.4], "BQFC": [30.68795, 102.74354, 2153.0],
    "BQYC": [30.46657, 103.14805, 1195.3], "BRLZ": [30.99886, 102.79512, 3376.2],
    "BTLG": [30.93672, 103.50188, 1186.0], "BWLZ": [31.02935, 103.17649, 2019.0],
    "BWQC": [30.76124, 103.66290,  601.0], "BWTB": [30.72967, 103.43284,  885.0],
    "BXFC": [30.48838, 102.70589, 1277.0], "BXSC": [30.38091, 102.80905, 1182.0],
    "BYFC": [30.79711, 103.23357, 1546.0], "BYKC": [30.49259, 103.08265, 1631.0],
    "BYQC": [30.58943, 103.30350,  800.0],
}
mdays = [
    '01', '02', '03', '04', '05', '06', '07', '08', '09', '10',
    '11', '12', '13', '14', '15', '16', '17', '18', '19', '20',
    '21', '22', '23', '24', '25', '26', '27', '28', '29', '30', '31'
]

for mday in mdays:
    workdir = mon + mday
    if os.path.exists(workdir) :
        print(workdir, "building the station list")
        stationListFromMseed(workdir, locations)
        print(workdir, "converting the data")
        preprocessor(
            preproc_dir = "preproc", mseed_dir = workdir,
            stations_json = 'station_list.json', overlap = 0.3, n_processor = 2
        )
        print(workdir, "detecting signals")
        predictor(
            input_dir = workdir + '_processed_hdfs', input_model = model,
            output_dir = 'detections', detection_threshold = 0.1,
            P_threshold = 0.1, S_threshold = 0.1,
            number_of_plots = 100, plot_mode = 'time'
        )
        os.chdir('/Users/liang/work/eqt/gap')
        os.rename('detections', workdir + '_detections')
        print(workdir, "deleting temporary files")
        os.system('rm -rf ' + workdir + '_processed_hdfs preproc')
        os.system('rm -rf ' + workdir)

UnboundLocalError: local variable 'batch_outputs' referenced before assignment

Keras==2.4.3
tensorflow==2.2.0
Python 3.7.7
Ubuntu 18.04.5 LTS

When running mseed_predictor on vertical-component Raspberry Shake data in Mexico, I come across this error; the last couple of hours of googling suggest that TensorFlow is probably to blame. Hopefully someone has come across it before.

The full error message:

~/anaconda3/envs/eqt/lib/python3.7/site-packages/EQTransformer-0.1.55-py3.7.egg/EQTransformer/core/mseed_predictor.py in mseed_predictor(input_dir, input_model, stations_json, output_dir, detection_threshold, P_threshold, S_threshold, number_of_plots, plot_mode, loss_weights, loss_types, normalization_mode, batch_size, overlap, gpuid, gpu_limit)
    278             pred_generator = PreLoadGeneratorTest(meta["trace_start_time"], data_set, **params_pred)
    279 
--> 280             predD, predP, predS = model.predict_generator(pred_generator)
    281 
    282             detection_memory = []

~/anaconda3/envs/eqt/lib/python3.7/site-packages/tensorflow/python/util/deprecation.py in new_func(*args, **kwargs)
    322               'in a future version' if date is None else ('after %s' % date),
    323               instructions)
--> 324       return func(*args, **kwargs)
    325     return tf_decorator.make_decorator(
    326         func, new_func, 'deprecated',

~/anaconda3/envs/eqt/lib/python3.7/site-packages/tensorflow/python/keras/engine/training.py in predict_generator(self, generator, steps, callbacks, max_queue_size, workers, use_multiprocessing, verbose)
   1531         use_multiprocessing=use_multiprocessing,
   1532         verbose=verbose,
-> 1533         callbacks=callbacks)
   1534 
   1535   def _check_call_args(self, method_name):

~/anaconda3/envs/eqt/lib/python3.7/site-packages/tensorflow/python/keras/engine/training.py in _method_wrapper(self, *args, **kwargs)
     86       raise ValueError('{} is not supported in multi-worker mode.'.format(
     87           method.__name__))
---> 88     return method(self, *args, **kwargs)
     89 
     90   return tf_decorator.make_decorator(

~/anaconda3/envs/eqt/lib/python3.7/site-packages/tensorflow/python/keras/engine/training.py in predict(self, x, batch_size, verbose, steps, callbacks, max_queue_size, workers, use_multiprocessing)
   1283             callbacks.on_predict_batch_end(step, {'outputs': batch_outputs})
   1284       callbacks.on_predict_end()
-> 1285     all_outputs = nest.map_structure_up_to(batch_outputs, concat, outputs)
   1286     return tf_utils.to_numpy_or_python_type(all_outputs)
   1287 

UnboundLocalError: local variable 'batch_outputs' referenced before assignment
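
This UnboundLocalError is raised inside Keras' predict loop when the prediction generator yields zero batches, so batch_outputs is never assigned. A minimal sanity-check sketch, assuming the preprocessor wrote each station's windows into a "data" group of a per-station HDF5 file (the group name comes from the hdf5_maker traceback further down this page; the file path is hypothetical):

import h5py

# Hypothetical path to one station's preprocessed file.
path = '20161201_processed_hdfs/BBSX.hdf5'

with h5py.File(path, 'r') as f:
    n_windows = len(f['data'])      # number of preprocessed windows
    print(path, 'holds', n_windows, 'windows')

# Zero windows means the prediction generator produces no batches, so the
# Keras predict loop never assigns batch_outputs and fails exactly as above.

If the count is zero, the problem is upstream in the data preparation rather than in the model loading.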

Depfu Error: No dependency files found

Hello,

We've tried to activate or update your repository on Depfu and couldn't find any supported dependency files. If we were to guess, we would say that this is not actually a project Depfu supports and has probably been activated by error.

Monorepos

Please note that Depfu currently only searches for your dependency files in the root folder. We do support monorepos and non-root files, but don't auto-detect them. If that's the case with this repo, please send us a quick email with the folder you want Depfu to work on and we'll set it up right away!

How to deactivate the project

  • Go to the Settings page of either your own account or the organization you've used
  • Go to "Installed Integrations"
  • Click the "Configure" button on the Depfu integration
  • Remove this repo (smousavi05/EQTransformer) from the list of accessible repos.

Please note that using the "All Repositories" setting doesn't make a lot of sense with Depfu.

If you think that this is a mistake

Please let us know by sending an email to [email protected].


This is an automated issue by Depfu. You're getting it because someone configured Depfu to automatically update dependencies on this project.

mseed_predictor systematically does not pick anything during the last 40 minutes of daily .mseed files

Hello,

I'm trying to run EQT on several 1-day-long .mseed files with mseed_predictor and it seems that I have an issue:

After the picking, when I plot the P and S picks over the raw data, there is systematically a gap in the picks during the last ~40 minutes of each trace (i.e. from about 23:20). I'm running EQT over many 1-day-long .mseed files and all of them show the same final gap in the picks...

Moreover, changing the overlap value passed to mseed_predictor changes the length of this gap...

Can you help me with that? I tried to dig into the code but did not find anything that resolves this issue.
Tell me if you need more info.

Best,
Luc

Note: I checked that there are indeed events that are 'potentially pickable' during the last 40 minutes of the .mseed files.
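
One hypothesis consistent with both observations (a fixed tail gap whose length changes with the overlap parameter) is that the final, partial batch of sliding windows gets dropped somewhere in the prediction pipeline. A back-of-envelope sketch of that hypothesis, assuming 60 s windows and the overlap = 0.3 and batch_size = 500 values used in other calls on this page (all assumptions, not a confirmed diagnosis):

# How much of a 1-day trace would go unpicked if the last partial batch of
# windows were dropped?
day = 86400.0      # seconds in a daily mseed file
window = 60.0      # assumed window length in seconds
overlap = 0.3      # assumed overlap fraction
batch_size = 500   # assumed batch size

step = window * (1 - overlap)                  # 42 s between window starts
n_windows = int((day - window) // step) + 1    # windows covering the day: 2056
leftover = n_windows % batch_size              # windows in the last partial batch: 56
print('unpicked tail: %.1f minutes' % (leftover * step / 60.0))   # ~39.2 minutes

Under those assumptions the dropped tail works out to roughly 39 minutes, which is close to the reported gap, and changing the overlap changes the window step and therefore the length of the tail.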

Differences in .phs and .json file

Hi!

I processed my data without any problems, but when I run the association of the phases, I found some differences.

from the .phs file:

20201231 0 539.0245 30.26 15 15.11 5.000.00
DOBS SL HHE 20201231 0 5 0.00 52.85ES 3
DOBS SL HHZ IP 020201231 0 539.28 0.00 0
CRES SL HHE 20201231 0 5 0.00 47.01ES 2
CRES SL HHZ IP 320201231 0 540.20 0.00 0
BOJS SL HHE 20201231 0 5 0.00 45.81ES 2
BOJS SL HHZ IP 320201231 0 540.25 0.00 0
GBRS SL HHE 20201231 0 5 0.00 56.52ES 1
GBRS SL HHZ IP 020201231 0 540.66 0.00 0
200001

from .json file:

200001 -> ['DOBS/SL.DOBS..HHZ__20201231T000000Z__20210101T000000Z.mseedDOBS2020-12-31 00:05:39.264538',
'CRES/SL.CRES..HHZ__20201231T000000Z__20210101T000000Z.mseedCRES2020-12-31 00:05:35.749538',
'BOJS/SL.BOJS..HHN__20201231T000000Z__20210101T000000Z.mseedBOJS2020-12-31 00:05:39.019539',
'GBRS/SL.GBRS..HHE__20201231T000000Z__20210101T000000Z.mseedGBRS2020-12-31 00:05:40.624536']

The two outputs are not the same, in both the number of phases and their timing. Is there something I don't understand here?

Thanks!

Error reading the trained model with h5py

Hi,

This is an odd error that I'm getting on Linux (Ubuntu). When I try to run the predictor I get this error:

Running EqTransformer 0.1.56
 *** Loading the model ...
Traceback (most recent call last):
  File "run_OVtrans.py", line 67, in <module>
    main()
  File "run_OVtrans.py", line 57, in main
    OVT.makePredictor(trained_model='EqT_model.h5')
  File "/home/esteban/Work/EQtrans/C.Pacific/OV_transformer.py", line 94, in makePredictor
    number_of_cpus=10,
  File "/home/esteban/anaconda3/envs/eqt/lib/python3.7/site-packages/EQTransformer/core/predictor.py", line 227, in predictor
    'f1': f1
  File "/home/esteban/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/engine/saving.py", line 492, in load_wrapper
    return load_function(*args, **kwargs)
  File "/home/esteban/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/engine/saving.py", line 583, in load_model
    with H5Dict(filepath, mode='r') as h5dict:
  File "/home/esteban/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/utils/io_utils.py", line 191, in __init__
    self.data = h5py.File(path, mode=mode)
  File "/home/esteban/anaconda3/envs/eqt/lib/python3.7/site-packages/h5py/_hl/files.py", line 408, in __init__
    swmr=swmr)
  File "/home/esteban/anaconda3/envs/eqt/lib/python3.7/site-packages/h5py/_hl/files.py", line 173, in make_fid
    fid = h5f.open(name, flags, fapl=fapl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5f.pyx", line 88, in h5py.h5f.open
OSError: Unable to open file (file signature not found)

If I try to read the h5 file individually, I get the same OSError:

In [3]: h5py.File("EqT_model.h5", "r")

OSError Traceback (most recent call last)
in ()
----> 1 h5py.File("EqT_model.h5", "r")

/home/esteban/anaconda3/envs/eqt/lib/python3.7/site-packages/h5py/_hl/files.py in __init__(self, name, mode, driver, libver, userblock_size, swmr, rdcc_nslots, rdcc_nbytes, rdcc_w0, track_order, **kwds)
406 fid = make_fid(name, mode, userblock_size,
407 fapl, fcpl=make_fcpl(track_order=track_order),
--> 408 swmr=swmr)
409
410 if isinstance(libver, tuple):

/home/esteban/anaconda3/envs/eqt/lib/python3.7/site-packages/h5py/_hl/files.py in make_fid(name, mode, userblock_size, fapl, fcpl, swmr)
171 if swmr and swmr_support:
172 flags |= h5f.ACC_SWMR_READ
--> 173 fid = h5f.open(name, flags, fapl=fapl)
174 elif mode == 'r+':
175 fid = h5f.open(name, h5f.ACC_RDWR, fapl=fapl)

h5py/_objects.pyx in h5py._objects.with_phil.wrapper()

h5py/_objects.pyx in h5py._objects.with_phil.wrapper()

h5py/h5f.pyx in h5py.h5f.open()

OSError: Unable to open file (file signature not found)

What's weird is that it only shows up when I execute it on a Linux server; on my MacBook Pro, predictor works just fine.
I tried updating and downgrading h5py through Anaconda, but that didn't work.
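
"File signature not found" means the first bytes of the file are not the HDF5 magic number, which points at the copy of EqT_model.h5 on the Linux server (e.g. a truncated transfer or a Git LFS pointer file) rather than at h5py itself. A minimal check, assuming the model sits in the working directory:

import os
import h5py

path = 'EqT_model.h5'
print('size on disk   :', os.path.getsize(path), 'bytes')
print('valid HDF5 file:', h5py.is_hdf5(path))

# If is_hdf5() returns False, re-transfer the model to the server and compare
# checksums against the copy that loads fine on the MacBook.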

Increasing memory consumption when running on MSEED files

I'm running the model on MSEED files covering 1 month of data from 33 stations. Using mseed_predictor on CPU, the process gets killed around station 8. During the run I watched the system resources: memory consumption keeps increasing, and once memory is full the process crashes.

Running on Ubuntu 20.04
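
A workaround that sometimes helps when memory grows steadily across stations is to process the data in smaller pieces, each inside a short-lived subprocess, so memory is returned to the OS between pieces. A minimal sketch under that assumption; the split directory names, model file, and station JSON below describe a hypothetical local layout, not anything EQTransformer requires:

import multiprocessing as mp

def run_piece(input_dir, output_dir):
    # Import inside the child so all TensorFlow/Keras state dies with it.
    from EQTransformer.core.mseed_predictor import mseed_predictor
    mseed_predictor(input_dir=input_dir,
                    input_model='EqT_model.h5',
                    stations_json='station_list.json',
                    output_dir=output_dir,
                    detection_threshold=0.3, P_threshold=0.1, S_threshold=0.1,
                    overlap=0.3, batch_size=500)

if __name__ == '__main__':
    mp.set_start_method('spawn', force=True)        # fresh interpreter per piece
    input_dirs = ['mseeds_part1', 'mseeds_part2']   # hypothetical split of the month
    for d in input_dirs:
        p = mp.Process(target=run_piece, args=(d, 'detections_' + d))
        p.start()
        p.join()                                    # one piece at a time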

Detection error: first station works well, second station shows an error

Hi, when running detections (directly from mseed files) the first station in my mseeds directory works well, then I get the following error (I am using code very similar to the example in detections.ipynb).

02-26 08:43 [INFO] [EQTransformer] Finished the prediction in: 0 hours and 5 minutes and 49.27 seconds.
02-26 08:43 [INFO] [EQTransformer] *** Detected: 364 events.
02-26 08:43 [INFO] [EQTransformer] *** Wrote the results into --> " /Users/acosta/EQTransformer-master/examples/detections2/B057_outputs "

02-26 08:43 [INFO] [EQTransformer] Started working on B921, 2 out of 53 ...
02-26 08:43 [INFO] [EQTransformer] 20190901T000000Z__20190902T000000Z.mseed

KeyError Traceback (most recent call last)
/Users/acosta/anaconda3/envs/eqt/lib/python3.7/site-packages/EQTransformer/core/mseed_predictor.py in mseed2nparry(args, matching, time_slots, comp_types, st_name)
447 meta["instrument_type"]=st[0].stats.channel[:2]
--> 448 meta["network_code"]=stations_[st[0].stats.station]['network']
    449 meta["receiver_latitude"]=stations_[st[0].stats.station]['coords'][0]

KeyError: 'B921'

During handling of the above exception, another exception occurred:

KeyError Traceback (most recent call last)
in ()
14 overlap=0.3,
15 gpuid=None,
---> 16 gpu_limit=None)

/Users/acosta/anaconda3/envs/eqt/lib/python3.7/site-packages/EQTransformer/core/mseed_predictor.py in mseed_predictor(input_dir, input_model, stations_json, output_dir, detection_threshold, P_threshold, S_threshold, number_of_plots, plot_mode, loss_weights, loss_types, normalization_mode, batch_size, overlap, gpuid, gpu_limit, overwrite)
302 eqt_logger.info(f"{month}")
303 matching = [s for s in file_list if month in s]
--> 304 meta, time_slots, comp_types, data_set = _mseed2nparry(args, matching, time_slots, comp_types, st)
305
306 params_pred = {'batch_size': args['batch_size'],

/Users/acosta/anaconda3/envs/eqt/lib/python3.7/site-packages/EQTransformer/core/mseed_predictor.py in mseed2nparry(args, matching, time_slots, comp_types, st_name)
452 except Exception:
453 meta["receiver_code"]=st_name
--> 454 meta["instrument_type"]=stations_[st_name]['channels'][0][:2]
    455 meta["network_code"]=stations_[st_name]['network']
    456 meta["receiver_latitude"]=stations_[st_name]['coords'][0]

KeyError: 'B921'

I don't understand where this comes from, since the two stations' mseed files were downloaded in the same way and I have exactly the same channels for both of them.
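
Both tracebacks fail while indexing the stations dictionary built from the station JSON with the key 'B921', which suggests that station exists as an mseed folder but is missing from (or spelled differently in) station_list.json. A quick cross-check sketch; the mseed directory name below is an assumption:

import json
import os

mseed_dir = 'downloads_mseeds'                  # assumed mseed directory
with open('station_list.json') as f:
    listed = set(json.load(f).keys())           # station codes in the JSON

on_disk = {d for d in os.listdir(mseed_dir)
           if os.path.isdir(os.path.join(mseed_dir, d))}

print('on disk but missing from station_list.json:', sorted(on_disk - listed))
print('listed but not on disk                    :', sorted(listed - on_disk))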

H5PY 2.10.0 dependency

Hello,

I wanted to mention that I encountered a recent dependency issue when running a fresh install of eqt via the Conda method. While the latest version of h5py was installed, it appears that it does not work with the provided model. In order to get everything working, I had to uninstall h5py and run the command 'conda install h5py=2.10.0'.

Additionally, it appears that the current version of eqt does not contain the latest updates to the preprocessor module that allow for the specification of the 'preproc_dir'.

-Sincerely, Jesse

Error in EQTransformer Installation

Hi,

I tried to install EQTransformer with conda install -c smousavi05 eqtransformer
and it produced this error:

Solving environment: failed

UnsatisfiableError: The following specifications were found to be in conflict:
  - eqtransformer -> keras -> tensorflow[version='<2.0'] -> tensorboard[version='>=1.10.0,<1.11.0'] -> libgcc-ng[version='>=7.3.0'] -> _openmp_mutex[version='>=4.5'] -> openmp_impl==9999
  - libcblas -> libblas==3.9.0=8_openblas -> liblapack==3.9.0=8_openblas -> liblapacke==3.9.0=8_openblas
  - libcblas -> libblas==3.9.0=8_openblas -> libopenblas[version='>=0.3.12,<0.3.13.0a0,>=0.3.12,<1.0a0'] -> libgcc-ng[version='>=9.3.0'] -> _openmp_mutex[version='>=4.5'] -> openmp_impl==9999
  - libcblas -> libblas==3.9.0=8_openblas -> libopenblas[version='>=0.3.12,<0.3.13.0a0,>=0.3.12,<1.0a0'] -> openblas[version='>=0.3.12,<0.3.13.0a0']
Use "conda info <package>" to see the dependencies for each package.**

The specifications of my computer:

################## PC

NAME="Ubuntu"
VERSION="14.04.6 LTS, Trusty Tahr"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 14.04.6 LTS"
VERSION_ID="14.04"
HOME_URL="http://www.ubuntu.com/"
SUPPORT_URL="http://help.ubuntu.com/"
BUG_REPORT_URL="http://bugs.launchpad.net/ubuntu/"

################### conda info

active environment : eqt
    active env location : /home/ecastillo/anaconda3/envs/eqt
            shell level : 2
       user config file : /home/ecastillo/.condarc
 populated config files : /home/ecastillo/.condarc
          conda version : 4.5.11
    conda-build version : 3.15.1
         python version : 3.7.0.final.0
       base environment : /home/ecastillo/anaconda3  (writable)
           channel URLs : https://repo.anaconda.com/pkgs/main/linux-64
                          https://repo.anaconda.com/pkgs/main/noarch
                          https://repo.anaconda.com/pkgs/free/linux-64
                          https://repo.anaconda.com/pkgs/free/noarch
                          https://repo.anaconda.com/pkgs/r/linux-64
                          https://repo.anaconda.com/pkgs/r/noarch
                          https://repo.anaconda.com/pkgs/pro/linux-64
                          https://repo.anaconda.com/pkgs/pro/noarch
          package cache : /home/ecastillo/anaconda3/pkgs
                          /home/ecastillo/.conda/pkgs
       envs directories : /home/ecastillo/anaconda3/envs
                          /home/ecastillo/.conda/envs
               platform : linux-64
             user-agent : conda/4.5.11 requests/2.19.1 CPython/3.7.0 Linux/3.16.0-77-generic ubuntu/14.04 glibc/2.19
                UID:GID : 1004:1005
             netrc file : None
           offline mode : False

I have installed EQTransformer on Ubuntu 18.04 and it works fine there. However, I don't know if there are any restrictions on installing it on Ubuntu 14.04. Do you know what I should do to fix this error?

TypeError in Downloading Continuous Data in Tutorial

Hello Mostafa,

I am currently trying out your code while running:
MacOS Catalina==10.15.7
python==3.7.9
tensorflow==1.14.0
keras==2.3.1

When I come to the first part of the tutorial to import makeStationList I get this warning:

from EQTransformer.utils.downloader import makeStationList Using TensorFlow backend.

If I continue to attempt to use makeStationList, this error occurs:

makeStationList(client_list=["SCEDC"], 
                min_lat=35.50,
                max_lat=35.60, 
                min_lon=-117.80, 
                max_lon=-117.40, 
                start_time="2019-09-01 00:00:00.00", 
                end_time="2019-09-03 00:00:00.00", 
                channel_list=["HH[ZNE]", "HH[Z21]", "BH[ZNE]"],
                filter_network=["SY"],
                filter_station=[])
Traceback (most recent call last):

  File "<ipython-input-2-0a8c305af80c>", line 10, in <module>
    filter_station=[])

TypeError: makeStationList() missing 1 required positional argument: 'json_path'

Any suggestions? Thanks for your time and assistance.
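
The traceback says json_path is a required positional argument of makeStationList. A minimal corrected call under that reading, mirroring the working example further down this page (the JSON output location is an assumption):

import os
from EQTransformer.utils.downloader import makeStationList

json_basepath = os.path.join(os.getcwd(), "json/station_list.json")  # assumed output path

makeStationList(json_path=json_basepath,
                client_list=["SCEDC"],
                min_lat=35.50,
                max_lat=35.60,
                min_lon=-117.80,
                max_lon=-117.40,
                start_time="2019-09-01 00:00:00.00",
                end_time="2019-09-03 00:00:00.00",
                channel_list=["HH[ZNE]", "HH[Z21]", "BH[ZNE]"],
                filter_network=["SY"],
                filter_station=[])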

HDF5 error processing mseeds

When running the mseed preprocessor, i.e.:
preprocessor(mseed_dir='mseed/msd', stations_json='station_list.json',
             overlap=0.3, n_processor=40)

I get the following error:

Traceback (most recent call last):
  File "3_preprocess_mseed.py", line 4, in <module>
    overlap=0.3, n_processor=40)
  File "/home/shicks17/anaconda3/envs/eqt/lib/python3.7/site-packages/EQTransformer/utils/hdf5_maker.py", line 398, in preprocessor
    p.map(process, station_list)
  File "/home/shicks17/anaconda3/envs/eqt/lib/python3.7/multiprocessing/pool.py", line 268, in map
    return self._map_async(func, iterable, mapstar, chunksize).get()
  File "/home/shicks17/anaconda3/envs/eqt/lib/python3.7/multiprocessing/pool.py", line 657, in get
    raise self._value
  File "/home/shicks17/anaconda3/envs/eqt/lib/python3.7/multiprocessing/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))
  File "/home/shicks17/anaconda3/envs/eqt/lib/python3.7/multiprocessing/pool.py", line 44, in mapstar
    return list(map(*args))
  File "/home/shicks17/anaconda3/envs/eqt/lib/python3.7/site-packages/EQTransformer/utils/hdf5_maker.py", line 96, in process
    HDF.create_group("data")
  File "/home/shicks17/anaconda3/envs/eqt/lib/python3.7/site-packages/h5py/_hl/group.py", line 68, in create_group
    gid = h5g.create(self.id, name, lcpl=lcpl, gcpl=gcpl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5g.pyx", line 161, in h5py.h5g.create
ValueError: Unable to create group (name already exists)
HDF5: infinite loop closing library

I have tried running both the conda and compile-from-source versions of eqt, but neither solves the error. I am running h5py=2.10.0.
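
One possible cause of "Unable to create group (name already exists)" is output left behind by an earlier, interrupted run, so the "data" group already exists when the preprocessor opens the file again. A hypothetical cleanup sketch; the output directory name follows the "<mseed_dir>_processed_hdfs" pattern used elsewhere on this page and is an assumption, not a confirmed path:

import os
import shutil

stale = 'mseed/msd_processed_hdfs'   # assumed output directory of the previous run
if os.path.isdir(stale):
    shutil.rmtree(stale)
    print('removed stale output directory:', stale)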

Bug in the massdownloading of data in eqtransformer

I think I have found a bug in the data downloading.

Here is an example of the issue. Toggle the two lines that define the channel_list variable. The first line downloads all 3 components of GS.ID12.00.HH*; the second line does not: it downloads only the HHZ component and not the horizontals. This took me a while to figure out, but it definitely causes problems and does not work as one would expect given the tutorial examples.

# We are looking at this earthquake in this code: 
# https://earthquake.usgs.gov/earthquakes/eventpage/us6000delq/executive

import os
json_basepath = os.path.join(os.getcwd(),"json/station_list.json")
lon0=-115.136 # [deg] E
lat0=44.4603 # [deg] N
min_lat=lat0-2.5
max_lat=lat0+2.5
min_lon=lon0-3.0
max_lon=lon0+3.0
start_time="2021-02-05 00:07:00.00"
stop_time="2021-02-06 00:08:00.00"

# compare the download results using these two lines (the second would be the example in the tutorial, which does not work).
# channel_list=["HH[ZNE12]"]
channel_list=["HH[ZNE]","HH[Z12]"]

# Create the station list for just GS.ID12
from EQTransformer.utils.downloader import makeStationList
makeStationList(json_path=json_basepath, client_list=["IRIS"], min_lat=min_lat, max_lat=max_lat,
                             min_lon=min_lon, max_lon=max_lon, start_time=start_time, end_time=stop_time,
                             channel_list=channel_list,
                             filter_network=["GM","GD","NP","NQ","RE","UI","UU","SY",  "MB","IE","XP","IW","UM","US","UW"], # skip these networks
                             filter_station=["ID11","ID13","MT04"])

# Download the data in miniseed format
from EQTransformer.utils.downloader import downloadMseeds
downloadMseeds(client_list=["IRIS"], stations_json=json_basepath, output_dir="downloads_mseeds",
                                min_lat=min_lat, max_lat=max_lat, min_lon=min_lon, max_lon=max_lon,
                                start_time=start_time, end_time=stop_time, chunk_size=1,  channel_list=channel_list,
                                n_processor=1)

Is this a bug, or is this behaving as intended?

"unable to open file: name = 'EqT_model.h5'", but I have h5py==2.10.0

Sorry for the long post, but I figured more information was better than too little. I listed all of my environment's packages at the end.

While working through the tutorial, I have reached the point where the predictor() function is called for the first option of detecting and picking.

from EQTransformer.core.predictor import predictor

predictor(input_dir= 'downloads_mseeds_processed_hdfs', input_model='EqT_model.h5', output_dir='detections', detection_threshold=0.3, P_threshold=0.1, S_threshold=0.1, number_of_plots=100, plot_mode='time')

When I run this function, I receive this error:

: predictor(input_dir= 'downloads_mseeds_processed_hdfs', 
          input_model='EqT_model.h5', 
          output_dir='detections', 
          detection_threshold=0.3, 
          P_threshold=0.1, 
          S_threshold=0.1, 
          number_of_plots=100, 
          plot_mode='time')
============================================================================
Running EqTransformer  0.1.59
 *** Loading the model ...
Traceback (most recent call last):

  File "<ipython-input-7-e1b33207579b>", line 8, in <module>
    plot_mode='time')

  File "/Users/camithomas/opt/anaconda3/envs/eqt/lib/python3.7/site-packages/EQTransformer/core/predictor.py", line 233, in predictor
    'f1': f1

  File "/Users/camithomas/opt/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/engine/saving.py", line 492, in load_wrapper
    return load_function(*args, **kwargs)

  File "/Users/camithomas/opt/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/engine/saving.py", line 583, in load_model
    with H5Dict(filepath, mode='r') as h5dict:

  File "/Users/camithomas/opt/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/utils/io_utils.py", line 191, in __init__
    self.data = h5py.File(path, mode=mode)

  File "/Users/camithomas/opt/anaconda3/envs/eqt/lib/python3.7/site-packages/h5py/_hl/files.py", line 408, in __init__
    swmr=swmr)

  File "/Users/camithomas/opt/anaconda3/envs/eqt/lib/python3.7/site-packages/h5py/_hl/files.py", line 173, in make_fid
    fid = h5f.open(name, flags, fapl=fapl)

  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper

  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper

  File "h5py/h5f.pyx", line 88, in h5py.h5f.open

OSError: Unable to open file (unable to open file: name = 'EqT_model.h5', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)

From other issue posts, I had assumed this was a result of h5py needing to be downgraded, but I checked my version and it matches the suggested h5py==2.10.0. Any suggestions? When I checked the downloads_mseeds_processed_hdfs folder, there was no EqT_model.h5. Should this model be created and populated in this directory automatically, or do I need to grab the model from a different directory?
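
The model is not generated by the preprocessor; it ships with the repository, and other snippets on this page load it from a ModelsAndSampleData folder. A minimal sketch that passes an absolute path to predictor(); the clone location below is an assumption:

import os
from EQTransformer.core.predictor import predictor

# Assumed location of the repository clone; adjust to wherever it lives.
model_path = os.path.expanduser('~/EQTransformer/ModelsAndSampleData/EqT_model.h5')
assert os.path.isfile(model_path), 'EqT_model.h5 not found at ' + model_path

predictor(input_dir='downloads_mseeds_processed_hdfs',
          input_model=model_path,
          output_dir='detections',
          detection_threshold=0.3,
          P_threshold=0.1,
          S_threshold=0.1,
          number_of_plots=100,
          plot_mode='time')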

Running package versions:

pip list
Package                       Version
----------------------------- -------------------
absl-py                       0.11.0
alabaster                     0.7.12
appdirs                       1.4.4
applaunchservices             0.2.1
appnope                       0.1.2
argh                          0.26.2
argon2-cffi                   20.1.0
astor                         0.8.1
astroid                       2.4.2
astunparse                    1.6.3
async-generator               1.10
atomicwrites                  1.4.0
attrs                         20.3.0
autopep8                      1.5.4
Babel                         2.9.0
backcall                      0.2.0
black                         19.10b0
bleach                        3.2.3
brotlipy                      0.7.0
cachetools                    4.2.1
certifi                       2020.12.5
cffi                          1.14.4
chardet                       4.0.0
click                         7.1.2
cloudpickle                   1.6.0
colorama                      0.4.4
cryptography                  3.3.1
cycler                        0.10.0
decorator                     4.4.2
defusedxml                    0.6.0
diff-match-patch              20200713
docutils                      0.16
entrypoints                   0.3
EQTransformer                 0.1.59
flake8                        3.8.4
flatbuffers                   1.12
future                        0.18.2
gast                          0.2.2
google-auth                   1.24.0
google-auth-oauthlib          0.4.2
google-pasta                  0.2.0
grpcio                        1.32.0
h5py                          2.10.0
idna                          2.10
imagesize                     1.2.0
importlib-metadata            3.4.0
iniconfig                     1.1.1
intervaltree                  3.1.0
ipykernel                     5.4.3
ipython                       7.19.0
ipython-genutils              0.2.0
ipywidgets                    7.6.3
isort                         5.6.4
jedi                          0.18.0
Jinja2                        2.11.2
jsonschema                    3.2.0
jupyter                       1.0.0
jupyter-client                6.1.11
jupyter-console               6.2.0
jupyter-core                  4.7.0
jupyterlab-pygments           0.1.2
jupyterlab-widgets            1.0.0
Keras                         2.3.1
Keras-Applications            1.0.8
Keras-Preprocessing           1.1.2
keyring                       22.0.1
kiwisolver                    1.3.1
lazy-object-proxy             1.4.3
lxml                          4.6.2
Markdown                      3.3.3
MarkupSafe                    1.1.1
matplotlib                    3.3.4
mccabe                        0.6.1
mistune                       0.8.4
mypy-extensions               0.4.3
nbclient                      0.5.1
nbconvert                     6.0.7
nbformat                      5.1.2
nest-asyncio                  1.5.1
notebook                      6.2.0
numpy                         1.19.5
numpydoc                      1.1.0
oauthlib                      3.1.0
obspy                         1.2.2
opt-einsum                    3.3.0
packaging                     20.8
pandas                        1.2.1
pandocfilters                 1.4.3
parso                         0.8.1
pathspec                      0.7.0
pathtools                     0.1.2
pexpect                       4.8.0
pickleshare                   0.7.5
Pillow                        8.1.0
pip                           20.3.3
pkginfo                       1.7.0
pluggy                        0.13.1
prometheus-client             0.9.0
prompt-toolkit                3.0.14
protobuf                      3.14.0
psutil                        5.7.2
ptyprocess                    0.7.0
py                            1.10.0
pyasn1                        0.4.8
pyasn1-modules                0.2.8
pycodestyle                   2.6.0
pycparser                     2.20
pydocstyle                    5.1.1
pyflakes                      2.2.0
Pygments                      2.7.4
pylint                        2.6.0
pyls-black                    0.4.6
pyls-spyder                   0.3.0
pyOpenSSL                     20.0.1
pyparsing                     2.4.7
pyrsistent                    0.17.3
PySocks                       1.7.1
pytest                        6.2.2
python-dateutil               2.8.1
python-jsonrpc-server         0.4.0
python-language-server        0.36.2
pytz                          2020.5
PyYAML                        5.4.1
pyzmq                         22.0.0
QDarkStyle                    2.8.1
QtAwesome                     1.0.1
qtconsole                     5.0.2
QtPy                          1.9.0
regex                         2020.11.13
requests                      2.25.1
requests-oauthlib             1.3.0
rope                          0.18.0
rsa                           4.7
Rtree                         0.9.4
scipy                         1.4.1
Send2Trash                    1.5.0
setuptools                    52.0.0.post20210125
six                           1.15.0
snowballstemmer               2.1.0
sortedcontainers              2.3.0
Sphinx                        3.4.3
sphinxcontrib-applehelp       1.0.2
sphinxcontrib-devhelp         1.0.2
sphinxcontrib-htmlhelp        1.0.3
sphinxcontrib-jsmath          1.0.1
sphinxcontrib-qthelp          1.0.3
sphinxcontrib-serializinghtml 1.1.4
spyder                        4.2.0
spyder-kernels                1.10.1
SQLAlchemy                    1.3.22
tensorboard                   2.0.2
tensorboard-plugin-wit        1.8.0
tensorflow                    2.0.0
tensorflow-estimator          2.0.1
termcolor                     1.1.0
terminado                     0.9.2
testpath                      0.4.4
three-merge                   0.1.1
toml                          0.10.2
tornado                       6.1
tqdm                          4.48.0
traitlets                     5.0.5
typed-ast                     1.4.2
typing-extensions             3.7.4.3
ujson                         4.0.2
urllib3                       1.26.3
watchdog                      0.10.4
wcwidth                       0.2.5
webencodings                  0.5.1
Werkzeug                      1.0.1
wheel                         0.36.2
widgetsnbextension            3.5.1
wrapt                         1.12.1
wurlitzer                     2.0.1
yapf                          0.30.0
zipp                          3.4.0

Got error using predictor

Hello,
I'm trying to reproduce exactly the example in the repository, but I'm receiving an error when using predictor. Attached below is the traceback I receive.

from EQTransformer.core.predictor import predictor
predictor(input_dir='downloads_mseeds_processed_hdfs',   
         input_model='../ModelsAndSampleData/EqT_model.h5',
         output_dir='detections1',
         estimate_uncertainty=False, 
         output_probabilities=False,
         number_of_sampling=5,
         loss_weights=[0.02, 0.40, 0.58],          
         detection_threshold=0.3,                
         P_threshold=0.1,
         S_threshold=0.1, 
         number_of_plots=10,
         plot_mode='time',
         batch_size=500,
         number_of_cpus=4,
         keepPS=False)
============================================================================
Running EqTransformer  0.1.56
 *** Loading the model ...

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-29-21f624fa5cc1> in <module>
     14          batch_size=500,
     15          number_of_cpus=4,
---> 16          keepPS=False)

~/anaconda3/envs/eqt/lib/python3.7/site-packages/EQTransformer/core/predictor.py in predictor(input_dir, input_model, output_dir, output_probabilities, detection_threshold, P_threshold, S_threshold, number_of_plots, plot_mode, estimate_uncertainty, number_of_sampling, loss_weights, loss_types, input_dimention, normalization_mode, batch_size, gpuid, gpu_limit, number_of_cpus, use_multiprocessing, keepPS, spLimit)
    225                                        'FeedForward': FeedForward,
    226                                        'LayerNormalization': LayerNormalization,
--> 227                                        'f1': f1
    228                                         })
    229     model.compile(loss = args['loss_types'],

~/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/engine/saving.py in load_wrapper(*args, **kwargs)
    490                 os.remove(tmp_filepath)
    491             return res
--> 492         return load_function(*args, **kwargs)
    493 
    494     return load_wrapper

~/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/engine/saving.py in load_model(filepath, custom_objects, compile)
    582     if H5Dict.is_supported_type(filepath):
    583         with H5Dict(filepath, mode='r') as h5dict:
--> 584             model = _deserialize_model(h5dict, custom_objects, compile)
    585     elif hasattr(filepath, 'write') and callable(filepath.write):
    586         def load_function(h5file):

~/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/engine/saving.py in _deserialize_model(h5dict, custom_objects, compile)
    271     if model_config is None:
    272         raise ValueError('No model found in config.')
--> 273     model_config = json.loads(model_config.decode('utf-8'))
    274     model = model_from_config(model_config, custom_objects=custom_objects)
    275     model_weights_group = h5dict['model_weights']

AttributeError: 'str' object has no attribute 'decode'

I'm running Ubuntu 18.04.5 LTS with the following conda environment named eqt:

# Name                    Version                   Build  Channel
_libgcc_mutex             0.1                 conda_forge    conda-forge
_openmp_mutex             4.5                       1_gnu    conda-forge
absl-py                   0.11.0           py37h89c1867_0    conda-forge
argon2-cffi               20.1.0           py37h4abf009_2    conda-forge
astor                     0.8.1              pyh9f0ad1d_0    conda-forge
async_generator           1.10                       py_0    conda-forge
attrs                     20.3.0             pyhd3deb0d_0    conda-forge
backcall                  0.2.0              pyh9f0ad1d_0    conda-forge
backports                 1.0                        py_2    conda-forge
backports.functools_lru_cache 1.6.1                      py_0    conda-forge
binutils_impl_linux-64    2.35.1               h193b22a_1    conda-forge
binutils_linux-64         2.35                hc3fd857_29    conda-forge
bleach                    3.2.1              pyh9f0ad1d_0    conda-forge
brotlipy                  0.7.0           py37hb5d75c8_1001    conda-forge
c-ares                    1.17.1               h36c2ea0_0    conda-forge
ca-certificates           2020.12.5            ha878542_0    conda-forge
cached-property           1.5.1                      py_0    conda-forge
certifi                   2020.12.5        py37h89c1867_0    conda-forge
cffi                      1.14.4           py37hc58025e_1    conda-forge
chardet                   4.0.0            py37h89c1867_0    conda-forge
cryptography              3.3.1            py37h7f0c10b_0    conda-forge
cycler                    0.10.0                     py_2    conda-forge
dbus                      1.13.6               hfdff14a_1    conda-forge
decorator                 4.4.2                      py_0    conda-forge
defusedxml                0.6.0                      py_0    conda-forge
entrypoints               0.3             pyhd8ed1ab_1003    conda-forge
eqtransformer             0.1.57                   py37_0    smousavi05
expat                     2.2.9                he1b5a44_2    conda-forge
fontconfig                2.13.1            h736d332_1003    conda-forge
freetype                  2.10.4               h7ca028e_0    conda-forge
future                    0.18.2           py37h89c1867_2    conda-forge
gast                      0.4.0              pyh9f0ad1d_0    conda-forge
gcc_impl_linux-64         7.5.0               hda68d29_13    conda-forge
gcc_linux-64              7.5.0               he2a3fca_29    conda-forge
gettext                   0.19.8.1          h0b5b191_1005    conda-forge
glib                      2.66.4               hcd2ae1e_1    conda-forge
google-pasta              0.2.0              pyh8c360ce_0    conda-forge
grpcio                    1.34.0           py37hb27c1af_0    conda-forge
gst-plugins-base          1.14.5               h0935bb2_2    conda-forge
gstreamer                 1.18.2               h3560a44_1    conda-forge
gxx_impl_linux-64         7.5.0               h64c220c_13    conda-forge
gxx_linux-64              7.5.0               h547f3ba_29    conda-forge
h5py                      3.1.0           nompi_py37h1e651dc_100    conda-forge
hdf5                      1.10.6          nompi_h7c3c948_1111    conda-forge
icu                       68.1                 h58526e2_0    conda-forge
idna                      2.10               pyh9f0ad1d_0    conda-forge
importlib-metadata        3.3.0            py37h89c1867_2    conda-forge
importlib_metadata        3.3.0                hd8ed1ab_2    conda-forge
iniconfig                 1.1.1              pyh9f0ad1d_0    conda-forge
ipykernel                 5.4.2            py37h888b3d9_0    conda-forge
ipython                   7.19.0           py37h888b3d9_0    conda-forge
ipython_genutils          0.2.0                      py_1    conda-forge
ipywidgets                7.6.2              pyhd3deb0d_0    conda-forge
jedi                      0.18.0           py37h89c1867_1    conda-forge
jeepney                   0.6.0              pyhd8ed1ab_0    conda-forge
jinja2                    2.11.2             pyh9f0ad1d_0    conda-forge
jpeg                      9d                   h36c2ea0_0    conda-forge
jsonschema                3.2.0                      py_2    conda-forge
jupyter                   1.0.0                      py_2    conda-forge
jupyter_client            6.1.7                      py_0    conda-forge
jupyter_console           6.2.0                      py_0    conda-forge
jupyter_core              4.7.0            py37h89c1867_0    conda-forge
jupyterlab_pygments       0.1.2              pyh9f0ad1d_0    conda-forge
jupyterlab_widgets        1.0.0              pyhd8ed1ab_1    conda-forge
keras                     2.3.1                    py37_0    conda-forge
keras-applications        1.0.8                      py_1    conda-forge
keras-preprocessing       1.1.0                      py_0    conda-forge
kernel-headers_linux-64   2.6.32              h77966d4_13    conda-forge
keyring                   21.7.0           py37h89c1867_0    conda-forge
kiwisolver                1.3.1            py37hc928c03_0    conda-forge
krb5                      1.17.2               h926e7f8_0    conda-forge
lcms2                     2.11                 hcbb858e_1    conda-forge
ld_impl_linux-64          2.35.1               hea4e1c9_1    conda-forge
libblas                   3.9.0                3_openblas    conda-forge
libcblas                  3.9.0                3_openblas    conda-forge
libclang                  11.0.0          default_ha5c780c_2    conda-forge
libcurl                   7.71.1               hcdd3856_8    conda-forge
libedit                   3.1.20191231         he28a2e2_2    conda-forge
libev                     4.33                 h516909a_1    conda-forge
libevent                  2.1.10               hcdb4288_3    conda-forge
libffi                    3.3                  h58526e2_2    conda-forge
libgcc-ng                 9.3.0               h5dbcf3e_17    conda-forge
libgfortran-ng            7.5.0               hae1eefd_17    conda-forge
libgfortran4              7.5.0               hae1eefd_17    conda-forge
libglib                   2.66.4               h164308a_1    conda-forge
libgomp                   9.3.0               h5dbcf3e_17    conda-forge
libgpuarray               0.7.6             h14c3975_1003    conda-forge
libiconv                  1.16                 h516909a_0    conda-forge
liblapack                 3.9.0                3_openblas    conda-forge
libllvm11                 11.0.0               he513fc3_0    conda-forge
libnghttp2                1.41.0               h8cfc5f6_2    conda-forge
libopenblas               0.3.12          pthreads_hb3c22a3_1    conda-forge
libpng                    1.6.37               h21135ba_2    conda-forge
libpq                     12.3                 h255efa7_3    conda-forge
libprotobuf               3.14.0               h780b84a_0    conda-forge
libsodium                 1.0.18               h36c2ea0_1    conda-forge
libssh2                   1.9.0                hab1572f_5    conda-forge
libstdcxx-ng              9.3.0               h2ae2ef3_17    conda-forge
libtiff                   4.2.0                hdc55705_0    conda-forge
libuuid                   2.32.1            h7f98852_1000    conda-forge
libwebp-base              1.1.0                h36c2ea0_3    conda-forge
libxcb                    1.13              h14c3975_1002    conda-forge
libxkbcommon              1.0.3                he3ba5ed_0    conda-forge
libxml2                   2.9.10               h72842e0_3    conda-forge
libxslt                   1.1.33               h15afd5d_2    conda-forge
lxml                      4.6.2            py37h77fd288_0    conda-forge
lz4-c                     1.9.2                he1b5a44_3    conda-forge
mako                      1.1.3              pyh9f0ad1d_0    conda-forge
markdown                  3.3.3              pyh9f0ad1d_0    conda-forge
markupsafe                1.1.1            py37hb5d75c8_2    conda-forge
matplotlib                3.3.3            py37h89c1867_0    conda-forge
matplotlib-base           3.3.3            py37h4f6019d_0    conda-forge
mistune                   0.8.4           py37h4abf009_1002    conda-forge
more-itertools            8.6.0              pyhd8ed1ab_0    conda-forge
mysql-common              8.0.22               ha770c72_1    conda-forge
mysql-libs                8.0.22               h1fd7589_1    conda-forge
nbclient                  0.5.1                      py_0    conda-forge
nbconvert                 6.0.7            py37h89c1867_3    conda-forge
nbformat                  5.0.8                      py_0    conda-forge
ncurses                   6.2                  h58526e2_4    conda-forge
nest-asyncio              1.4.3              pyhd8ed1ab_0    conda-forge
notebook                  6.1.6            py37h89c1867_0    conda-forge
nspr                      4.29                 he1b5a44_1    conda-forge
nss                       3.60                 hb5efdd6_0    conda-forge
numpy                     1.19.4           py37haa41c4c_2    conda-forge
obspy                     1.2.2            py37h03ebfcd_0    conda-forge
olefile                   0.46               pyh9f0ad1d_1    conda-forge
openssl                   1.1.1i               h7f98852_0    conda-forge
packaging                 20.8               pyhd3deb0d_0    conda-forge
pandas                    1.2.0            py37hdc94413_0    conda-forge
pandoc                    2.11.3.1             h7f98852_0    conda-forge
pandocfilters             1.4.2                      py_1    conda-forge
parso                     0.8.1              pyhd8ed1ab_0    conda-forge
pcre                      8.44                 he1b5a44_0    conda-forge
pexpect                   4.8.0              pyh9f0ad1d_2    conda-forge
pickleshare               0.7.5                   py_1003    conda-forge
pillow                    8.0.1            py37h63a5d19_0    conda-forge
pip                       20.3.3             pyhd8ed1ab_0    conda-forge
pkginfo                   1.6.1              pyh9f0ad1d_0    conda-forge
pluggy                    0.13.1           py37he5f6b98_3    conda-forge
prometheus_client         0.9.0              pyhd3deb0d_0    conda-forge
prompt-toolkit            3.0.8              pyha770c72_0    conda-forge
prompt_toolkit            3.0.8                hd8ed1ab_0    conda-forge
protobuf                  3.14.0           py37hcd2ae1e_0    conda-forge
pthread-stubs             0.4               h36c2ea0_1001    conda-forge
ptyprocess                0.6.0                   py_1001    conda-forge
py                        1.10.0             pyhd3deb0d_0    conda-forge
pycparser                 2.20               pyh9f0ad1d_2    conda-forge
pygments                  2.7.3              pyhd8ed1ab_0    conda-forge
pygpu                     0.7.6           py37h161383b_1002    conda-forge
pyopenssl                 20.0.1             pyhd8ed1ab_0    conda-forge
pyparsing                 2.4.7              pyh9f0ad1d_0    conda-forge
pyqt                      5.12.3           py37h89c1867_6    conda-forge
pyqt-impl                 5.12.3           py37he336c9b_6    conda-forge
pyqt5-sip                 4.19.18          py37hcd2ae1e_6    conda-forge
pyqtchart                 5.12             py37he336c9b_6    conda-forge
pyqtwebengine             5.12.1           py37he336c9b_6    conda-forge
pyrsistent                0.17.3           py37h4abf009_1    conda-forge
pysocks                   1.7.1            py37he5f6b98_2    conda-forge
pytest                    6.2.1            py37h89c1867_0    conda-forge
python                    3.7.9           hffdb5ce_0_cpython    conda-forge
python-dateutil           2.8.1                      py_0    conda-forge
python_abi                3.7                     1_cp37m    conda-forge
pytz                      2020.5             pyhd8ed1ab_0    conda-forge
pyyaml                    5.3.1            py37hb5d75c8_1    conda-forge
pyzmq                     20.0.0           py37h5a562af_1    conda-forge
qt                        5.12.9               h9d6b050_2    conda-forge
qtconsole                 5.0.1              pyhd8ed1ab_0    conda-forge
qtpy                      1.9.0                      py_0    conda-forge
readline                  8.0                  he28a2e2_2    conda-forge
requests                  2.25.1             pyhd3deb0d_0    conda-forge
scipy                     1.4.1            py37ha3d9a3c_3    conda-forge
secretstorage             3.3.0            py37h89c1867_0    conda-forge
send2trash                1.5.0                      py_0    conda-forge
setuptools                49.6.0           py37he5f6b98_2    conda-forge
six                       1.15.0             pyh9f0ad1d_0    conda-forge
sqlalchemy                1.3.22           py37h5e8e339_0    conda-forge
sqlite                    3.34.0               h74cdb3f_0    conda-forge
sysroot_linux-64          2.12                h77966d4_13    conda-forge
tensorboard               1.14.0                   py37_0    conda-forge
tensorflow                1.14.0               h4531e10_0    conda-forge
tensorflow-base           1.14.0           py37h4531e10_0    conda-forge
tensorflow-estimator      1.14.0           py37h5ca1d4c_0    conda-forge
termcolor                 1.1.0                      py_2    conda-forge
terminado                 0.9.1            py37h89c1867_1    conda-forge
testpath                  0.4.4                      py_0    conda-forge
theano                    1.0.5            py37h745909e_1    conda-forge
tk                        8.6.10               h21135ba_1    conda-forge
toml                      0.10.2             pyhd8ed1ab_0    conda-forge
tornado                   6.1              py37h4abf009_0    conda-forge
tqdm                      4.55.0             pyhd8ed1ab_0    conda-forge
traitlets                 5.0.5                      py_0    conda-forge
typing_extensions         3.7.4.3                    py_0    conda-forge
urllib3                   1.26.2             pyhd8ed1ab_0    conda-forge
wcwidth                   0.2.5              pyh9f0ad1d_2    conda-forge
webencodings              0.5.1                      py_1    conda-forge
werkzeug                  1.0.1              pyh9f0ad1d_0    conda-forge
wheel                     0.36.2             pyhd3deb0d_0    conda-forge
widgetsnbextension        3.5.1            py37h89c1867_4    conda-forge
wrapt                     1.12.1           py37h4abf009_2    conda-forge
xorg-libxau               1.0.9                h14c3975_0    conda-forge
xorg-libxdmcp             1.1.3                h516909a_0    conda-forge
xz                        5.2.5                h516909a_1    conda-forge
yaml                      0.2.5                h516909a_0    conda-forge
zeromq                    4.3.3                h58526e2_3    conda-forge
zipp                      3.4.0                      py_0    conda-forge
zlib                      1.2.11            h516909a_1010    conda-forge
zstd                      1.4.5                h6597ccf_2    conda-forge
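
The environment above lists h5py 3.1.0 alongside keras 2.3.1. Keras 2.3.x's load_model expects the h5py 2.x behaviour in which the stored model_config comes back as bytes; with h5py >= 3 it comes back as str and the .decode('utf-8') call fails exactly as in the traceback. A quick check, assuming that mismatch is the cause here:

import h5py
import keras

print('h5py :', h5py.__version__)    # 3.1.0 in the environment listed above
print('keras:', keras.__version__)   # 2.3.1

# If h5py is >= 3, downgrading it (e.g. to 2.10.0, as recommended in the
# "H5PY 2.10.0 dependency" issue earlier on this page) is one known workaround.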

Different number of detected events for multiple runs

Hello,

I'm running this algorithm on Ubuntu 20.04.1 LTS, and I installed it via conda.

I found an interesting thing about EQTransformer. I get a different number of detected events each time I run the algorithm, even with the same model, the same input mseed files, and the same parameters. I'm confused about this. As an example, I first download mseed files as in section 3.1 of the tutorial document, then:

$ conda activate eqt
(eqt)

yukuan @ x86_64-conda-linux-gnu in ~/mywork/EQT_test [20:28:17]

$ python
Python 3.7.8 | packaged by conda-forge | (default, Jul 31 2020, 02:25:08)
[GCC 7.5.0] on linux
Type "help", "copyright", "credits" or "license" for more information.

from EQTransformer.core.mseed_predictor import mseed_predictor
Using TensorFlow backend.
/home/yukuan/anaconda3/envs/eqt/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:516: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint8 = np.dtype([("qint8", np.int8, 1)])
/home/yukuan/anaconda3/envs/eqt/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:517: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint8 = np.dtype([("quint8", np.uint8, 1)])
/home/yukuan/anaconda3/envs/eqt/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:518: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint16 = np.dtype([("qint16", np.int16, 1)])
/home/yukuan/anaconda3/envs/eqt/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:519: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint16 = np.dtype([("quint16", np.uint16, 1)])
/home/yukuan/anaconda3/envs/eqt/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:520: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint32 = np.dtype([("qint32", np.int32, 1)])
/home/yukuan/anaconda3/envs/eqt/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:525: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
np_resource = np.dtype([("resource", np.ubyte, 1)])
/home/yukuan/anaconda3/envs/eqt/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:541: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint8 = np.dtype([("qint8", np.int8, 1)])
/home/yukuan/anaconda3/envs/eqt/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:542: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint8 = np.dtype([("quint8", np.uint8, 1)])
/home/yukuan/anaconda3/envs/eqt/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:543: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint16 = np.dtype([("qint16", np.int16, 1)])
/home/yukuan/anaconda3/envs/eqt/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:544: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_quint16 = np.dtype([("quint16", np.uint16, 1)])
/home/yukuan/anaconda3/envs/eqt/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:545: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
_np_qint32 = np.dtype([("qint32", np.int32, 1)])
/home/yukuan/anaconda3/envs/eqt/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:550: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.
np_resource = np.dtype([("resource", np.ubyte, 1)])
mseed_predictor(input_dir= 'downloads_mseeds', input_model='EqT_model.h5', stations_json='station_list.json', output_dir='detections_1', detection_threshold=0.3, P_threshold=0.1, S_threshold=0.1, number_of_plots=99999, plot_mode='time_frequency', overlap=0.3, batch_size=500)
============================================================================
Running EqTransformer 0.1.56
*** Loading the model ...
WARNING:tensorflow:From /home/yukuan/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:4070: The name tf.nn.max_pool is deprecated. Please use tf.nn.max_pool2d instead.

2021-01-21 20:30:19.969596: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX AVX2 FMA
2021-01-21 20:30:20.160487: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 3391955000 Hz
2021-01-21 20:30:20.162329: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x560bd6df2f90 executing computations on platform Host. Devices:
2021-01-21 20:30:20.162375: I tensorflow/compiler/xla/service/service.cc:175] StreamExecutor device (0): ,
2021-01-21 20:30:20.778364: W tensorflow/compiler/jit/mark_for_compilation_pass.cc:1412] (One-time warning): Not using XLA:CPU for cluster because envvar TF_XLA_FLAGS=--tf_xla_cpu_global_jit was not set. If you want XLA:CPU, either set that envvar, or use experimental_jit_scope to enable XLA:CPU. To confirm that XLA is active, pass --vmodule=xla_compilation_cache=1 (as a proper command-line flag, not via TF_XLA_FLAGS) or set the envvar XLA_FLAGS=--xla_hlo_profile.
WARNING:tensorflow:From /home/yukuan/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:422: The name tf.global_variables is deprecated. Please use tf.compat.v1.global_variables instead.

*** Loading is complete!
######### There are files for 3 stations in downloads_mseeds directory. #########
========= Started working on B921, 1 out of 3 ...
20190901T000000Z__20190902T000000Z.mseed
20190902T000000Z__20190903T000000Z.mseed

*** Finished the prediction in: 0 hours and 16 minutes and 17.51 seconds.
*** Detected: 2868 events.
*** Wrote the results into --> " /home/yukuan/mywork/EQT_test/detections_1/B921_outputs "
========= Started working on CA06, 2 out of 3 ...
20190901T000000Z__20190902T000000Z.mseed
20190902T000000Z__20190903T000000Z.mseed

*** Finished the prediction in: 0 hours and 15 minutes and 52.55 seconds.
*** Detected: 2804 events.
*** Wrote the results into --> " /home/yukuan/mywork/EQT_test/detections_1/CA06_outputs "
========= Started working on SV08, 3 out of 3 ...
20190901T000000Z__20190902T000000Z.mseed
20190902T000000Z__20190903T000000Z.mseed

*** Finished the prediction in: 0 hours and 9 minutes and 59.59 seconds.
*** Detected: 1619 events.
*** Wrote the results into --> " /home/yukuan/mywork/EQT_test/detections_1/SV08_outputs "

mseed_predictor(input_dir= 'downloads_mseeds', input_model='EqT_model.h5', stations_json='station_list.json', output_dir='detections_2', detection_threshold=0.3, P_threshold=0.1, S_threshold=0.1, number_of_plots=99999, plot_mode='time_frequency', overlap=0.3, batch_size=500)
============================================================================
Running EqTransformer 0.1.56
*** Loading the model ...
*** Loading is complete!
######### There are files for 3 stations in downloads_mseeds directory. #########
========= Started working on B921, 1 out of 3 ...
20190901T000000Z__20190902T000000Z.mseed
20190902T000000Z__20190903T000000Z.mseed

*** Finished the prediction in: 0 hours and 16 minutes and 36.25 seconds.
*** Detected: 2835 events.
*** Wrote the results into --> " /home/yukuan/mywork/EQT_test/detections_2/B921_outputs "
========= Started working on CA06, 2 out of 3 ...
20190901T000000Z__20190902T000000Z.mseed
20190902T000000Z__20190903T000000Z.mseed

*** Finished the prediction in: 0 hours and 16 minutes and 0.47 seconds.
*** Detected: 2777 events.
*** Wrote the results into --> " /home/yukuan/mywork/EQT_test/detections_2/CA06_outputs "
========= Started working on SV08, 3 out of 3 ...
20190901T000000Z__20190902T000000Z.mseed
20190902T000000Z__20190903T000000Z.mseed

*** Finished the prediction in: 0 hours and 10 minutes and 3.82 seconds.
*** Detected: 1603 events.
*** Wrote the results into --> " /home/yukuan/mywork/EQT_test/detections_2/SV08_outputs "


We can see that we get 2868 detections for station B921 in the first run but only 2835 detections in the second run, even though the input data and parameters are identical.

Installation error

Dear @smousavi05, the following error occurred during installation using conda:

(eqt) saeed@saeed-P453UJ:~$ conda install -c smousavi05 eqtransformer
Collecting package metadata (current_repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: -
Found conflicts! Looking for incompatible packages.
This can take several minutes. Press CTRL-C to abort.
failed

UnsatisfiableError: The following specifications were found to be incompatible with each other:

Output in format: Requested package -> Available versions

It seems there are some conflicts! (I'm using Python 3.7.10.)

STEAD data formatting and station component recognition

STEAD

I am attempting to wrangle our data outputs into the STEAD format to create our own training model. Are all variables in the format required, or can some columns be left as NA?

Station Component Recognition

Also, when I downloaded the station list via makeStationList(), the list shows that a number of the stations are 3-component stations. I made sure to select the channels matching the components listed in station_list when I run downloadMseeds(). However, when I view the plot that color-codes each station's data over time (red for 3 components and purple for 1 component), all of my stations appear as 1 component. I am a bit confused, as the station_list said many of the stations were 3 component and I selected those channels in the function as instructed. Any advice?

Y2000.phs file

Hi, I think there is a small bug in the line that writes the event information (lines 411-414 in associator.py).

Y2000_writer.write("%4d%2d%2d%2d%2d%4.2f%2.0f%1s%4.2f%3.0f%1s%4.2f%5.2f%3.2f\n"%
                   (int(yr),int(mo),int(dy), int(hr),int(mi),float(sec),float(st_lat_DMS[0]), 
                    str(st_lat_DMS[1]), float(st_lat_DMS[2]),float(st_lon_DMS[0]), str(st_lon_DMS[1]), 
                    float(st_lon_DMS[2]),float(depth), float(mag))); 

In the case where minutes is less than 10, the columns are not correct in the resulting phase file. Compare the two events below. The first event has a start time of 2021-10-01 01:33:06.05, but the 0 in front of the six is missing, and thus this line has one less column than the next event (start time 2021-10-01 01:34:43.42). This completely screws up any parser that reads by position.

202010 1 1336.0544 18.20115W14.08 5.000.00
SAC  XP  HHE     202010 1 133 0.00        7.36ES 2
SAC  XP  HHZ IP 0202010 1 133 5.74        0.00   0
BANN XP  HHE     202010 1 133 0.00        8.24ES 0
BANN XP  HHZ IP 0202010 1 133 6.08        0.00   0
FOX  XP  HHE     202010 1 133 0.00        9.07ES 0
FOX  XP  HHZ IP 0202010 1 133 6.51        0.00   0
MFRD XP  HHE     202010 1 133 0.00       11.54ES 0
MFRD XP  HHZ IP 0202010 1 133 8.01        0.00   0
SUNB XP  HHE     202010 1 133 0.00       12.89ES 1
SUNB XP  HHZ IP 2202010 1 13310.02        0.00   0
DDR  XP  HHE     202010 1 133 0.00       16.56ES 0
DDR  XP  HHZ IP 0202010 1 13310.99        0.00   0
WARM XP  HHE     202010 1 133 0.00       23.28ES 2
WARM XP  HHZ IP 0202010 1 13314.83        0.00   0
HLID US  HHE     202010 1 133 0.00       30.15ES 2
HLID US  HHZ IP 1202010 1 13319.06        0.00   0
                                                                  200014
202010 1 13443.4244 18.20115W14.08 5.000.00
EPIC XP  HHE     202010 1 134 0.00       44.92ES 0
EPIC XP  HHZ IP 0202010 1 13443.20        0.00   0
BANN XP  HHE     202010 1 134 0.00       45.66ES 0
BANN XP  HHZ IP 0202010 1 13443.52        0.00   0
MFRD XP  HHE     202010 1 134 0.00       46.15ES 0
MFRD XP  HHZ IP 0202010 1 13443.79        0.00   0
SAC  XP  HHE     202010 1 134 0.00       47.30ES 0
SAC  XP  HHZ IP 0202010 1 13445.22        0.00   0
FOX  XP  HHE     202010 1 134 0.00       48.58ES 0
FOX  XP  HHZ IP 0202010 1 13445.31        0.00   0
IRON XP  HHE     202010 1 134 0.00       48.74ES 0
IRON XP  HHZ IP 2202010 1 13445.49        0.00   0
DDR  XP  HHE     202010 1 134 0.00       52.64ES 0
DDR  XP  HHZ IP 0202010 1 13447.63        0.00   0
TCK  XP  HHE     202010 1 134 0.00       53.66ES 0
TCK  XP  HHZ IP 1202010 1 13448.22        0.00   0
WARM XP  HHE     202010 1 134 0.00       56.86ES 1
WARM XP  HHZ IP 1202010 1 13450.09        0.00   0
                                                                  200015

It would be nice to modify the code to put zeros in front of the single-digit numbers that should be two characters wide.

Y2000_writer.write("%4d%02d%02d%02d%02d%04.2f%2.0f%1s%4.2f%3.0f%1s%4.2f%5.2f%3.2f\n"%
                   (int(yr),int(mo),int(dy), int(hr),int(mi),float(sec),float(st_lat_DMS[0]), 
                    str(st_lat_DMS[1]), float(st_lat_DMS[2]),float(st_lon_DMS[0]), str(st_lon_DMS[1]), 
                    float(st_lon_DMS[2]),float(depth), float(mag))); 

I guess it is most important for the seconds (%04.2f).
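
For illustration, here is a quick plain-Python demonstration (not EQTransformer code, and the values are made up) of why the zero flag matters in a fixed-column format:

# Plain-Python illustration of the padding difference; values are hypothetical.
mi, sec = 3, 6.05
print("%2d%5.2f" % (mi, sec))    # ' 3 6.05' -> space padding shifts the fixed columns
print("%02d%05.2f" % (mi, sec))  # '0306.05' -> zero padding keeps the columns aligned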

small display message bug in hdf5_maker.py

In hdf5_maker.py, line 75:

save_dir = os.path.join(os.getcwd(), str(mseed_dir)+'_processed_hdfs')
if os.path.isdir(save_dir):
    print(f' *** " {mseed_dir} " directory already exists!')
    inp = input(" * --> Do you want to creat a new empty folder? Type (Yes or y) ")
    if inp.lower() == "yes" or inp.lower() == "y":
        shutil.rmtree(save_dir)
os.makedirs(save_dir)

The variable 'mseed_dir' in the print call should be 'save_dir'; otherwise the message is misleading, since it names the input directory rather than the output directory that already exists.

By the way, I tested your method on some seismic data we are studying, and the results are very nice. Thanks for sharing!

Prediction.py - Tensorflow version control causing Issues on GPUs

I tried switching to GPU in prediction using the optional variables gpuid and gpu_limit; a version compatibility issue was returned because tf.Session is no longer supported in tensorflow=2.3.0.

Even after updating the source code of prediction.py for tensorflow=2.3.0, I still had issues running on GPUs: data was transferred to the GPUs, but no processing took place.

Sorry for the issue. Happy to give more error logs etc if needed :)

Downloading MSEED data

Hello,

I have just installed EQTransformer and tested the example in the tutorial, which worked as expected. However, when I try to download MSEED data for a different time period, it doesn't work. For example, after

from EQTransformer.utils.downloader import downloadMseeds

if I try

downloadMseeds(client_list=["SCEDC", "IRIS"], stations_json='station_list.json', output_dir="downloads_mseeds", min_lat=35.50, max_lat=35.60, min_lon=-117.80, max_lon=-117.40, start_time="2019-03-01 00:00:00.00", end_time="2019-03-01 05:00:00.00", chunk_size=1, channel_list=[], n_processor=2)

the output is

####### There are 1 stations in the list. #######
[2021-03-22 19:53:54,130] - obspy.clients.fdsn.mass_downloader - INFO: Initializing FDSN client(s) for SCEDC, IRIS.
[2021-03-22 19:53:54,139] - obspy.clients.fdsn.mass_downloader - INFO: Successfully initialized 2 client(s): SCEDC, IRIS.
======= Working on B921 station.

and no MSEED data is downloaded. Is there a step in between that I am missing?

All the best,
Catalina

When the detection threshold is not ideally set

Hi,

I suppose this is what happens when you don't set the detection threshold properly: the S-wave arrival was not picked, although it is pretty clear on the horizontal components.

I find the figure quite illustrative for those who are learning to use EQTransformer.

2021-02-10 01:44:18 157500

Cheers!

How to run my own data?

In the example in your tutorial, the required input data are the XML files and the original mseed files.
I want to ask whether EQT can be used in the following two situations:

  1. Situation 1:
    I have the XML files and the mseed files, but the instrument response has already been removed from the mseed files and the data have been filtered (1-45 Hz).
    In this situation I can of course apply the workflow from the tutorial example, but the input files are not exactly the same.

  2. Situation 2:
    I only have mseed files with the instrument response removed and the data filtered (1-45 Hz). I have complete information about the seismic stations, but not in XML files (see the sketch below).

Please forgive my poor English.
Thank you!
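
For situation 2, here is a minimal sketch (all station names and values below are hypothetical) of building the stations_json file by hand when no StationXML is available; the key layout mirrors the station_list.json produced by EQTransformer's downloader utilities:

import json

# Hypothetical station entries; coords are [latitude, longitude, elevation in meters].
stations = {
    "STA1": {"network": "XX",
             "channels": ["HHE", "HHN", "HHZ"],
             "coords": [35.55, -117.60, 700.0]},
    "STA2": {"network": "XX",
             "channels": ["HHZ"],
             "coords": [35.58, -117.45, 650.0]},
}

with open("station_list.json", "w") as f:
    json.dump(stations, f)

Whether the pre-filtered, response-removed waveforms of situation 1 behave exactly like raw data is best confirmed by the author, but the mseed_predictor calls shown elsewhere in these issues only take the mseed directory, the model, and this JSON file.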

TF version requirements

Hi Moustafa,

I am having issues installing with pip and from source because TF version 2.0.0 is not available for my Mac:

Reading https://pypi.org/simple/tensorflow/
No local packages or working download links found for tensorflow==2.0.0
error: Could not find suitable distribution for Requirement.parse('tensorflow==2.0.0')

Here are my TF installations:

tensorboard 2.5.0 pypi_0 pypi
tensorboard-data-server 0.6.0 pypi_0 pypi
tensorboard-plugin-wit 1.8.0 pypi_0 pypi
tensorflow 2.4.1 pypi_0 pypi
tensorflow-estimator 2.4.0 pypi_0 pypi

I use python 3.8.5

I read online that TF versions older than 2.2 require an older version of Python (3.5, 3.6, or 3.7).
Would it be okay to update the requirements to a later version of TF?
Cheers,
m.

False negatives

Hi,

do you see any false negatives on these stations? It is a mining explosion (ML 1.1) with data available for 6 stations, but the event was detected on only one. Most of the traces look noisy, but I think I do "see something" on some of them (e.g. on station WIMM), although I am not an expert at detecting events...

Cheers!

ge falks
ge flt1
sx muhb
sx neub
sx wimm
th bonn

trainer h5 used in predictor() returning error: Cannot create group in read-only mode.

Hello Mustafa,

I have pretty well finished up the R package that takes by-hand picked phase-arrival data sets and converts them into h5 files to load into eqtransformer.core.predictor. When I do a side-by-side comparison of my output h5 file, https://gitlab.com/Bryanrt-geophys/sac2eqtransformr/-/tree/master/sample_data/Training_h5, with the 100samples.hdf5 in your provided sample data set, I don't see any structural differences aside from my objects being 1x6000 rather than 3x6000. This is because the sensors only recorded single-component data over the time period I am building my sample set from. Is this an issue? If so, is there an order in which the components should be listed in the rows of the objects (i.e. HHZ row 1, HHE row 2, HHN row 3)? When I try to use my retraining model, I receive the error: Cannot create group in read-only mode. I am assuming this is an issue with how I built/closed my hdf5 file. Would you have any advice on how to go about fixing this?

The package still has some minor bugs to fix before it's fully functional, but here is the function that takes the tidied data and creates the h5 file: https://gitlab.com/Bryanrt-geophys/sac2eqtransformr/-/blob/master/R/make_trainer_hdf5.R. I know your work is in Python and this is R, but I assume the way the h5 file is created should look very similar.
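
As a side note for anyone doing this kind of comparison, here is a minimal h5py sketch (file names are placeholders) that prints every group and dataset with its shape and dtype, which makes structural differences between a hand-built file and 100samples.hdf5 easier to spot:

import h5py

def show_structure(path):
    # Print each group/dataset name together with its shape and dtype.
    with h5py.File(path, "r") as f:
        f.visititems(lambda name, obj: print(name,
                                             getattr(obj, "shape", ""),
                                             getattr(obj, "dtype", "")))

show_structure("100samples.hdf5")       # sample file shipped with EQTransformer
show_structure("my_training_set.hdf5")  # placeholder for the hand-built file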

_decimalDegrees2DMS

I would suggest renaming _decimalDegrees2DMS to _decimalDegrees2DM. You are not actually computing seconds; instead you have minutes with a decimal part. The function name is therefore misleading, and when I was trying to figure out how to interpret the Y2000.phs event times this was ambiguous. There are many codes that use either DMS or DM with decimal minutes, so I would stick to a name that accurately describes what the function actually does.

This is defined on line 247 of associator.py.
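
For clarity, here is a small plain-Python illustration (not the EQTransformer implementation) of what a degrees-plus-decimal-minutes conversion does, which is what the current function computes:

def decimal_degrees_to_dm(value):
    # Split a decimal-degree value into whole degrees and decimal minutes (no seconds).
    degrees = int(abs(value))
    minutes = (abs(value) - degrees) * 60.0
    return degrees, minutes

print(decimal_degrees_to_dm(44.3035))  # roughly (44, 18.21): 44 degrees and about 18.21 decimal minutes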

ZeroDivisionError raised by EQTransformer.core.mseed_predictor

Below is a minimum working example of EQTransformer.core.mseed_predictor raising a ZeroDivisionError. The traceback I receive is included below this short script, as is a summary of my working environment.

import EQTransformer
import json
import obspy
import obspy.clients.fdsn
import os

network_code = "HV"
station_code = "KOHD"
location_code = "*"
channel_code = "EH?"
starttime = obspy.UTCDateTime("2020-12-20T00:00:00")
endtime = starttime + 86400

client = obspy.clients.fdsn.Client()
inventory = client.get_stations(
    network=network_code,
    station=station_code,
    location=location_code,
    channel=channel_code,
    endafter=starttime, 
    startbefore=endtime,
    level="channel"
)

stations = dict()
for network in inventory.networks:
    for station in network.stations:
        stations[station.code] = dict(
            network=network.code,
            channels=sorted([channel.code for channel in station.channels]),
            coords=(station.latitude, station.longitude, station.elevation)
        )
with open("station_list.json", "w") as outfile:
    json.dump(stations, outfile)
    
stream = client.get_waveforms(
    network_code, 
    station_code,
    location_code,
    channel_code,
    starttime,
    endtime
)

for trace in stream:
    stats = trace.stats
    network_code = stats.network
    station_code = stats.station
    location_code = stats.location
    channel_code = stats.channel
    starttime = stats.starttime.strftime('%Y%m%dT%H%M%SZ')
    endtime = stats.endtime.strftime('%Y%m%dT%H%M%SZ')
    ddir = os.path.join(network_code, station_code)
    os.makedirs(ddir, exist_ok=True)
    filename = ".".join(
        (
            network_code, 
            station_code,
            location_code,
            "__".join(
                (
                    channel_code,
                    starttime,
                    endtime
                )
            ),
            "mseed"
        )
    )
    path = os.path.join(ddir, filename)
    trace.write(path)
    
EQTransformer.core.mseed_predictor(
    input_dir="HV", 
    input_model="EqT_model.h5",
    stations_json="station_list.json",
    output_dir="detections", 
    detection_threshold=0.3, 
    P_threshold=0.1, 
    S_threshold=0.1, 
    number_of_plots=0,
    overlap=0.3, 
    batch_size=500
)
---------------------------------------------------------------------------
ZeroDivisionError                         Traceback (most recent call last)
<ipython-input-1-285372e8d73c> in <module>()
     81     number_of_plots=0,
     82     overlap=0.3,
---> 83     batch_size=500
     84 )

/home/malcolmw/local/anaconda3/envs/eqt/lib/python3.7/site-packages/EQTransformer/core/mseed_predictor.py in mseed_predictor(input_dir, input_model, stations_json, output_dir, detection_threshold, P_threshold, S_threshold, number_of_plots, plot_mode, loss_weights, loss_types, normalization_mode, batch_size, overlap, gpuid, gpu_limit)
    272             pred_generator = PreLoadGeneratorTest(meta["trace_start_time"], data_set, **params_pred)
    273 
--> 274             predD, predP, predS = model.predict_generator(pred_generator)
    275 
    276             detection_memory = []

/home/malcolmw/local/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/legacy/interfaces.py in wrapper(*args, **kwargs)
     89                 warnings.warn('Update your `' + object_name + '` call to the ' +
     90                               'Keras 2 API: ' + signature, stacklevel=2)
---> 91             return func(*args, **kwargs)
     92         wrapper._original_function = func
     93         return wrapper

/home/malcolmw/local/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/engine/training.py in predict_generator(self, generator, steps, callbacks, max_queue_size, workers, use_multiprocessing, verbose)
   1844             workers=workers,
   1845             use_multiprocessing=use_multiprocessing,
-> 1846             verbose=verbose)
   1847 
   1848 

/home/malcolmw/local/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/engine/training_generator.py in predict_generator(model, generator, steps, callbacks, max_queue_size, workers, use_multiprocessing, verbose)
    445     if steps is None:
    446         if use_sequence_api:
--> 447             steps = len(generator)
    448         else:
    449             raise ValueError('`steps=None` is only valid for a generator'

/home/malcolmw/local/anaconda3/envs/eqt/lib/python3.7/site-packages/EQTransformer/core/mseed_predictor.py in __len__(self)
    470     def __len__(self):
    471         'Denotes the number of batches per epoch'
--> 472         return int(np.floor(len(self.list_IDs) / self.batch_size))
    473 
    474     def __getitem__(self, index):

ZeroDivisionError: division by zero

I am running Ubuntu 20.04 LTS with the following conda environment:

# Name                    Version                   Build  Channel
_libgcc_mutex             0.1                 conda_forge    conda-forge
_openmp_mutex             4.5                       1_gnu    conda-forge
absl-py                   0.11.0           py37h89c1867_0    conda-forge
argon2-cffi               20.1.0           py37h4abf009_2    conda-forge
astor                     0.8.1              pyh9f0ad1d_0    conda-forge
async_generator           1.10                       py_0    conda-forge
attrs                     20.3.0             pyhd3deb0d_0    conda-forge
backports                 1.0                        py_2    conda-forge
backports.functools_lru_cache 1.6.1                      py_0    conda-forge
binutils_impl_linux-64    2.35.1               h193b22a_1    conda-forge
binutils_linux-64         2.35                hc3fd857_29    conda-forge
bleach                    3.2.1              pyh9f0ad1d_0    conda-forge
brotlipy                  0.7.0           py37hb5d75c8_1001    conda-forge
c-ares                    1.17.1               h36c2ea0_0    conda-forge
ca-certificates           2020.12.5            ha878542_0    conda-forge
cached-property           1.5.1                      py_0    conda-forge
certifi                   2020.12.5        py37h89c1867_0    conda-forge
cffi                      1.14.4           py37hc58025e_1    conda-forge
chardet                   4.0.0            py37h89c1867_0    conda-forge
cryptography              3.3.1            py37h7f0c10b_0    conda-forge
cycler                    0.10.0                     py_2    conda-forge
dbus                      1.13.6               hfdff14a_1    conda-forge
decorator                 4.4.2                      py_0    conda-forge
defusedxml                0.6.0                      py_0    conda-forge
entrypoints               0.3             pyhd8ed1ab_1003    conda-forge
eqtransformer             0.1.57                   py37_0    smousavi05
expat                     2.2.9                he1b5a44_2    conda-forge
fontconfig                2.13.1            h7e3eb15_1002    conda-forge
freetype                  2.10.4               h7ca028e_0    conda-forge
future                    0.18.2           py37h89c1867_2    conda-forge
gast                      0.4.0              pyh9f0ad1d_0    conda-forge
gcc_impl_linux-64         7.5.0               hda68d29_13    conda-forge
gcc_linux-64              7.5.0               he2a3fca_29    conda-forge
gettext                   0.19.8.1          h0b5b191_1005    conda-forge
glib                      2.66.4               hcd2ae1e_1    conda-forge
google-pasta              0.2.0              pyh8c360ce_0    conda-forge
grpcio                    1.34.0           py37hb27c1af_0    conda-forge
gst-plugins-base          1.14.5               h0935bb2_2    conda-forge
gstreamer                 1.18.2               ha23517c_0    conda-forge
gxx_impl_linux-64         7.5.0               h64c220c_13    conda-forge
gxx_linux-64              7.5.0               h547f3ba_29    conda-forge
h5py                      2.10.0          nompi_py37hf7afa78_105    conda-forge
hdf5                      1.10.6          nompi_h7c3c948_1111    conda-forge
icu                       67.1                 he1b5a44_0    conda-forge
idna                      2.10               pyh9f0ad1d_0    conda-forge
importlib-metadata        3.3.0            py37h89c1867_2    conda-forge
importlib_metadata        3.3.0                hd8ed1ab_2    conda-forge
iniconfig                 1.1.1              pyh9f0ad1d_0    conda-forge
ipykernel                 5.4.2            py37h888b3d9_0    conda-forge
ipython                   5.8.0                    py37_1    conda-forge
ipython_genutils          0.2.0                      py_1    conda-forge
ipywidgets                7.5.1              pyh9f0ad1d_1    conda-forge
jeepney                   0.6.0              pyhd8ed1ab_0    conda-forge
jinja2                    2.11.2             pyh9f0ad1d_0    conda-forge
jpeg                      9d                   h36c2ea0_0    conda-forge
jsonschema                3.2.0                      py_2    conda-forge
jupyter                   1.0.0                      py_2    conda-forge
jupyter_client            6.1.7                      py_0    conda-forge
jupyter_console           5.2.0                    py37_1    conda-forge
jupyter_core              4.7.0            py37h89c1867_0    conda-forge
jupyterlab_pygments       0.1.2              pyh9f0ad1d_0    conda-forge
keras                     2.3.1                    py37_0    conda-forge
keras-applications        1.0.8                      py_1    conda-forge
keras-preprocessing       1.1.0                      py_0    conda-forge
kernel-headers_linux-64   2.6.32              h77966d4_13    conda-forge
keyring                   21.5.0           py37h89c1867_0    conda-forge
kiwisolver                1.3.1            py37hc928c03_0    conda-forge
krb5                      1.17.2               h926e7f8_0    conda-forge
lcms2                     2.11                 hcbb858e_1    conda-forge
ld_impl_linux-64          2.35.1               hea4e1c9_1    conda-forge
libblas                   3.9.0                3_openblas    conda-forge
libcblas                  3.9.0                3_openblas    conda-forge
libclang                  11.0.0          default_ha5c780c_2    conda-forge
libcurl                   7.71.1               hcdd3856_8    conda-forge
libedit                   3.1.20191231         he28a2e2_2    conda-forge
libev                     4.33                 h516909a_1    conda-forge
libevent                  2.1.10               hcdb4288_3    conda-forge
libffi                    3.3                  h58526e2_2    conda-forge
libgcc-ng                 9.3.0               h5dbcf3e_17    conda-forge
libgfortran-ng            7.5.0               hae1eefd_17    conda-forge
libgfortran4              7.5.0               hae1eefd_17    conda-forge
libglib                   2.66.4               h164308a_1    conda-forge
libgomp                   9.3.0               h5dbcf3e_17    conda-forge
libgpuarray               0.7.6             h14c3975_1003    conda-forge
libiconv                  1.16                 h516909a_0    conda-forge
liblapack                 3.9.0                3_openblas    conda-forge
libllvm11                 11.0.0               he513fc3_0    conda-forge
libnghttp2                1.41.0               h8cfc5f6_2    conda-forge
libopenblas               0.3.12          pthreads_hb3c22a3_1    conda-forge
libpng                    1.6.37               h21135ba_2    conda-forge
libpq                     12.3                 h255efa7_3    conda-forge
libprotobuf               3.14.0               h780b84a_0    conda-forge
libsodium                 1.0.18               h36c2ea0_1    conda-forge
libssh2                   1.9.0                hab1572f_5    conda-forge
libstdcxx-ng              9.3.0               h2ae2ef3_17    conda-forge
libtiff                   4.2.0                hdc55705_0    conda-forge
libuuid                   2.32.1            h7f98852_1000    conda-forge
libwebp-base              1.1.0                h36c2ea0_3    conda-forge
libxcb                    1.13              h14c3975_1002    conda-forge
libxkbcommon              1.0.3                he3ba5ed_0    conda-forge
libxml2                   2.9.10               h68273f3_2    conda-forge
libxslt                   1.1.33               hf705e74_1    conda-forge
lxml                      4.6.2            py37h77fd288_0    conda-forge
lz4-c                     1.9.2                he1b5a44_3    conda-forge
mako                      1.1.3              pyh9f0ad1d_0    conda-forge
markdown                  3.3.3              pyh9f0ad1d_0    conda-forge
markupsafe                1.1.1            py37hb5d75c8_2    conda-forge
matplotlib                3.3.3            py37h89c1867_0    conda-forge
matplotlib-base           3.3.3            py37h4f6019d_0    conda-forge
mistune                   0.8.4           py37h4abf009_1002    conda-forge
more-itertools            8.6.0              pyhd8ed1ab_0    conda-forge
mysql-common              8.0.22               ha770c72_1    conda-forge
mysql-libs                8.0.22               h1fd7589_1    conda-forge
nbclient                  0.5.1                      py_0    conda-forge
nbconvert                 6.0.7            py37h89c1867_3    conda-forge
nbformat                  5.0.8                      py_0    conda-forge
ncurses                   6.2                  h58526e2_4    conda-forge
nest-asyncio              1.4.3              pyhd8ed1ab_0    conda-forge
notebook                  6.1.5            py37h89c1867_0    conda-forge
nspr                      4.29                 he1b5a44_1    conda-forge
nss                       3.60                 hb5efdd6_0    conda-forge
numpy                     1.19.4           py37haa41c4c_2    conda-forge
obspy                     1.2.2            py37h03ebfcd_0    conda-forge
olefile                   0.46               pyh9f0ad1d_1    conda-forge
openssl                   1.1.1i               h7f98852_0    conda-forge
packaging                 20.8               pyhd3deb0d_0    conda-forge
pandas                    1.1.5            py37hdc94413_0    conda-forge
pandoc                    2.11.3.1             h7f98852_0    conda-forge
pandocfilters             1.4.2                      py_1    conda-forge
pcre                      8.44                 he1b5a44_0    conda-forge
pexpect                   4.8.0              pyh9f0ad1d_2    conda-forge
pickleshare               0.7.5                   py_1003    conda-forge
pillow                    8.0.1            py37h63a5d19_0    conda-forge
pip                       20.3.3             pyhd8ed1ab_0    conda-forge
pkginfo                   1.6.1              pyh9f0ad1d_0    conda-forge
pluggy                    0.13.1           py37he5f6b98_3    conda-forge
prometheus_client         0.9.0              pyhd3deb0d_0    conda-forge
prompt_toolkit            1.0.15                     py_1    conda-forge
protobuf                  3.14.0           py37hcd2ae1e_0    conda-forge
pthread-stubs             0.4               h36c2ea0_1001    conda-forge
ptyprocess                0.6.0                   py_1001    conda-forge
py                        1.10.0             pyhd3deb0d_0    conda-forge
pycparser                 2.20               pyh9f0ad1d_2    conda-forge
pygments                  2.7.3              pyhd8ed1ab_0    conda-forge
pygpu                     0.7.6           py37h161383b_1002    conda-forge
pyopenssl                 20.0.1             pyhd8ed1ab_0    conda-forge
pyparsing                 2.4.7              pyh9f0ad1d_0    conda-forge
pyqt                      5.12.3           py37h89c1867_6    conda-forge
pyqt-impl                 5.12.3           py37he336c9b_6    conda-forge
pyqt5-sip                 4.19.18          py37hcd2ae1e_6    conda-forge
pyqtchart                 5.12             py37he336c9b_6    conda-forge
pyqtwebengine             5.12.1           py37he336c9b_6    conda-forge
pyrsistent                0.17.3           py37h4abf009_1    conda-forge
pysocks                   1.7.1            py37he5f6b98_2    conda-forge
pytest                    6.2.1            py37h89c1867_0    conda-forge
python                    3.7.9           hffdb5ce_0_cpython    conda-forge
python-dateutil           2.8.1                      py_0    conda-forge
python_abi                3.7                     1_cp37m    conda-forge
pytz                      2020.4             pyhd8ed1ab_0    conda-forge
pyyaml                    5.3.1            py37hb5d75c8_1    conda-forge
pyzmq                     20.0.0           py37h5a562af_1    conda-forge
qt                        5.12.9               h763d07f_1    conda-forge
qtconsole                 5.0.1              pyhd8ed1ab_0    conda-forge
qtpy                      1.9.0                      py_0    conda-forge
readline                  8.0                  he28a2e2_2    conda-forge
requests                  2.25.1             pyhd3deb0d_0    conda-forge
scipy                     1.4.1            py37ha3d9a3c_3    conda-forge
secretstorage             3.3.0            py37h89c1867_0    conda-forge
send2trash                1.5.0                      py_0    conda-forge
setuptools                49.6.0           py37he5f6b98_2    conda-forge
simplegeneric             0.8.1                      py_1    conda-forge
six                       1.15.0             pyh9f0ad1d_0    conda-forge
sqlalchemy                1.3.22           py37h5e8e339_0    conda-forge
sqlite                    3.34.0               h74cdb3f_0    conda-forge
sysroot_linux-64          2.12                h77966d4_13    conda-forge
tensorboard               1.14.0                   py37_0    conda-forge
tensorflow                1.14.0               h4531e10_0    conda-forge
tensorflow-base           1.14.0           py37h4531e10_0    conda-forge
tensorflow-estimator      1.14.0           py37h5ca1d4c_0    conda-forge
termcolor                 1.1.0                      py_2    conda-forge
terminado                 0.9.1            py37h89c1867_1    conda-forge
testpath                  0.4.4                      py_0    conda-forge
theano                    1.0.5            py37h745909e_1    conda-forge
tk                        8.6.10               h21135ba_1    conda-forge
toml                      0.10.2             pyhd8ed1ab_0    conda-forge
tornado                   6.1              py37h4abf009_0    conda-forge
tqdm                      4.54.1             pyhd8ed1ab_1    conda-forge
traitlets                 5.0.5                      py_0    conda-forge
typing_extensions         3.7.4.3                    py_0    conda-forge
urllib3                   1.26.2             pyhd8ed1ab_0    conda-forge
wcwidth                   0.2.5              pyh9f0ad1d_2    conda-forge
webencodings              0.5.1                      py_1    conda-forge
werkzeug                  1.0.1              pyh9f0ad1d_0    conda-forge
wheel                     0.36.2             pyhd3deb0d_0    conda-forge
widgetsnbextension        3.5.1            py37h89c1867_4    conda-forge
wrapt                     1.12.1           py37h4abf009_2    conda-forge
xorg-libxau               1.0.9                h14c3975_0    conda-forge
xorg-libxdmcp             1.1.3                h516909a_0    conda-forge
xz                        5.2.5                h516909a_1    conda-forge
yaml                      0.2.5                h516909a_0    conda-forge
zeromq                    4.3.3                h58526e2_3    conda-forge
zipp                      3.4.0                      py_0    conda-forge
zlib                      1.2.11            h516909a_1010    conda-forge
zstd                      1.4.5                h6597ccf_2    conda-forge

Got Error in mseed_predictor (list index out of range)

Windows 10
Python 3.7.3
Anaconda eqt env
Keras == 2.4.2
TensorFlow == 2.3.0

I got an error when running mseed_predictor; maybe it is because of the different folder path separator on Windows. My mseed data is in F:\EQTransformer-master\examples\downloads_mseeds\ME03, the same as in the example.
I'm using Windows 10 with Anaconda and the eqt environment.
The code:

from EQTransformer.core.mseed_predictor import mseed_predictor
mseed_predictor(input_dir='downloads_mseeds',   
         input_model='../ModelsAndSampleData/EqT_model.h5',
         stations_json='station_list.json',
         output_dir='detections2',
         loss_weights=[0.02, 0.40, 0.58],          
         detection_threshold=0.3,                
         P_threshold=0.1,
         S_threshold=0.1, 
         number_of_plots=10,
         plot_mode='time_frequency',
         normalization_mode='std',
         batch_size=500,
         overlap=0.3,
         gpuid=None,
         gpu_limit=None) 

Here's the error message:

 *** Finished the prediction in: 0 hours and 2 minutes and 43.16 seconds.
 *** Detected: 0 events.
 *** Wrote the results into --> " F:\EQTransformer-master\examples\detections2\ME02_outputs "
========= Started working on ME03, 3 out of 44 ...
20131001T000000Z__20131002T000000Z.mseed

---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
<ipython-input-1-7986e25b3c9e> in <module>
     15          overlap=0.3,
     16          gpuid=None,
---> 17          gpu_limit=None) 

C:\ProgramData\Anaconda3\lib\site-packages\EQTransformer\core\mseed_predictor.py in mseed_predictor(input_dir, input_model, stations_json, output_dir, detection_threshold, P_threshold, S_threshold, number_of_plots, plot_mode, loss_weights, loss_types, normalization_mode, batch_size, overlap, gpuid, gpu_limit)
    283                     post_write = len(detection_memory)
    284                     if plt_n < args['number_of_plots'] and post_write > pre_write:
--> 285                         _plotter_prediction(data_set[meta["trace_start_time"][ix]], args, save_figs, predD[ix][:, 0], predP[ix][:, 0], predS[ix][:, 0], meta["trace_start_time"][ix], matches)
    286                         plt_n += 1
    287 

C:\ProgramData\Anaconda3\lib\site-packages\EQTransformer\core\mseed_predictor.py in _plotter_prediction(data, args, save_figs, yh1, yh2, yh3, evi, matches)
   1050         plt.xlim(0, 6000)
   1051         x = np.arange(6000)
-> 1052         plt.title(save_figs.split('/')[-2].split('_')[0]+':'+str(evi))
   1053 
   1054         ax.set_xticks([])

IndexError: list index out of range
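
The failing line builds the station name with save_figs.split('/'); on Windows the path is most likely joined with backslashes, so the split returns a single element and the [-2] index goes out of range. A hedged sketch of a platform-independent version of the same lookup (the path below is just an example, not code from the package):

import os

# Example Windows-style path; in the package this would come from the output arguments.
save_figs = r"F:\EQTransformer-master\examples\detections2\ME03_outputs\figures"

# os.path handles both '/' and '\\' separators, unlike a hard-coded split('/').
station = os.path.basename(os.path.dirname(os.path.normpath(save_figs))).split('_')[0]
print(station)  # 'ME03'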

plot_helicorder if mseed contains gap

Dear @smousavi05

If our data (mseed file) has a time gap, the plot_helicorder function does not show the whole stream, since it plots only the first trace of the stream:
[Line 68 of file plot.py]
st[0].plot(type='dayplot', color=['k'],interval=60, events=event_list, outfile=input_mseed.split("\\")[-1].split('.mseed')[0]+'.png')
I would suggest using "st" instead of "st[0]": each time we call plot_helicorder we pass a single mseed file as the argument, so the final plot would then show the whole stream, including any gaps (see the sketch below).
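
A minimal sketch of the suggested change (the file path is hypothetical and the events argument is omitted); ObsPy's plot(type='dayplot') also accepts a whole Stream, which should draw the data on both sides of a gap:

import os
import obspy

input_mseed = "downloads_mseeds/CA06/CA06_example.mseed"  # hypothetical file with gaps
st = obspy.read(input_mseed)

# Plot the whole Stream instead of only st[0], as suggested above.
st.plot(type='dayplot', color=['k'], interval=60,
        outfile=os.path.basename(input_mseed).replace('.mseed', '.png'))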

Regards

Model Retraining Data Sampling Rate

Hello Mustafa,

I have written code that takes the downloaded mseed files from EQTransformer, converts them to SAC, splices them into 60-second intervals with a consistent 7-second buffer before a documented P-wave, pulls the attributes out of legacy data from our by-hand system (SEISMOS), and writes the hdf5 and csv files for retraining. However, while looking through your paper on the STEAD format, it looks like you resampled every station to 100 Hz for exactly 6000 samples per _EV object. Can EQTransformer work with data at other sampling rates (e.g. 12000 samples per window), or do I need to make this conversion?
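
Whether EQTransformer strictly requires 100 Hz is for the author to confirm, but as a rough ObsPy sketch of the conversion being asked about (the file name is a placeholder for one 60 s training window):

import obspy

st = obspy.read("spliced_60s_window.mseed")  # placeholder for one spliced 60 s window
for tr in st:
    if tr.stats.sampling_rate != 100.0:
        tr.resample(100.0)                   # or tr.interpolate(100.0)
print(st[0].stats.npts)                      # a full 60 s trace should now be about 6000 samples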

Also, the preprocessor function in EQTransformer also appears to split the mseed data into 60-second intervals, but it does not use the training model the user provides to the trainer function. Could this introduce an issue with the index values used to assign the phase labels, since they would start from different points: the point at which I cut my SAC files and the point from which the preprocessor cuts the mseed data? If so, should I skip the preprocessor step in the EQTransformer workflow?

I believe SEISMOS is a pretty archaic system that is not used much anymore, but I am planning on publishing my code as an R package for converting SEISMOS data sets into EQTransformer training models. Hopefully, this could make the transition between systems more streamlined for others in the future. If you have any comments on the matter, I would love to have your input.

Best,
Bryan

The picker_loss decreases during training, but the F1 index does not increase

Thank you very much for your work; it has been instructive for me.
I have a problem here. I want to fully reproduce your work, so I use the complete STanford EArthquake Dataset (STEAD) and the hyperparameters from the example to train the model in the repo.
However, during training I found that while the overall loss is decreasing (including picker_P_loss and picker_S_loss), the F1 scores (picker_P_f1 and picker_S_f1) keep declining; even as the loss decreases, the F1 scores also decrease.
By the way, detector_f1 does increase as detector_loss decreases.
I haven't changed any code or parameters. What could be the problem?

Such as:
loss: 0.0989 - detector_loss: 0.2477 - picker_P_loss: 0.0592 - picker_S_loss: 0.0643 -
detector_f1: 0.7590 - picker_P_f1: 3.2014e-04 - picker_S_f1: 3.5400e-04

Unable to download data from IRIS

Dear Developers,
I wanted to test the examples with my stations that are shared at IRIS; however, I am not able to download the data.
Here are the parts of the code:

#First Part
from EQTransformer.utils.downloader import makeStationList
from EQTransformer.utils.downloader import downloadMseeds
import os

result First Part-->OK

#Second Part
json_basepath = os.path.join(os.getcwd(),"json/station_list.json")
print(json_basepath)

makeStationList(json_path=json_basepath,
                client_list=["IRIS"], min_lat= -23.41, max_lat=-9.29, min_lon=-69.81, 
                max_lon=-57.24, start_time="2021-02-01 00:00:00.00",
                end_time="2021-02-01 03:00:00.00", channel_list=["BH[ZNE]", "SH[Z]"],
                filter_network=["1U", "C", "C1", "GE","IU", "SY","XX"], filter_station=[])

result SecondPart --> OK

#ThirdPart
downloadMseeds(client_list=["IRIS"], stations_json=json_basepath, 
               output_dir="/home/tonino/EQTransformer/downloads_mseeds", 
               start_time="2021-02-01 00:00:00.00",
               end_time="2021-02-01 03:00:00.00", min_lat=-23.41, max_lat=-9.29,
               min_lon=-69.81, max_lon=-57.24, chunk_size=1,
               channel_list=["BH[ZNE]", "SH[Z]", "EH[ZNE]"], n_processor=1)

resultsThirdPart --> half OK.

[2021-02-24 21:23:10,483] - obspy.clients.fdsn.mass_downloader - INFO: Initializing FDSN client(s) for IRIS.
[2021-02-24 21:23:10,491] - obspy.clients.fdsn.mass_downloader - INFO: Successfully initialized 1 client(s): IRIS.

####### There are 3 stations in the list. #######
======= Working on REDDE station.
======= Working on SOEP station.
======= Working on LPAZ station.

It seems OK; however, no data is stored in the downloads_mseeds folder, which remains empty. I tested with a simple ObsPy script and the data is available.
Thank you in advance.
Tonino

problem with event association

Dear @smousavi05

I have a problem with the "moving_window" parameter in event association. I started by setting it to 10 s, and I found that some picks arriving within the first 10 seconds of a minute are not associated with an event. For example, the following rows from Y2000.pha describe an event:

2018 811144614.3629 6.36 51 7.56 5.000.00
KOLO BP HHE 2018 8111446 0.00 12.87ES 0
KOLO BP HHZ IP 02018 811144610.33 0.00 0
DLRM BP HHE 2018 8111446 0.00 14.17ES 0
DLRM BP HHZ IP 02018 811144610.85 0.00 0
TOLS BP HHE 2018 8111446 0.00 16.03ES 0
TOLS BP HHZ IP 12018 811144611.94 0.00 0
ABTL BP HHE 2018 8111446 0.00 20.41ES 0
ABTL BP HHZ IP 02018 811144614.28 0.00 0
SHIF BP HHE 2018 8111446 0.00 21.85ES 1
SHIF BP HHZ IP 02018 811144615.04 0.00 0
NOKA BP HHE 2018 8111446 0.00 23.20ES 0
NOKA BP HHZ IP 02018 811144615.98 0.00 0

Since some stations were absent, I decided to check "X_prediction_results.csv" inside the "detection" folder for those stations that were not associated with that event. The following rows correspond to that event but were not associated:

KHIA_BP_BH_2018-08-11T14:45:30.000000Z BP KHIA BH 28.861 51.181 48 2018-08-11 14:46:09.960000 2018-08-11 14:46:15.460000 0.99 2018-08-11 14:46:10 0.79 43 2018-08-11 14:46:12.250000 0.85 11.5
JNAK_BP_BH_2018-08-11T14:45:30.000000Z BP JNAK BH 28.879 51.69 16 2018-08-11 14:46:09.320000 2018-08-11 14:46:13.680000 0.99 2018-08-11 14:46:09.320000 0.84 41.3 2018-08-11 14:46:11.170000 0.79 9.7

The above picks have high probabilities for the detection and for the P and S picks, yet they were not associated with the event. One solution would be to increase the "moving_window" parameter, but I don't think that would guarantee correct behavior for subsequent events and phases that arrive around, say, 15 seconds into the minute, and so on.

A typo in variable name in the tutorial

There is a typo in a variable name in the tutorial https://eqtransformer.readthedocs.io/en/latest/tutorial.html

from EQTransformer.utils.downloader import downloadMseeds

downloadMseeds(client_list=["SCEDC", "IRIS"], stations_json='station_list.json', 
output_dir="downloads_mseeds", min_lat=35.50, max_lat=35.60, min_lon=-117.80, 
max_lon=-117.40, start_time="2019-09-01 00:00:00.00", end_time="2019-09-03 00:00:00.00", 
chunck_size=1, channel_list=[], n_processor=2)

Here the variable name chunck_size should be chunk_size instead.

Problems running EQTransformer (preprocessor, mseed_predictor)

Hi!

I wanted to try this interesting code, but I hit a problem.
I am running the code installed with conda on macOS.

I have put my mseed files in a folder like this:
mseed/

  • sta1
    -file1
    -file2
    -file3
  • sta2
    -file1
    -file2
    -file3
  • sta3
    -file1
    -file2
    -file3

I have generated the .json file in the same way it is generated when you download the example data.

When I run the preprocessor, I get the following error:

~/anaconda3/envs/qtt/lib/python3.7/site-packages/EQTransformer/utils/hdf5_maker.py in <listcomp>(.0)
    102
    103 file_list = [join(station, ev) for ev in listdir(station) if ev.split('/')[-1] != '.DS_Store'];
--> 104 mon = [ev.split('__')[1]+'__'+ev.split('__')[2] for ev in file_list ];
    105 uni_list = list(set(mon))
    106 uni_list.sort()
IndexError: list index out of range

When I run mseed_predictor, I get the following error:

*** Loading is complete!
######### There are files for 4 stations in mseed directory. #########
========= Started working on CEY, 1 out of 4 ...

~/anaconda3/envs/qtt/lib/python3.7/site-packages/EQTransformer/core/mseed_predictor.py in <listcomp>(.0)
    258
    259 file_list = [join(st, ev) for ev in listdir(args['input_dir']+'/'+st) if ev.split('/')[-1].split('.')[-1].lower() == 'mseed'];
--> 260 mon = [ev.split('__')[1]+'__'+ev.split('__')[2] for ev in file_list ];
    261 uni_list = list(set(mon))
    262 uni_list.sort()
IndexError: list index out of range

Thank you for your time, code and help!
Cheers
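
Both tracebacks fail on ev.split('__')[2], which suggests the mseed file names are expected to follow the downloader's convention of two '__'-separated timestamps, e.g. something like NET.STA.LOC.CHA__20190901T000000Z__20190902T000000Z.mseed. Here is a hedged sketch, not official EQTransformer code, that renames existing files into that pattern using their own start and end times:

import os
import obspy

station_dir = "mseed/sta1"  # one of the per-station folders from the layout above
for fname in os.listdir(station_dir):
    if not fname.lower().endswith(".mseed"):
        continue
    path = os.path.join(station_dir, fname)
    st = obspy.read(path, headonly=True)
    start = min(tr.stats.starttime for tr in st).strftime("%Y%m%dT%H%M%SZ")
    end = max(tr.stats.endtime for tr in st).strftime("%Y%m%dT%H%M%SZ")
    hdr = st[0].stats
    new_name = f"{hdr.network}.{hdr.station}.{hdr.location}.{hdr.channel}__{start}__{end}.mseed"
    os.rename(path, os.path.join(station_dir, new_name))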

cannot import name 'multi_gpu_model' from keras

Hi Mostafa,
I am getting the following error when trying to use the downloader example (I am downloading continuous data):

from EQTransformer.utils.downloader import makeStationList, downloadMseeds

---------------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-1-f8474247e3c8> in <module>
----> 1 from EQTransformer.utils.downloader import makeStationList, downloadMseeds

~/anaconda3/envs/eqt/lib/python3.6/site-packages/EQTransformer/__init__.py in <module>
      5 import warnings
      6 
----> 7 from EQTransformer.core.trainer import trainer
      8 from EQTransformer.core.tester import tester
      9 from EQTransformer.core.predictor import predictor

~/anaconda3/envs/eqt/lib/python3.6/site-packages/EQTransformer/core/__init__.py in <module>
      2 name='core'
      3 
----> 4 from .EqT_utils import *
      5 from .trainer import trainer
      6 from .tester import tester

~/anaconda3/envs/eqt/lib/python3.6/site-packages/EQTransformer/core/EqT_utils.py in <module>
     18 from keras.layers import MaxPooling1D, UpSampling1D, Cropping1D, SpatialDropout1D, Bidirectional, BatchNormalization
     19 from keras.models import Model
---> 20 from keras.utils import multi_gpu_model
     21 from keras.optimizers import Adam
     22 from obspy.signal.trigger import trigger_onset

ImportError: cannot import name 'multi_gpu_model'

It seems that the keras package works for lines 18 and 19 but not for line 20.
Do you have an idea why this happens?

conda install

Dear developers,
I was trying to run EQTransformer within a conda environment; however, I was not able to install it. As you suggested, I tried running the following line more than 30 times with no success:
conda install -c smousavi05 eqtransformer

I got the error:

Collecting package metadata (current_repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: / 
Found conflicts! Looking for incompatible packages.
This can take several minutes.  Press CTRL-C to abort.
failed                                                                                                                   
UnsatisfiableError: The following specifications were found to be incompatible with each other:
Output in format: Requested package -> Available versions

Thank you in advance.
Tonino

associator sqlite3 error

Hi! Thanks for this awesome code. I had it up and running with my data in 30 minutes. Incredibly easy compared to other ML detection tools I have tried using in the past 6 months.

I have run into one error, though, when trying to run the associator. I am running on Linux with Python 3.7.9 and installed EQT using Anaconda.

When running the associator with the following:

  import shutil
  import os
  from EQTransformer.utils.associator import run_associator

  out_dir="association"
  try:
      shutil.rmtree(out_dir)
  except Exception:
      pass
  os.makedirs(out_dir)

  run_associator(input_dir="detections", 
      start_time="2020-03-31 00:00:00.00", 
      end_time="2020-04-03 00:00:00.00", 
      moving_window=15, 
      pair_n=3,
      output_dir=out_dir,
      consider_combination=False)

I get this error:

Traceback (most recent call last):
  File "main.py", line 120, in <module>
    consider_combination=False)
  File "/home/dylanmikesell/anaconda3/envs/eqt/lib/python3.7/site-packages/EQTransformer/utils/associator.py", line 106, in run_associator
    )''')
sqlite3.OperationalError: database is locked

Is the associator looking for a sqlite DB? If so, is there some step I am missing to create this DB? I cannot find anything about it in the docs. This error on line 106 also doesn't make sense given the code on github, so I wonder if the version I installed via conda is old. Thoughts?

Thanks!

Can I skip drawing the figures?

I guess that if I do not draw the figures, the program can run faster, but I do not know how to turn the drawing off.
Thank you!
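
A hedged suggestion based on the calls shown in the other issues here: number_of_plots controls how many figures are produced, so setting it to 0 should skip the drawing entirely (the paths below are the example ones from the tutorial):

from EQTransformer.core.mseed_predictor import mseed_predictor

mseed_predictor(input_dir='downloads_mseeds',
                input_model='EqT_model.h5',
                stations_json='station_list.json',
                output_dir='detections',
                detection_threshold=0.3, P_threshold=0.1, S_threshold=0.1,
                number_of_plots=0,   # no figures are generated
                overlap=0.3, batch_size=500)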

Error in running the example

Platform: macOS 11.1 & Python 3.7

I followed the README on the master branch to install EQT (with conda).
Then I tried to run the example in the tutorial: https://eqtransformer.readthedocs.io/en/latest/tutorial.html

However, I ran into two errors when using the predictor.

The first one is:

>>> predictor(input_dir= 'downloads_mseeds_processed_hdfs', input_model='EqT_model.h5', output_dir='detections', detection_threshold=0.3, P_threshold=0.1, S_threshold=0.1, number_of_plots=100, plot_mode='time')
============================================================================
Running EqTransformer  0.1.56
 *** Loading the model ...
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/liang/local/anaconda3/envs/eqt/lib/python3.7/site-packages/EQTransformer/core/predictor.py", line 227, in predictor
    'f1': f1
  File "/Users/liang/local/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/engine/saving.py", line 492, in load_wrapper
    return load_function(*args, **kwargs)
  File "/Users/liang/local/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/engine/saving.py", line 583, in load_model
    with H5Dict(filepath, mode='r') as h5dict:
  File "/Users/liang/local/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/utils/io_utils.py", line 191, in __init__
    self.data = h5py.File(path, mode=mode)
  File "/Users/liang/local/anaconda3/envs/eqt/lib/python3.7/site-packages/h5py/_hl/files.py", line 427, in __init__
    swmr=swmr)
  File "/Users/liang/local/anaconda3/envs/eqt/lib/python3.7/site-packages/h5py/_hl/files.py", line 190, in make_fid
    fid = h5f.open(name, flags, fapl=fapl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5f.pyx", line 96, in h5py.h5f.open
OSError: Unable to open file (unable to open file: name = 'EqT_model.h5', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)

I fixed this one myself, but I cannot solve the second one:

>>> predictor(input_dir= 'downloads_mseeds_processed_hdfs', input_model='EQTransformer/ModelsAndSampleData/EqT_model.h5', output_dir='detections', detection_threshold=0.3, P_threshold=0.1, S_threshold=0.1, number_of_plots=100, plot_mode='time')
============================================================================
Running EqTransformer  0.1.56
 *** Loading the model ...
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/liang/local/anaconda3/envs/eqt/lib/python3.7/site-packages/EQTransformer/core/predictor.py", line 227, in predictor
    'f1': f1
  File "/Users/liang/local/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/engine/saving.py", line 492, in load_wrapper
    return load_function(*args, **kwargs)
  File "/Users/liang/local/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/engine/saving.py", line 584, in load_model
    model = _deserialize_model(h5dict, custom_objects, compile)
  File "/Users/liang/local/anaconda3/envs/eqt/lib/python3.7/site-packages/keras/engine/saving.py", line 273, in _deserialize_model
    model_config = json.loads(model_config.decode('utf-8'))
AttributeError: 'str' object has no attribute 'decode'
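
A hedged diagnostic, not a confirmed fix: this particular "'str' object has no attribute 'decode'" error is commonly associated with loading a Keras 2.3-era HDF5 model under h5py >= 3.0, where string attributes come back as str and the older keras saving code still calls .decode() on them. Checking the installed h5py version is a quick first step:

import h5py
print(h5py.__version__)  # keras 2.3.x model loading generally expects h5py < 3.0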
