
nf-core / chipseq


ChIP-seq peak-calling, QC and differential analysis pipeline.

Home Page: https://nf-co.re/chipseq

License: MIT License

HTML 0.94% R 11.16% Perl 1.54% Python 11.15% Nextflow 58.25% Groovy 16.96%
Topics: nf-core, nextflow, workflow, chip-seq, chromatin-immunoprecipitation, peak-calling, chip, pipeline, macs2

chipseq's People

Contributors

apeltzer, bjlang, chuan-wang, drewjbeh, drpatelh, ewels, joseespinosa, kevinmenden, mashehu, maxulysse, nf-core-bot, robsyme, rotholandus, sofiahag, tiagochst, winni2k


chipseq's Issues

Strand cross-correlation analysis before, or after de-duplication?

Moved from SciLifeLab#40 (by @sifakise):

According to this publication:

"The evaluation of ChIP quality following duplicate removal may therefore underestimate the extent of ChIP enrichment relative to background..." ... "We recommend the assessment of RSC and NSC prior to blacklisting or duplicate removal..."

Perhaps, then, the phantompeakqualtools and calculateNSCRSC steps should be performed on the non-deduplicated data instead of the deduplicated data?
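If the maintainers agree, the change would mostly be a matter of rewiring which BAM channel feeds the cross-correlation step. A minimal DSL1-style sketch (the bam_sorted_for_spp channel name is an assumption, not the pipeline's actual one; the run_spp.r call mirrors the one visible in the log of the next issue):

  // Sketch only: run cross-correlation QC on sorted, NON-deduplicated BAMs.
  // `bam_sorted_for_spp` is a hypothetical channel; the samtools step would
  // need to publish its pre-dedup sorted output into it.
  process phantompeakqualtools {
      input:
      file bam from bam_sorted_for_spp

      output:
      file '*.spp.out' into spp_out_for_nscrsc

      script:
      """
      run_spp.r -c="$bam" -savp -out="${bam.baseName}.spp.out"
      """
  }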

phantompeakqualtools doesn't run on one sample

I know this isn't quite your wheelhouse, but for some reason one of my samples never finishes the phantompeakqualtools step, and no useful error is given. Can you provide some guidance on how I can figure out what is wrong with the sample that is causing phantompeakqualtools to fail?

Log is below.
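As a general first step, two sources of information are usually available even when the pipeline itself reports nothing useful: each task's work directory contains .command.err and .command.log with the tool's own stderr, and the trace file the pipeline writes (nfcore-chipseq_trace.txt in the log below) records per-task resource usage. A small custom config passed with -c can extend the trace to include peak memory; a sketch, using field names from the Nextflow trace documentation:

  // custom-trace.config -- pass with: nextflow run ... -c custom-trace.config
  // Extend the trace file so each task's peak memory use is recorded;
  // a step that dies with no error message is often simply out of memory.
  trace {
      enabled = true
      fields  = 'task_id,name,status,exit,realtime,peak_rss,peak_vmem,memory'
  }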

Mar-08 14:11:01.718 [main] DEBUG nextflow.cli.Launcher - Setting http proxy: [dtn04-e0, 3128]
Mar-08 14:11:01.779 [main] DEBUG nextflow.cli.Launcher - Setting https proxy: [dtn04-e0, 3128]
Mar-08 14:11:01.780 [main] DEBUG nextflow.cli.Launcher - $> /usr/local/apps/nextflow/0.30.2/bin/nextflow run /data/capaldobj/nf-core/lgcp/chipseq/ -resume --singleEnd --reads /data/capaldobj/CS02314*-ChIP-seq/Sample_2018_0*/*_R1_*.fastq.gz --macsconfig /data/capaldobj/CS02314X-ChIP-seq-results/macs.config --saturation --genome GRCh37 --outdir /data/capaldobj/CS02314X-ChIP-seq-results/pipeline-output_narrow/ -profile biowulf
Mar-08 14:11:01.869 [main] INFO  nextflow.cli.CmdRun - N E X T F L O W  ~  version 0.30.2
Mar-08 14:11:02.561 [main] INFO  nextflow.cli.CmdRun - Launching `/data/capaldobj/nf-core/lgcp/chipseq/main.nf` [sleepy_fourier] - revision: 4057dba1c8
Mar-08 14:11:02.580 [main] DEBUG nextflow.config.ConfigBuilder - Found config base: /data/capaldobj/nf-core/lgcp/chipseq/nextflow.config
Mar-08 14:11:02.581 [main] DEBUG nextflow.config.ConfigBuilder - Parsing config file: /data/capaldobj/nf-core/lgcp/chipseq/nextflow.config
Mar-08 14:11:02.593 [main] DEBUG nextflow.config.ConfigBuilder - Applying config profile: `biowulf`
Mar-08 14:11:03.063 [main] DEBUG nextflow.config.ConfigBuilder - Available config profiles: [standard, uppmax_devel, biowulf, test, conda, singularity, uppmax, none, aws, uppmax_modules, docker]
Mar-08 14:11:03.108 [main] DEBUG nextflow.Session - Session uuid: c951df69-ddef-4ac7-9d6d-95439fc8cb2a
Mar-08 14:11:03.109 [main] DEBUG nextflow.Session - Run name: sleepy_fourier
Mar-08 14:11:03.109 [main] DEBUG nextflow.Session - Executor pool size: 4
Mar-08 14:11:03.123 [main] DEBUG nextflow.cli.CmdRun - 
  Version: 0.30.2 build 4867
  Modified: 16-06-2018 17:49 UTC (13:49 EDT)
  System: Linux 3.10.0-862.14.4.el7.x86_64
  Runtime: Groovy 2.4.15 on OpenJDK 64-Bit Server VM 1.8.0_181-b13
  Encoding: UTF-8 (UTF-8)
  Process: 53741@cn2342 [10.2.5.194]
  CPUs: 4 - Mem: 251.6 GB (45.3 GB) - Swap: 2 GB (2 GB)
Mar-08 14:11:03.169 [main] DEBUG nextflow.Session - Work-dir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work [gpfs]
Mar-08 14:11:03.716 [main] DEBUG nextflow.Session - Session start invoked
Mar-08 14:11:03.722 [main] DEBUG nextflow.processor.TaskDispatcher - Dispatcher > start
Mar-08 14:11:03.723 [main] DEBUG nextflow.trace.TraceFileObserver - Flow starting -- trace file: /data/capaldobj/CS02314X-ChIP-seq-results/pipeline-output_narrow/pipeline_info/nfcore-chipseq_trace.txt
Mar-08 14:11:03.735 [main] DEBUG nextflow.script.ScriptRunner - > Script parsing
Mar-08 14:11:04.425 [main] DEBUG nextflow.script.ScriptRunner - > Launching execution
Mar-08 14:11:04.524 [main] DEBUG nextflow.Channel - files for syntax: glob; folder: /data/capaldobj/; pattern: CS02314*-ChIP-seq/Sample_2018_0*/*_R1_*.fastq.gz; options: [:]
Mar-08 14:11:04.572 [main] INFO  nextflow.Nextflow - =======================================================
                                          ,--./,-.
          ___     __   __   __   ___     /,-._.--~'
    |\ | |__  __ /  ` /  \ |__) |__         }  {
    | \| |       \__, \__/ |  \ |___     \`-._,-`-,
                                          `._,._,'

 nf-core/chipseq : ChIP-Seq Best Practice v1.0dev
=======================================================
Mar-08 14:11:04.576 [main] INFO  nextflow.Nextflow - Run Name             : sleepy_fourier
Reads                : /data/capaldobj/CS02314*-ChIP-seq/Sample_2018_0*/*_R1_*.fastq.gz
Data Type            : Single-End
Genome               : GRCh37
BWA Index            : /fdb/igenomes//Homo_sapiens/Ensembl/GRCh37/Sequence/BWAIndex/
GTF File             : /fdb/igenomes//Homo_sapiens/Ensembl/GRCh37/Annotation/Genes/genes.gtf
Multiple alignments  : false
MACS Config          : /data/capaldobj/CS02314X-ChIP-seq-results/macs.config
Saturation analysis  : true
MACS broad peaks     : false
Blacklist filtering  : false
Extend Reads         : 100 bp
Container            : [:]
Current home         : /home/capaldobj
Current user         : capaldobj
Current path         : /data/capaldobj/CS02314X-ChIP-seq-results
Working dir          : /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work
Output dir           : /data/capaldobj/CS02314X-ChIP-seq-results/pipeline-output_narrow/
R libraries          : false
Script dir           : /data/capaldobj/nf-core/lgcp/chipseq
Save Reference       : true
Save Trimmed         : true
Save Intermeds       : true
Trim R1              : 0
Trim R2              : 0
Trim 3' R1           : 0
Trim 3' R2           : 0
Config Profile       : biowulf
Mar-08 14:11:04.579 [main] INFO  nextflow.Nextflow - ====================================
Mar-08 14:11:04.670 [main] DEBUG nextflow.util.CacheHelper - Config settings `withName:fastqc` matches process fastqc
Mar-08 14:11:04.674 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:04.674 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:04.685 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:04.688 [main] INFO  nextflow.executor.Executor - [warm up] executor > slurm
Mar-08 14:11:04.696 [main] DEBUG n.processor.TaskPollingMonitor - Creating task monitor for executor 'slurm' > capacity: 100; pollInterval: 5s; dumpInterval: 5m 
Mar-08 14:11:04.699 [main] DEBUG nextflow.processor.TaskDispatcher - Starting monitor: TaskPollingMonitor
Mar-08 14:11:04.700 [main] DEBUG n.processor.TaskPollingMonitor - >>> barrier register (monitor: slurm)
Mar-08 14:11:04.716 [main] DEBUG nextflow.executor.Executor - Invoke register for executor: slurm
Mar-08 14:11:04.717 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:04.750 [main] DEBUG nextflow.Session - >>> barrier register (process: fastqc)
Mar-08 14:11:04.752 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > fastqc -- maxForks: 4
Mar-08 14:11:04.774 [main] DEBUG nextflow.util.CacheHelper - Config settings `withName:trim_galore` matches process trim_galore
Mar-08 14:11:04.776 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:04.776 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:04.777 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:04.777 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:04.778 [main] DEBUG nextflow.Session - >>> barrier register (process: trim_galore)
Mar-08 14:11:04.779 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > trim_galore -- maxForks: 4
Mar-08 14:11:04.801 [main] DEBUG nextflow.util.CacheHelper - Config settings `withName:bwa` matches process bwa
Mar-08 14:11:04.802 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:04.802 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:04.802 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:04.803 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:04.803 [main] DEBUG nextflow.Session - >>> barrier register (process: bwa)
Mar-08 14:11:04.804 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > bwa -- maxForks: 4
Mar-08 14:11:04.817 [main] DEBUG nextflow.util.CacheHelper - Config settings `withName:samtools` matches process samtools
Mar-08 14:11:04.818 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:04.819 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:04.819 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:04.819 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:04.820 [main] DEBUG nextflow.Session - >>> barrier register (process: samtools)
Mar-08 14:11:04.820 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > samtools -- maxForks: 4
Mar-08 14:11:04.853 [main] DEBUG nextflow.util.CacheHelper - Config settings `withName:bwa_mapped` matches process bwa_mapped
Mar-08 14:11:04.854 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:04.854 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:04.854 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:04.855 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:04.855 [main] DEBUG nextflow.Session - >>> barrier register (process: bwa_mapped)
Mar-08 14:11:04.856 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > bwa_mapped -- maxForks: 4
Mar-08 14:11:04.886 [main] DEBUG nextflow.util.CacheHelper - Config settings `withName:picard` matches process picard
Mar-08 14:11:04.887 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:04.888 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:04.888 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:04.888 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:04.889 [main] DEBUG nextflow.Session - >>> barrier register (process: picard)
Mar-08 14:11:04.889 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > picard -- maxForks: 4
Mar-08 14:11:05.002 [main] DEBUG nextflow.util.CacheHelper - Config settings `withName:countstat` matches process countstat
Mar-08 14:11:05.005 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:05.005 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:05.006 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:05.006 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:05.007 [main] DEBUG nextflow.Session - >>> barrier register (process: countstat)
Mar-08 14:11:05.012 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > countstat -- maxForks: 4
Mar-08 14:11:05.047 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:05.048 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:05.048 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:05.048 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:05.049 [main] DEBUG nextflow.Session - >>> barrier register (process: phantompeakqualtools)
Mar-08 14:11:05.050 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > phantompeakqualtools -- maxForks: 4
Mar-08 14:11:05.073 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:05.073 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:05.073 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:05.074 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:05.074 [main] DEBUG nextflow.Session - >>> barrier register (process: calculateNSCRSC)
Mar-08 14:11:05.075 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > calculateNSCRSC -- maxForks: 4
Mar-08 14:11:05.091 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:05.092 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:05.092 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:05.092 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:05.098 [main] DEBUG nextflow.Session - >>> barrier register (process: deepTools)
Mar-08 14:11:05.100 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > deepTools -- maxForks: 4
Mar-08 14:11:05.117 [main] DEBUG nextflow.util.CacheHelper - Config settings `withName:ngsplot` matches process ngsplot
Mar-08 14:11:05.118 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:05.118 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:05.118 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:05.119 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:05.120 [main] DEBUG nextflow.Session - >>> barrier register (process: ngsplot)
Mar-08 14:11:05.121 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > ngsplot -- maxForks: 4
Mar-08 14:11:05.127 [main] DEBUG nextflow.util.CacheHelper - Config settings `withName:macs` matches process macs
Mar-08 14:11:05.132 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:05.132 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:05.132 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:05.140 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:05.141 [main] DEBUG nextflow.Session - >>> barrier register (process: macs)
Mar-08 14:11:05.177 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > macs -- maxForks: 4
Mar-08 14:11:05.214 [Actor Thread 14] INFO  nextflow.processor.TaskProcessor - [12/c8ab36] Cached process > fastqc (2018_051_S5_R1_001)
Mar-08 14:11:05.215 [Actor Thread 4] INFO  nextflow.processor.TaskProcessor - [d5/5c942e] Cached process > fastqc (2018_054_S7_R1_001)
Mar-08 14:11:05.214 [Actor Thread 2] INFO  nextflow.processor.TaskProcessor - [b3/12a14a] Cached process > trim_galore (2018_042_S9_R1_001)
Mar-08 14:11:05.215 [Actor Thread 9] INFO  nextflow.processor.TaskProcessor - [af/923766] Cached process > fastqc (2018_046_S1_R1_001)
Mar-08 14:11:05.223 [main] DEBUG nextflow.util.CacheHelper - Config settings `withName:saturation` matches process saturation
Mar-08 14:11:05.224 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:05.224 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:05.224 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:05.224 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:05.225 [main] DEBUG nextflow.Session - >>> barrier register (process: saturation)
Mar-08 14:11:05.228 [Actor Thread 13] INFO  nextflow.processor.TaskProcessor - [0e/b610e0] Cached process > fastqc (2018_042_S9_R1_001)
Mar-08 14:11:05.228 [Actor Thread 10] INFO  nextflow.processor.TaskProcessor - [51/bec24a] Cached process > trim_galore (2018_046_S1_R1_001)
Mar-08 14:11:05.230 [main] DEBUG nextflow.processor.TaskProcessor - Creating *combiner* operator for each param(s) at index(es): [3]
Mar-08 14:11:05.235 [Actor Thread 3] INFO  nextflow.processor.TaskProcessor - [fc/e0a541] Cached process > trim_galore (2018_051_S5_R1_001)
Mar-08 14:11:05.238 [Actor Thread 12] INFO  nextflow.processor.TaskProcessor - [d1/69722e] Cached process > trim_galore (2018_054_S7_R1_001)
Mar-08 14:11:05.242 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > saturation -- maxForks: 4
Mar-08 14:11:05.248 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:05.249 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:05.249 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:05.249 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:05.266 [main] DEBUG nextflow.Session - >>> barrier register (process: saturation_r)
Mar-08 14:11:05.266 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > saturation_r -- maxForks: 4
Mar-08 14:11:05.280 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:05.285 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:05.285 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:05.286 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:05.286 [main] DEBUG nextflow.Session - >>> barrier register (process: chippeakanno)
Mar-08 14:11:05.286 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > chippeakanno -- maxForks: 4
Mar-08 14:11:05.342 [main] DEBUG nextflow.util.CacheHelper - Config settings `withName:multiqc` matches process multiqc
Mar-08 14:11:05.344 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:05.345 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:05.348 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:05.349 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:05.360 [Actor Thread 9] INFO  nextflow.processor.TaskProcessor - [4c/c53e62] Cached process > fastqc (2018_043_S6_R1_001)
Mar-08 14:11:05.372 [Actor Thread 3] INFO  nextflow.processor.TaskProcessor - [0a/b1cf83] Cached process > fastqc (2018_038_S2_R1_001)
Mar-08 14:11:05.374 [main] DEBUG nextflow.Session - >>> barrier register (process: multiqc)
Mar-08 14:11:05.375 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > multiqc -- maxForks: 4
Mar-08 14:11:05.379 [Actor Thread 13] INFO  nextflow.processor.TaskProcessor - [4a/46573d] Cached process > trim_galore (2018_038_S2_R1_001)
Mar-08 14:11:05.383 [Actor Thread 17] INFO  nextflow.processor.TaskProcessor - [88/0e792b] Cached process > bwa (2018_042_S9_R1_001)
Mar-08 14:11:05.387 [Actor Thread 6] INFO  nextflow.processor.TaskProcessor - [3a/642c2f] Cached process > fastqc (2018_040_S4_R1_001)
Mar-08 14:11:05.398 [Actor Thread 10] INFO  nextflow.processor.TaskProcessor - [ff/95d137] Cached process > trim_galore (2018_043_S6_R1_001)
Mar-08 14:11:05.402 [Actor Thread 14] INFO  nextflow.processor.TaskProcessor - [2a/40d017] Cached process > fastqc (2018_037_S1_R1_001)
Mar-08 14:11:05.415 [Actor Thread 18] INFO  nextflow.processor.TaskProcessor - [33/5d62b6] Cached process > bwa (2018_046_S1_R1_001)
Mar-08 14:11:05.416 [Actor Thread 2] INFO  nextflow.processor.TaskProcessor - [ee/787a72] Cached process > trim_galore (2018_037_S1_R1_001)
Mar-08 14:11:05.418 [Actor Thread 9] INFO  nextflow.processor.TaskProcessor - [9a/33e02b] Cached process > fastqc (2018_044_S7_R1_001)
Mar-08 14:11:05.418 [Actor Thread 11] INFO  nextflow.processor.TaskProcessor - [a0/514955] Cached process > bwa (2018_051_S5_R1_001)
Mar-08 14:11:05.420 [Actor Thread 12] INFO  nextflow.processor.TaskProcessor - [91/0db998] Cached process > trim_galore (2018_040_S4_R1_001)
Mar-08 14:11:05.421 [Actor Thread 4] INFO  nextflow.processor.TaskProcessor - [04/cb7f11] Cached process > fastqc (2018_053_S8_R1_001)
Mar-08 14:11:05.435 [Actor Thread 8] INFO  nextflow.processor.TaskProcessor - [ae/f10105] Cached process > bwa (2018_043_S6_R1_001)
Mar-08 14:11:05.443 [Actor Thread 14] INFO  nextflow.processor.TaskProcessor - [6a/8e11a0] Cached process > fastqc (2018_050_S4_R1_001)
Mar-08 14:11:05.447 [Actor Thread 6] INFO  nextflow.processor.TaskProcessor - [1d/1af84a] Cached process > fastqc (2018_048_S9_R1_001)
Mar-08 14:11:05.460 [Actor Thread 19] INFO  nextflow.processor.TaskProcessor - [1c/b1a633] Cached process > bwa (2018_038_S2_R1_001)
Mar-08 14:11:05.461 [Actor Thread 3] INFO  nextflow.processor.TaskProcessor - [be/e120a9] Cached process > samtools (2018_042_S9_R1_001)
Mar-08 14:11:05.474 [main] DEBUG nextflow.processor.ProcessFactory - << taskConfig executor: slurm
Mar-08 14:11:05.474 [main] DEBUG nextflow.processor.ProcessFactory - >> processorType: 'slurm'
Mar-08 14:11:05.474 [main] DEBUG nextflow.executor.Executor - Initializing executor: slurm
Mar-08 14:11:05.474 [main] DEBUG n.executor.AbstractGridExecutor - Creating executor 'slurm' > queue-stat-interval: 1m
Mar-08 14:11:05.475 [main] DEBUG nextflow.Session - >>> barrier register (process: output_documentation)
Mar-08 14:11:05.475 [main] DEBUG nextflow.processor.TaskProcessor - Creating operator > output_documentation -- maxForks: 4
Mar-08 14:11:05.480 [main] DEBUG nextflow.script.ScriptRunner - > Await termination 
Mar-08 14:11:05.480 [main] DEBUG nextflow.Session - Session await
Mar-08 14:11:05.481 [Actor Thread 18] INFO  nextflow.processor.TaskProcessor - [df/ee293d] Cached process > bwa (2018_037_S1_R1_001)
Mar-08 14:11:05.482 [Actor Thread 4] INFO  nextflow.processor.TaskProcessor - [0b/9d1dee] Cached process > fastqc (2018_052_S6_R1_001)
Mar-08 14:11:05.489 [Actor Thread 13] INFO  nextflow.processor.TaskProcessor - [4b/f26469] Cached process > trim_galore (2018_053_S8_R1_001)
Mar-08 14:11:05.491 [Actor Thread 12] INFO  nextflow.processor.TaskProcessor - [f0/f1c6bd] Cached process > trim_galore (2018_050_S4_R1_001)
Mar-08 14:11:05.491 [Actor Thread 14] INFO  nextflow.processor.TaskProcessor - [9e/f16875] Cached process > fastqc (2018_039_S3_R1_001)
Mar-08 14:11:05.492 [Actor Thread 2] INFO  nextflow.processor.TaskProcessor - [bd/cde3b5] Cached process > trim_galore (2018_048_S9_R1_001)
Mar-08 14:11:05.510 [Actor Thread 20] INFO  nextflow.processor.TaskProcessor - [e6/6aab45] Cached process > trim_galore (2018_044_S7_R1_001)
Mar-08 14:11:05.522 [Actor Thread 6] INFO  nextflow.processor.TaskProcessor - [56/d5cb24] Cached process > fastqc (2018_041_S5_R1_001)
Mar-08 14:11:05.526 [Actor Thread 10] INFO  nextflow.processor.TaskProcessor - [6f/20aa48] Cached process > samtools (2018_038_S2_R1_001)
Mar-08 14:11:05.531 [Actor Thread 17] INFO  nextflow.processor.TaskProcessor - [67/80b634] Cached process > picard (2018_042_S9_R1_001)
Mar-08 14:11:05.553 [Actor Thread 23] INFO  nextflow.processor.TaskProcessor - [33/86746f] Cached process > samtools (2018_051_S5_R1_001)
Mar-08 14:11:05.558 [Actor Thread 21] INFO  nextflow.processor.TaskProcessor - [40/82fac2] Cached process > fastqc (2018_045_S8_R1_001)
Mar-08 14:11:05.569 [Actor Thread 11] INFO  nextflow.processor.TaskProcessor - [e5/c08feb] Cached process > bwa (2018_040_S4_R1_001)
Mar-08 14:11:05.570 [Actor Thread 9] INFO  nextflow.processor.TaskProcessor - [0e/3c2a40] Cached process > trim_galore (2018_045_S8_R1_001)
Mar-08 14:11:05.570 [Actor Thread 22] INFO  nextflow.processor.TaskProcessor - [55/c1e2d6] Cached process > samtools (2018_046_S1_R1_001)
Mar-08 14:11:05.581 [Actor Thread 2] INFO  nextflow.processor.TaskProcessor - [8d/ea0cee] Cached process > trim_galore (2018_041_S5_R1_001)
Mar-08 14:11:05.588 [Actor Thread 19] INFO  nextflow.processor.TaskProcessor - [a5/fd8431] Cached process > picard (2018_038_S2_R1_001)
Mar-08 14:11:05.592 [Actor Thread 12] INFO  nextflow.processor.TaskProcessor - [65/d6f727] Cached process > trim_galore (2018_052_S6_R1_001)
Mar-08 14:11:05.593 [Actor Thread 4] INFO  nextflow.processor.TaskProcessor - [81/7ef8f7] Cached process > fastqc (2018_047_S2_R1_001)
Mar-08 14:11:05.596 [Actor Thread 6] INFO  nextflow.processor.TaskProcessor - [7a/11ba57] Cached process > fastqc (2018_049_S3_R1_001)
Mar-08 14:11:05.607 [Actor Thread 10] INFO  nextflow.processor.TaskProcessor - [a4/132abc] Cached process > samtools (2018_037_S1_R1_001)
Mar-08 14:11:05.615 [Actor Thread 13] INFO  nextflow.processor.TaskProcessor - [19/b460e1] Cached process > trim_galore (2018_039_S3_R1_001)
Mar-08 14:11:05.619 [Actor Thread 3] INFO  nextflow.processor.TaskProcessor - [b2/67e9ef] Cached process > samtools (2018_043_S6_R1_001)
Mar-08 14:11:05.628 [Actor Thread 21] INFO  nextflow.processor.TaskProcessor - [b4/a93c8f] Cached process > bwa (2018_050_S4_R1_001)
Mar-08 14:11:05.632 [Actor Thread 27] INFO  nextflow.processor.TaskProcessor - [10/8eedcd] Cached process > picard (2018_051_S5_R1_001)
Mar-08 14:11:05.635 [Actor Thread 9] INFO  nextflow.processor.TaskProcessor - [a4/4ba007] Cached process > samtools (2018_040_S4_R1_001)
Mar-08 14:11:05.635 [Actor Thread 2] INFO  nextflow.processor.TaskProcessor - [60/33a484] Cached process > trim_galore (2018_049_S3_R1_001)
Mar-08 14:11:05.643 [Actor Thread 24] INFO  nextflow.processor.TaskProcessor - [5b/1eeb12] Cached process > trim_galore (2018_047_S2_R1_001)
Mar-08 14:11:05.648 [Actor Thread 18] INFO  nextflow.processor.TaskProcessor - [40/2b77cf] Cached process > phantompeakqualtools (2018_038_S2_R1_001)
Mar-08 14:11:05.653 [Actor Thread 7] INFO  nextflow.processor.TaskProcessor - [3a/6c94d4] Cached process > bwa (2018_048_S9_R1_001)
Mar-08 14:11:05.654 [Actor Thread 11] INFO  nextflow.processor.TaskProcessor - [0a/e69207] Cached process > picard (2018_046_S1_R1_001)
Mar-08 14:11:05.657 [Actor Thread 5] INFO  nextflow.processor.TaskProcessor - [79/74af81] Cached process > bwa (2018_044_S7_R1_001)
Mar-08 14:11:05.661 [Actor Thread 23] INFO  nextflow.processor.TaskProcessor - [81/c7fe38] Cached process > bwa (2018_053_S8_R1_001)
Mar-08 14:11:05.667 [Actor Thread 22] INFO  nextflow.processor.TaskProcessor - [9d/548e43] Cached process > picard (2018_037_S1_R1_001)
Mar-08 14:11:05.677 [Actor Thread 2] INFO  nextflow.processor.TaskProcessor - [50/3c6d4d] Cached process > samtools (2018_048_S9_R1_001)
Mar-08 14:11:05.679 [Actor Thread 17] INFO  nextflow.processor.TaskProcessor - [84/be7a07] Cached process > picard (2018_043_S6_R1_001)
Mar-08 14:11:05.683 [Actor Thread 19] INFO  nextflow.processor.TaskProcessor - [65/c934e7] Cached process > phantompeakqualtools (2018_051_S5_R1_001)
Mar-08 14:11:05.690 [Actor Thread 27] INFO  nextflow.processor.TaskProcessor - [59/5ce309] Cached process > picard (2018_040_S4_R1_001)
Mar-08 14:11:05.695 [Actor Thread 13] INFO  nextflow.processor.TaskProcessor - [b9/692121] Cached process > phantompeakqualtools (2018_046_S1_R1_001)
Mar-08 14:11:05.697 [Actor Thread 28] INFO  nextflow.processor.TaskProcessor - [b4/624b89] Cached process > phantompeakqualtools (2018_042_S9_R1_001)
Mar-08 14:11:05.701 [Actor Thread 21] INFO  nextflow.processor.TaskProcessor - [54/3d66ec] Cached process > bwa (2018_045_S8_R1_001)
Mar-08 14:11:05.701 [Actor Thread 5] INFO  nextflow.processor.TaskProcessor - [64/d2c902] Cached process > samtools (2018_044_S7_R1_001)
Mar-08 14:11:05.719 [Actor Thread 9] INFO  nextflow.processor.TaskProcessor - [d7/102b66] Cached process > phantompeakqualtools (2018_037_S1_R1_001)
Mar-08 14:11:05.722 [Actor Thread 7] INFO  nextflow.processor.TaskProcessor - [6e/c3204e] Cached process > bwa (2018_041_S5_R1_001)
Mar-08 14:11:05.728 [Actor Thread 10] INFO  nextflow.processor.TaskProcessor - [d5/4d2e40] Cached process > bwa (2018_039_S3_R1_001)
Mar-08 14:11:05.728 [Actor Thread 3] INFO  nextflow.processor.TaskProcessor - [60/19588b] Cached process > picard (2018_048_S9_R1_001)
Mar-08 14:11:05.730 [Actor Thread 19] INFO  nextflow.processor.TaskProcessor - [1d/24f6fb] Cached process > phantompeakqualtools (2018_043_S6_R1_001)
Mar-08 14:11:05.728 [Actor Thread 23] INFO  nextflow.processor.TaskProcessor - [09/7b45ae] Cached process > samtools (2018_053_S8_R1_001)
Mar-08 14:11:05.734 [Actor Thread 13] INFO  nextflow.processor.TaskProcessor - [99/5046a9] Cached process > samtools (2018_045_S8_R1_001)
Mar-08 14:11:05.736 [Actor Thread 12] INFO  nextflow.processor.TaskProcessor - [a2/d0c671] Cached process > picard (2018_044_S7_R1_001)
Mar-08 14:11:05.753 [Actor Thread 18] INFO  nextflow.processor.TaskProcessor - [3d/2eb31c] Cached process > bwa (2018_052_S6_R1_001)
Mar-08 14:11:05.756 [Actor Thread 24] INFO  nextflow.processor.TaskProcessor - [b5/280c18] Cached process > bwa (2018_049_S3_R1_001)
Mar-08 14:11:05.765 [Actor Thread 27] INFO  nextflow.processor.TaskProcessor - [6e/f6bf8d] Cached process > phantompeakqualtools (2018_044_S7_R1_001)
Mar-08 14:11:05.767 [Actor Thread 13] INFO  nextflow.processor.TaskProcessor - [ce/4d90ce] Cached process > samtools (2018_039_S3_R1_001)
Mar-08 14:11:05.767 [Actor Thread 10] INFO  nextflow.processor.TaskProcessor - [8c/a5ce02] Cached process > picard (2018_045_S8_R1_001)
Mar-08 14:11:05.770 [Actor Thread 7] INFO  nextflow.processor.TaskProcessor - [85/dd32a2] Cached process > bwa (2018_047_S2_R1_001)
Mar-08 14:11:05.776 [Actor Thread 28] INFO  nextflow.processor.TaskProcessor - [1c/c56739] Cached process > phantompeakqualtools (2018_040_S4_R1_001)
Mar-08 14:11:05.777 [Actor Thread 29] INFO  nextflow.processor.TaskProcessor - [fe/34d517] Cached process > samtools (2018_041_S5_R1_001)
Mar-08 14:11:05.781 [Actor Thread 19] INFO  nextflow.processor.TaskProcessor - [f3/0fadf0] Cached process > picard (2018_053_S8_R1_001)
Mar-08 14:11:05.790 [Actor Thread 3] INFO  nextflow.processor.TaskProcessor - [a7/f909b1] Cached process > phantompeakqualtools (2018_045_S8_R1_001)
Mar-08 14:11:05.790 [Actor Thread 11] INFO  nextflow.processor.TaskProcessor - [3b/a44893] Cached process > picard (2018_039_S3_R1_001)
Mar-08 14:11:05.794 [Actor Thread 21] INFO  nextflow.processor.TaskProcessor - [76/86fbc4] Cached process > samtools (2018_052_S6_R1_001)
Mar-08 14:11:05.795 [Actor Thread 23] INFO  nextflow.processor.TaskProcessor - [f9/dc70b7] Cached process > phantompeakqualtools (2018_048_S9_R1_001)
Mar-08 14:11:05.808 [Actor Thread 18] INFO  nextflow.processor.TaskProcessor - [ea/2c4d6f] Cached process > samtools (2018_047_S2_R1_001)
Mar-08 14:11:05.812 [Actor Thread 27] INFO  nextflow.processor.TaskProcessor - [4c/343610] Cached process > picard (2018_041_S5_R1_001)
Mar-08 14:11:05.813 [Actor Thread 10] INFO  nextflow.processor.TaskProcessor - [1f/f1646f] Cached process > samtools (2018_049_S3_R1_001)
Mar-08 14:11:05.852 [Actor Thread 3] INFO  nextflow.processor.TaskProcessor - [f3/9ab700] Cached process > phantompeakqualtools (2018_039_S3_R1_001)
Mar-08 14:11:05.860 [Actor Thread 28] INFO  nextflow.processor.TaskProcessor - [ab/02c221] Cached process > phantompeakqualtools (2018_041_S5_R1_001)
Mar-08 14:11:05.860 [Actor Thread 19] INFO  nextflow.processor.TaskProcessor - [79/bae88a] Cached process > picard (2018_047_S2_R1_001)
Mar-08 14:11:05.862 [Actor Thread 27] INFO  nextflow.processor.TaskProcessor - [ed/677b81] Cached process > picard (2018_049_S3_R1_001)
Mar-08 14:11:05.878 [Actor Thread 27] INFO  nextflow.processor.TaskProcessor - [f5/d639d0] Cached process > phantompeakqualtools (2018_049_S3_R1_001)
Mar-08 14:11:05.880 [Actor Thread 19] INFO  nextflow.processor.TaskProcessor - [34/2f35b1] Cached process > phantompeakqualtools (2018_047_S2_R1_001)
Mar-08 14:11:06.146 [Actor Thread 28] INFO  nextflow.processor.TaskProcessor - [93/93d61f] Cached process > trim_galore (2018_034_S7_R1_001)
Mar-08 14:11:06.153 [Actor Thread 19] INFO  nextflow.processor.TaskProcessor - [8d/d124dc] Cached process > trim_galore (2018_033_S6_R1_001)
Mar-08 14:11:06.161 [Actor Thread 13] INFO  nextflow.processor.TaskProcessor - [22/2fcd56] Cached process > trim_galore (2018_031_S4_R1_001)
Mar-08 14:11:06.163 [Actor Thread 12] INFO  nextflow.processor.TaskProcessor - [05/5adc94] Cached process > fastqc (2018_031_S4_R1_001)
Mar-08 14:11:06.165 [Actor Thread 29] INFO  nextflow.processor.TaskProcessor - [71/7e5f1a] Cached process > fastqc (2018_033_S6_R1_001)
Mar-08 14:11:06.169 [Actor Thread 10] INFO  nextflow.processor.TaskProcessor - [3f/70e8e8] Cached process > fastqc (2018_034_S7_R1_001)
Mar-08 14:11:06.171 [Actor Thread 21] INFO  nextflow.processor.TaskProcessor - [e5/8b71bb] Cached process > fastqc (2018_030_S3_R1_001)
Mar-08 14:11:06.176 [Actor Thread 3] INFO  nextflow.processor.TaskProcessor - [4e/f01531] Cached process > trim_galore (2018_030_S3_R1_001)
Mar-08 14:11:06.177 [Actor Thread 7] INFO  nextflow.processor.TaskProcessor - [f7/4e0d59] Cached process > trim_galore (2018_028_S1_R1_001)
Mar-08 14:11:06.191 [Actor Thread 28] INFO  nextflow.processor.TaskProcessor - [8a/e6df81] Cached process > trim_galore (2018_035_S8_R1_001)
Mar-08 14:11:06.193 [Actor Thread 6] INFO  nextflow.processor.TaskProcessor - [2e/dd0d55] Cached process > trim_galore (2018_029_S2_R1_001)
Mar-08 14:11:06.195 [Actor Thread 18] INFO  nextflow.processor.TaskProcessor - [aa/d80ea3] Cached process > bwa (2018_034_S7_R1_001)
Mar-08 14:11:06.202 [Actor Thread 13] INFO  nextflow.processor.TaskProcessor - [e7/a66c8b] Cached process > fastqc (2018_028_S1_R1_001)
Mar-08 14:11:06.208 [Actor Thread 29] INFO  nextflow.processor.TaskProcessor - [ec/c52c66] Cached process > fastqc (2018_029_S2_R1_001)
Mar-08 14:11:06.215 [Actor Thread 21] INFO  nextflow.processor.TaskProcessor - [2b/3696f9] Cached process > fastqc (2018_032_S5_R1_001)
Mar-08 14:11:06.215 [Actor Thread 24] INFO  nextflow.processor.TaskProcessor - [32/f5f5db] Cached process > bwa (2018_033_S6_R1_001)
Mar-08 14:11:06.219 [Actor Thread 10] INFO  nextflow.processor.TaskProcessor - [96/286c3e] Cached process > fastqc (2018_035_S8_R1_001)
Mar-08 14:11:06.222 [Actor Thread 7] INFO  nextflow.processor.TaskProcessor - [3b/526d46] Cached process > trim_galore (2018_032_S5_R1_001)
Mar-08 14:11:06.226 [Actor Thread 9] INFO  nextflow.processor.TaskProcessor - [91/63659c] Cached process > bwa (2018_031_S4_R1_001)
Mar-08 14:11:06.231 [Actor Thread 18] INFO  nextflow.processor.TaskProcessor - [f7/a19598] Cached process > samtools (2018_034_S7_R1_001)
Mar-08 14:11:06.232 [Actor Thread 3] INFO  nextflow.processor.TaskProcessor - [21/da6718] Cached process > trim_galore (2018_036_S9_R1_001)
Mar-08 14:11:06.236 [Actor Thread 11] INFO  nextflow.processor.TaskProcessor - [74/d4b269] Cached process > fastqc (2018_036_S9_R1_001)
Mar-08 14:11:06.240 [Actor Thread 13] INFO  nextflow.processor.TaskProcessor - [ce/042e4a] Cached process > samtools (2018_033_S6_R1_001)
Mar-08 14:11:06.242 [Actor Thread 19] INFO  nextflow.processor.TaskProcessor - [8f/9834d3] Cached process > bwa (2018_030_S3_R1_001)
Mar-08 14:11:06.244 [Actor Thread 6] INFO  nextflow.processor.TaskProcessor - [c7/c8f43d] Cached process > bwa (2018_028_S1_R1_001)
Mar-08 14:11:06.248 [Actor Thread 9] INFO  nextflow.processor.TaskProcessor - [21/12228d] Cached process > bwa (2018_029_S2_R1_001)
Mar-08 14:11:06.254 [Actor Thread 29] INFO  nextflow.processor.TaskProcessor - [d1/43a3a0] Cached process > samtools (2018_031_S4_R1_001)
Mar-08 14:11:06.256 [Actor Thread 24] INFO  nextflow.processor.TaskProcessor - [9f/2a10f4] Cached process > bwa (2018_035_S8_R1_001)
Mar-08 14:11:06.260 [Actor Thread 27] INFO  nextflow.processor.TaskProcessor - [ab/da68d1] Cached process > samtools (2018_030_S3_R1_001)
Mar-08 14:11:06.263 [Actor Thread 18] INFO  nextflow.processor.TaskProcessor - [b5/9033be] Cached process > samtools (2018_028_S1_R1_001)
Mar-08 14:11:06.322 [Actor Thread 24] INFO  nextflow.processor.TaskProcessor - [59/b9e63c] Cached process > samtools (2018_029_S2_R1_001)
Mar-08 14:11:06.324 [Actor Thread 22] INFO  nextflow.processor.TaskProcessor - [80/a61968] Cached process > fastqc (2018_022_S4_R1_001)
Mar-08 14:11:06.325 [Actor Thread 6] INFO  nextflow.processor.TaskProcessor - [dc/085579] Cached process > bwa (2018_036_S9_R1_001)
Mar-08 14:11:06.329 [Actor Thread 2] INFO  nextflow.processor.TaskProcessor - [bd/158d1a] Cached process > samtools (2018_035_S8_R1_001)
Mar-08 14:11:06.331 [Actor Thread 7] INFO  nextflow.processor.TaskProcessor - [5e/7ff364] Cached process > trim_galore (2018_022_S4_R1_001)
Mar-08 14:11:06.335 [Actor Thread 18] INFO  nextflow.processor.TaskProcessor - [16/544020] Cached process > fastqc (2018_027_S9_R1_001)
Mar-08 14:11:06.340 [Actor Thread 11] INFO  nextflow.processor.TaskProcessor - [1c/74c488] Cached process > fastqc (2018_020_S2_R1_001)
Mar-08 14:11:06.352 [Actor Thread 4] INFO  nextflow.processor.TaskProcessor - [7c/626231] Cached process > fastqc (2018_021_S3_R1_001)
Mar-08 14:11:06.353 [Actor Thread 27] INFO  nextflow.processor.TaskProcessor - [03/dbe697] Cached process > trim_galore (2018_027_S9_R1_001)
Mar-08 14:11:06.353 [Actor Thread 11] INFO  nextflow.processor.TaskProcessor - [c6/72faa0] Cached process > fastqc (2018_026_S8_R1_001)
Mar-08 14:11:06.361 [Actor Thread 19] INFO  nextflow.processor.TaskProcessor - [79/4347a6] Cached process > bwa (2018_032_S5_R1_001)
Mar-08 14:11:06.364 [Actor Thread 29] INFO  nextflow.processor.TaskProcessor - [57/74eff6] Cached process > trim_galore (2018_021_S3_R1_001)
Mar-08 14:11:06.374 [Actor Thread 22] INFO  nextflow.processor.TaskProcessor - [e5/9cb703] Cached process > fastqc (2018_024_S6_R1_001)
Mar-08 14:11:06.379 [Actor Thread 12] INFO  nextflow.processor.TaskProcessor - [39/996e21] Cached process > trim_galore (2018_020_S2_R1_001)
Mar-08 14:11:06.381 [Actor Thread 14] INFO  nextflow.processor.TaskProcessor - [c4/91eb66] Cached process > samtools (2018_036_S9_R1_001)
Mar-08 14:11:06.387 [Actor Thread 18] INFO  nextflow.processor.TaskProcessor - [68/0b9049] Cached process > fastqc (2018_019_S1_R1_001)
Mar-08 14:11:06.392 [Actor Thread 4] INFO  nextflow.processor.TaskProcessor - [f9/93c44e] Cached process > fastqc (2018_025_S7_R1_001)
Mar-08 14:11:06.393 [Actor Thread 29] INFO  nextflow.processor.TaskProcessor - [29/c5cf62] Cached process > trim_galore (2018_026_S8_R1_001)
Mar-08 14:11:06.395 [Actor Thread 21] INFO  nextflow.processor.TaskProcessor - [c7/8c5c0d] Cached process > trim_galore (2018_019_S1_R1_001)
Mar-08 14:11:06.397 [Actor Thread 7] INFO  nextflow.processor.TaskProcessor - [74/eba99f] Cached process > trim_galore (2018_024_S6_R1_001)
Mar-08 14:11:06.402 [Actor Thread 11] INFO  nextflow.processor.TaskProcessor - [19/ebd65c] Cached process > fastqc (2018_023_S5_R1_001)
Mar-08 14:11:06.404 [Actor Thread 19] INFO  nextflow.processor.TaskProcessor - [eb/4a477f] Cached process > samtools (2018_032_S5_R1_001)
Mar-08 14:11:06.410 [Actor Thread 12] INFO  nextflow.processor.TaskProcessor - [8d/f7d4b8] Cached process > trim_galore (2018_025_S7_R1_001)
Mar-08 14:11:06.457 [Actor Thread 14] INFO  nextflow.processor.TaskProcessor - [17/ff2689] Cached process > trim_galore (2018_023_S5_R1_001)
Mar-08 14:11:06.766 [Actor Thread 25] DEBUG nextflow.Session - <<< barrier arrive (process: fastqc)
Mar-08 14:11:06.767 [Actor Thread 26] DEBUG nextflow.Session - <<< barrier arrive (process: trim_galore)
Mar-08 14:11:06.904 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process bwa (2018_054_S7_R1_001) > jobId: 22044598; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/15/0cc411078dcf0cf0c0985a84d6f9bb
Mar-08 14:11:06.909 [Task submitter] INFO  nextflow.Session - [15/0cc411] Submitted process > bwa (2018_054_S7_R1_001)
Mar-08 14:11:06.955 [Actor Thread 3] INFO  nextflow.processor.TaskProcessor - [ef/360f85] Cached process > picard (2018_029_S2_R1_001)
Mar-08 14:11:06.958 [Actor Thread 23] INFO  nextflow.processor.TaskProcessor - [53/86b261] Cached process > picard (2018_030_S3_R1_001)
Mar-08 14:11:06.961 [Actor Thread 10] INFO  nextflow.processor.TaskProcessor - [b9/ec4f54] Cached process > picard (2018_028_S1_R1_001)
Mar-08 14:11:06.972 [Actor Thread 6] INFO  nextflow.processor.TaskProcessor - [b7/c443e6] Cached process > picard (2018_035_S8_R1_001)
Mar-08 14:11:06.990 [Actor Thread 11] INFO  nextflow.processor.TaskProcessor - [f0/c85660] Cached process > phantompeakqualtools (2018_028_S1_R1_001)
Mar-08 14:11:06.992 [Actor Thread 10] INFO  nextflow.processor.TaskProcessor - [5f/524c7d] Cached process > phantompeakqualtools (2018_035_S8_R1_001)
Mar-08 14:11:07.003 [Actor Thread 5] INFO  nextflow.processor.TaskProcessor - [bb/94113a] Cached process > phantompeakqualtools (2018_030_S3_R1_001)
Mar-08 14:11:07.006 [Actor Thread 14] INFO  nextflow.processor.TaskProcessor - [8b/9625c6] Cached process > phantompeakqualtools (2018_029_S2_R1_001)
Mar-08 14:11:07.006 [Actor Thread 23] INFO  nextflow.processor.TaskProcessor - [9c/b8c6e4] Cached process > picard (2018_032_S5_R1_001)
Mar-08 14:11:07.007 [Actor Thread 20] INFO  nextflow.processor.TaskProcessor - [6e/58617c] Cached process > picard (2018_036_S9_R1_001)
Mar-08 14:11:07.027 [Actor Thread 3] INFO  nextflow.processor.TaskProcessor - [c0/df5def] Cached process > phantompeakqualtools (2018_036_S9_R1_001)
Mar-08 14:11:08.229 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process samtools (2018_050_S4_R1_001) > jobId: 22044599; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/22/fa7955bdc319ba4bf1307394d87f73
Mar-08 14:11:08.230 [Task submitter] INFO  nextflow.Session - [22/fa7955] Submitted process > samtools (2018_050_S4_R1_001)
Mar-08 14:11:09.469 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process phantompeakqualtools (2018_053_S8_R1_001) > jobId: 22044600; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/07/a38985812ecf31371b3838f030935a
Mar-08 14:11:09.470 [Task submitter] INFO  nextflow.Session - [07/a38985] Submitted process > phantompeakqualtools (2018_053_S8_R1_001)
Mar-08 14:11:10.713 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process picard (2018_052_S6_R1_001) > jobId: 22044601; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/ac/cdaaaa677d11be595ceac321c644c4
Mar-08 14:11:10.714 [Task submitter] INFO  nextflow.Session - [ac/cdaaaa] Submitted process > picard (2018_052_S6_R1_001)
Mar-08 14:11:11.943 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process picard (2018_033_S6_R1_001) > jobId: 22044602; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/18/554371fd61317370002512c27ab9d8
Mar-08 14:11:11.946 [Task submitter] INFO  nextflow.Session - [18/554371] Submitted process > picard (2018_033_S6_R1_001)
Mar-08 14:11:13.195 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process picard (2018_034_S7_R1_001) > jobId: 22044603; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/57/a3c509e0f1899c19ad92cd990f83c7
Mar-08 14:11:13.195 [Task submitter] INFO  nextflow.Session - [57/a3c509] Submitted process > picard (2018_034_S7_R1_001)
Mar-08 14:11:14.448 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process picard (2018_031_S4_R1_001) > jobId: 22044604; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/8f/50424587022232cf922a36aa7cec34
Mar-08 14:11:14.449 [Task submitter] INFO  nextflow.Session - [8f/504245] Submitted process > picard (2018_031_S4_R1_001)
Mar-08 14:11:15.654 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process bwa (2018_022_S4_R1_001) > jobId: 22044605; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/c8/d1403871bcb328664fec9141220ece
Mar-08 14:11:15.655 [Task submitter] INFO  nextflow.Session - [c8/d14038] Submitted process > bwa (2018_022_S4_R1_001)
Mar-08 14:11:16.872 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process bwa (2018_020_S2_R1_001) > jobId: 22044606; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/bc/e7435f1150891e42342a97ebd4a305
Mar-08 14:11:16.872 [Task submitter] INFO  nextflow.Session - [bc/e7435f] Submitted process > bwa (2018_020_S2_R1_001)
Mar-08 14:11:18.076 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process bwa (2018_027_S9_R1_001) > jobId: 22044608; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/b1/167309bdfd8f454b77c6d397af2bc4
Mar-08 14:11:18.077 [Task submitter] INFO  nextflow.Session - [b1/167309] Submitted process > bwa (2018_027_S9_R1_001)
Mar-08 14:11:19.289 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process bwa (2018_021_S3_R1_001) > jobId: 22044609; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/92/435e2b898816628e4344e1815b5a3b
Mar-08 14:11:19.289 [Task submitter] INFO  nextflow.Session - [92/435e2b] Submitted process > bwa (2018_021_S3_R1_001)
Mar-08 14:11:20.996 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process bwa (2018_024_S6_R1_001) > jobId: 22044610; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/e1/7d25e61ab6f7ce4fa1f8a3239672fd
Mar-08 14:11:20.997 [Task submitter] INFO  nextflow.Session - [e1/7d25e6] Submitted process > bwa (2018_024_S6_R1_001)
Mar-08 14:11:23.479 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process bwa (2018_025_S7_R1_001) > jobId: 22044612; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/89/ad8c77210e80d43e718ddb5e68c1d8
Mar-08 14:11:23.480 [Task submitter] INFO  nextflow.Session - [89/ad8c77] Submitted process > bwa (2018_025_S7_R1_001)
Mar-08 14:11:24.658 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process bwa (2018_019_S1_R1_001) > jobId: 22044614; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/a9/3ffffaed3cff37f05e55559db768b3
Mar-08 14:11:24.659 [Task submitter] INFO  nextflow.Session - [a9/3ffffa] Submitted process > bwa (2018_019_S1_R1_001)
Mar-08 14:11:25.862 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process bwa (2018_026_S8_R1_001) > jobId: 22044616; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/02/a1f68fbfa349f56abd82cd75815d98
Mar-08 14:11:25.863 [Task submitter] INFO  nextflow.Session - [02/a1f68f] Submitted process > bwa (2018_026_S8_R1_001)
Mar-08 14:11:27.079 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process phantompeakqualtools (2018_032_S5_R1_001) > jobId: 22044617; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/8b/0a8e26a3582b7c4c23e60c267070da
Mar-08 14:11:27.080 [Task submitter] INFO  nextflow.Session - [8b/0a8e26] Submitted process > phantompeakqualtools (2018_032_S5_R1_001)
Mar-08 14:11:28.349 [Task submitter] DEBUG nextflow.executor.GridTaskHandler - [SLURM] submitted process bwa (2018_023_S5_R1_001) > jobId: 22044618; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/15/1d51d500b4f6c3d1bd279b7f61b9d7
Mar-08 14:11:28.350 [Task submitter] INFO  nextflow.Session - [15/1d51d5] Submitted process > bwa (2018_023_S5_R1_001)
Mar-08 14:16:09.743 [Task monitor] DEBUG n.processor.TaskPollingMonitor - !! executor slurm > tasks to be completed: 17 -- pending tasks are shown below
~> TaskHandler[jobId: 22044598; id: 9; name: bwa (2018_054_S7_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/15/0cc411078dcf0cf0c0985a84d6f9bb started: 1552072384737; exited: -; ]
~> TaskHandler[jobId: 22044599; id: 66; name: samtools (2018_050_S4_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/22/fa7955bdc319ba4bf1307394d87f73 started: 1552072386707; exited: -; ]
~> TaskHandler[jobId: 22044600; id: 95; name: phantompeakqualtools (2018_053_S8_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/07/a38985812ecf31371b3838f030935a started: 1552072386715; exited: -; ]
~> TaskHandler[jobId: 22044601; id: 97; name: picard (2018_052_S6_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/ac/cdaaaa677d11be595ceac321c644c4 started: 1552072386734; exited: -; ]
~> TaskHandler[jobId: 22044602; id: 132; name: picard (2018_033_S6_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/18/554371fd61317370002512c27ab9d8 started: 1552072386739; exited: -; ]
~> TaskHandler[jobId: 22044603; id: 131; name: picard (2018_034_S7_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/57/a3c509e0f1899c19ad92cd990f83c7 started: 1552072386746; exited: -; ]
~> TaskHandler[jobId: 22044604; id: 137; name: picard (2018_031_S4_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/8f/50424587022232cf922a36aa7cec34 started: 1552072386752; exited: -; ]
~> TaskHandler[jobId: 22044605; id: 150; name: bwa (2018_022_S4_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/c8/d1403871bcb328664fec9141220ece started: 1552072386758; exited: -; ]
~> TaskHandler[jobId: 22044606; id: 161; name: bwa (2018_020_S2_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/bc/e7435f1150891e42342a97ebd4a305 started: 1552072386763; exited: -; ]
~> TaskHandler[jobId: 22044608; id: 157; name: bwa (2018_027_S9_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/b1/167309bdfd8f454b77c6d397af2bc4 started: 1552072386771; exited: -; ]
.. remaining tasks omitted.
Mar-08 14:21:09.771 [Task monitor] DEBUG n.processor.TaskPollingMonitor - !! executor slurm > tasks to be completed: 17 -- pending tasks are shown below
~> TaskHandler[jobId: 22044598; id: 9; name: bwa (2018_054_S7_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/15/0cc411078dcf0cf0c0985a84d6f9bb started: 1552072384737; exited: -; ]
~> TaskHandler[jobId: 22044599; id: 66; name: samtools (2018_050_S4_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/22/fa7955bdc319ba4bf1307394d87f73 started: 1552072386707; exited: -; ]
~> TaskHandler[jobId: 22044600; id: 95; name: phantompeakqualtools (2018_053_S8_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/07/a38985812ecf31371b3838f030935a started: 1552072386715; exited: -; ]
~> TaskHandler[jobId: 22044601; id: 97; name: picard (2018_052_S6_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/ac/cdaaaa677d11be595ceac321c644c4 started: 1552072386734; exited: -; ]
~> TaskHandler[jobId: 22044602; id: 132; name: picard (2018_033_S6_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/18/554371fd61317370002512c27ab9d8 started: 1552072386739; exited: -; ]
~> TaskHandler[jobId: 22044603; id: 131; name: picard (2018_034_S7_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/57/a3c509e0f1899c19ad92cd990f83c7 started: 1552072386746; exited: -; ]
~> TaskHandler[jobId: 22044604; id: 137; name: picard (2018_031_S4_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/8f/50424587022232cf922a36aa7cec34 started: 1552072386752; exited: -; ]
~> TaskHandler[jobId: 22044605; id: 150; name: bwa (2018_022_S4_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/c8/d1403871bcb328664fec9141220ece started: 1552072386758; exited: -; ]
~> TaskHandler[jobId: 22044606; id: 161; name: bwa (2018_020_S2_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/bc/e7435f1150891e42342a97ebd4a305 started: 1552072386763; exited: -; ]
~> TaskHandler[jobId: 22044608; id: 157; name: bwa (2018_027_S9_R1_001); status: RUNNING; exit: -; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/b1/167309bdfd8f454b77c6d397af2bc4 started: 1552072386771; exited: -; ]
.. remaining tasks omitted.
Mar-08 14:24:15.665 [Task monitor] DEBUG n.processor.TaskPollingMonitor - Task completed > TaskHandler[jobId: 22044617; id: 178; name: phantompeakqualtools (2018_032_S5_R1_001); status: COMPLETED; exit: 137; error: -; workDir: /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/8b/0a8e26a3582b7c4c23e60c267070da started: 1552072386811; exited: 2019-03-08T19:24:11.087069Z; ]
Mar-08 14:24:15.716 [Task monitor] ERROR nextflow.processor.TaskProcessor - Error executing process > 'phantompeakqualtools (2018_032_S5_R1_001)'

Caused by:
  Process `phantompeakqualtools (2018_032_S5_R1_001)` terminated with an error exit status (137)

Command executed:

  run_spp.r -c="2018_032_S5_R1_001.dedup.sorted.bam" -savp -out="2018_032_S5_R1_001.spp.out"

Command exit status:
  137

Command output:
  ################
  ChIP data: 2018_032_S5_R1_001.dedup.sorted.bam 
  Control data: NA 
  strandshift(min): -500 
  strandshift(step): 5 
  strandshift(max) 1500 
  user-defined peak shift NA 
  exclusion(min): 10 
  exclusion(max): NaN 
  num parallel nodes: NA 
  FDR threshold: 0.01 
  NumPeaks Threshold: NA 
  Output Directory: . 
  narrowPeak output file name: NA 
  regionPeak output file name: NA 
  Rdata filename: NA 
  plot pdf filename: ./2018_032_S5_R1_001.dedup.sorted.pdf 
  result filename: 2018_032_S5_R1_001.spp.out 
  Overwrite files?: FALSE
  
  [1] TRUE
  Reading ChIP tagAlign/BAM file 2018_032_S5_R1_001.dedup.sorted.bam 
  opened /tmp/Rtmp2eTnWZ/2018_032_S5_R1_001.dedup.sorted.tagAlign1a0037df6a45
  done. read 37330015 fragments
  ChIP data read length 76 
  [1] TRUE
  Calculating peak characteristics

Command error:
  Loading required package: Rcpp
  .command.sh: line 2:  6656 Killed                  run_spp.r -c="2018_032_S5_R1_001.dedup.sorted.bam" -savp -out="2018_032_S5_R1_001.spp.out"

Work dir:
  /gpfs/gsfs11/users/capaldobj/CS02314X-ChIP-seq-results/work/8b/0a8e26a3582b7c4c23e60c267070da

Tip: you can replicate the issue by changing to the process work dir and entering the command `bash .command.run`
Mar-08 14:24:15.751 [Task monitor] DEBUG nextflow.Session - Session aborted -- Cause: Process `phantompeakqualtools (2018_032_S5_R1_001)` terminated with an error exit status (137)
Mar-08 14:24:15.825 [Task monitor] DEBUG nextflow.Session - The following nodes are still active:
[process] samtools
  status=ACTIVE
  port 0: (queue) OPEN; channel: bam
  port 1: (cntrl) OPEN; channel: $

[process] bwa_mapped
  status=ACTIVE
  port 0: (value) -   ; channel: input_files
  port 1: (value) -   ; channel: bai
  port 2: (cntrl) OPEN; channel: $

[process] picard
  status=ACTIVE
  port 0: (queue) OPEN; channel: bam
  port 1: (cntrl) OPEN; channel: $

[process] countstat
  status=ACTIVE
  port 0: (value) -   ; channel: input
  port 1: (cntrl) OPEN; channel: $

[process] phantompeakqualtools
  status=ACTIVE
  port 0: (queue) OPEN; channel: bam
  port 1: (cntrl) OPEN; channel: $

[process] calculateNSCRSC
  status=ACTIVE
  port 0: (value) -   ; channel: spp_out_list
  port 1: (cntrl) OPEN; channel: $

[process] deepTools
  status=ACTIVE
  port 0: (value) -   ; channel: bam
  port 1: (value) -   ; channel: bai
  port 2: (cntrl) OPEN; channel: $

[process] ngsplot
  status=ACTIVE
  port 0: (value) -   ; channel: input_bam_files
  port 1: (value) -   ; channel: input_bai_files
  port 2: (cntrl) OPEN; channel: $

[process] macs
  status=ACTIVE
  port 0: (value) -   ; channel: bam_for_macs
  port 1: (value) -   ; channel: bai_for_macs
  port 2: (queue) OPEN; channel: -
  port 3: (cntrl) OPEN; channel: $

[process] saturation
  status=ACTIVE
  port 0: (value) -   ; channel: bam_for_saturation
  port 1: (value) -   ; channel: bai_for_saturation
  port 2: (queue) OPEN; channel: -
  port 3: (value) -   ; channel: __$eachinparam<3>
  port 4: (cntrl) OPEN; channel: $

[process] saturation_r
  status=ACTIVE
  port 0: (value) -   ; channel: macsconfig
  port 1: (value) -   ; channel: countstat
  port 2: (value) -   ; channel: saturation_results_collection
  port 3: (cntrl) OPEN; channel: $

[process] chippeakanno
  status=ACTIVE
  port 0: (value) -   ; channel: macs_peaks_collection
  port 1: (value) -   ; channel: gtf
  port 2: (cntrl) OPEN; channel: $

[process] multiqc
  status=ACTIVE
  port 0: (value) -   ; channel: multiqc_config
  port 1: (value) -   ; channel: fastqc
  port 2: (value) -   ; channel: trimgalore/*
  port 3: (value) -   ; channel: samtools/*
  port 4: (value) -   ; channel: picard/*
  port 5: (cntrl) OPEN; channel: $

[process] output_documentation
  status=ACTIVE
  port 0: (value) -   ; channel: prefix
  port 1: (value) -   ; channel: output
  port 2: (cntrl) OPEN; channel: $

Mar-08 14:24:15.832 [main] DEBUG nextflow.Session - Session await > all process finished
Mar-08 14:24:15.833 [main] DEBUG nextflow.Session - Session await > all barriers passed
Mar-08 14:24:16.151 [Task monitor] DEBUG n.processor.TaskPollingMonitor - <<< barrier arrives (monitor: slurm)
Mar-08 14:24:16.255 [main] INFO  nextflow.Nextflow - [nf-core/chipseq] Pipeline Complete
Mar-08 14:24:16.269 [main] WARN  n.processor.TaskPollingMonitor - Killing pending tasks (16)
Mar-08 14:24:16.381 [main] DEBUG nextflow.trace.StatsObserver - Workflow completed > WorkflowStats[succeedCount=0; failedCount=17; ignoredCount=0; cachedCount=163; succeedDuration=0ms; failedDuration=11m 2s; cachedDuration=5d 15h 26m 38s]
Mar-08 14:24:16.381 [main] DEBUG nextflow.trace.TraceFileObserver - Flow completing -- flushing trace file
Mar-08 14:24:16.398 [main] DEBUG nextflow.trace.ReportObserver - Flow completing -- rendering html report
Mar-08 14:24:16.590 [main] DEBUG nextflow.trace.ReportObserver - Execution report summary data:
  {"phantompeakqualtools":{"cpu":{"mean":4338.2,"min":1090.5,"q1":2266.67,"q2":3313,"q3":6796.1,"max":9121,"minLabel":"phantompeakqualtools (2018_039_S3_R1_001)","maxLabel":"phantompeakqualtools (2018_032_S5_R1_001)","q1Label":"phantompeakqualtools (2018_043_S6_R1_001)","q2Label":"phantompeakqualtools (2018_040_S4_R1_001)","q3Label":"phantompeakqualtools (2018_029_S2_R1_001)"},"mem":{"mean":1573037260.8,"min":749498368,"q1":1069484032,"q2":1346160640,"q3":2398029824,"max":2488328192,"minLabel":"phantompeakqualtools (2018_039_S3_R1_001)","maxLabel":"phantompeakqualtools (2018_028_S1_R1_001)","q1Label":"phantompeakqualtools (2018_037_S1_R1_001)","q2Label":"phantompeakqualtools (2018_040_S4_R1_001)","q3Label":"phantompeakqualtools (2018_036_S9_R1_001)"},"time":{"mean":1069263.25,"min":265963,"q1":580832,"q2":715079,"q3":1453738.25,"max":2680287,"minLabel":"phantompeakqualtools (2018_039_S3_R1_001)","maxLabel":"phantompeakqualtools (2018_036_S9_R1_001)","q1Label":"phantompeakqualtools (2018_043_S6_R1_001)","q2Label":"phantompeakqualtools (2018_040_S4_R1_001)","q3Label":"phantompeakqualtools (2018_047_S2_R1_001)"},"reads":{"mean":774731776,"min":0,"q1":0,"q2":1986560,"q3":1534476288,"max":3274293248,"minLabel":"phantompeakqualtools (2018_038_S2_R1_001)","maxLabel":"phantompeakqualtools (2018_032_S5_R1_001)","q1Label":"phantompeakqualtools (2018_048_S9_R1_001)","q2Label":"phantompeakqualtools (2018_037_S1_R1_001)","q3Label":"phantompeakqualtools (2018_028_S1_R1_001)"},"writes":{"mean":1718302720,"min":439996416,"q1":833373184,"q2":1312448512,"q3":2894243840,"max":3299008512,"minLabel":"phantompeakqualtools (2018_039_S3_R1_001)","maxLabel":"phantompeakqualtools (2018_030_S3_R1_001)","q1Label":"phantompeakqualtools (2018_037_S1_R1_001)","q2Label":"phantompeakqualtools (2018_040_S4_R1_001)","q3Label":"phantompeakqualtools (2018_029_S2_R1_001)"},"cpuUsage":{"mean":4338.2,"min":1090.5,"q1":2266.67,"q2":3313,"q3":6796.1,"max":9121,"minLabel":"phantompeakqualtools (2018_039_S3_R1_001)","maxLabel":"phantompeakqualtools (2018_032_S5_R1_001)","q1Label":"phantompeakqualtools (2018_043_S6_R1_001)","q2Label":"phantompeakqualtools (2018_040_S4_R1_001)","q3Label":"phantompeakqualtools (2018_029_S2_R1_001)"},"memUsage":null,"timeUsage":null},"fastqc":{"cpu":{"mean":166.56,"min":129.7,"q1":147.38,"q2":155.7,"q3":173.13,"max":313.9,"minLabel":"fastqc (2018_051_S5_R1_001)","maxLabel":"fastqc (2018_046_S1_R1_001)","q1Label":"fastqc (2018_034_S7_R1_001)","q2Label":"fastqc (2018_024_S6_R1_001)","q3Label":"fastqc (2018_042_S9_R1_001)"},"mem":{"mean":2796563228.44,"min":2720681984,"q1":2798485504,"q2":2798485504,"q3":2802704384,"max":2802704384,"minLabel":"fastqc (2018_044_S7_R1_001)","maxLabel":"fastqc (2018_025_S7_R1_001)","q1Label":"fastqc (2018_038_S2_R1_001)","q2Label":"fastqc (2018_036_S9_R1_001)","q3Label":"fastqc (2018_046_S1_R1_001)"},"time":{"mean":528830.06,"min":131561,"q1":346906,"q2":465431.5,"q3":677793.25,"max":1260941,"minLabel":"fastqc (2018_042_S9_R1_001)","maxLabel":"fastqc (2018_024_S6_R1_001)","q1Label":"fastqc (2018_039_S3_R1_001)","q2Label":"fastqc (2018_049_S3_R1_001)","q3Label":"fastqc (2018_048_S9_R1_001)"},"reads":{"mean":22528,"min":0,"q1":0,"q2":0,"q3":0,"max":253952,"minLabel":"fastqc (2018_054_S7_R1_001)","maxLabel":"fastqc (2018_048_S9_R1_001)","q1Label":"fastqc (2018_050_S4_R1_001)","q2Label":"fastqc (2018_028_S1_R1_001)","q3Label":"fastqc (2018_025_S7_R1_001)"},"writes":{"mean":712248.89,"min":237568,"q1":596992,"q2":686080,"q3":791552,"max":1232896,"minLabel":"fastqc 
(2018_044_S7_R1_001)","maxLabel":"fastqc (2018_020_S2_R1_001)","q1Label":"fastqc (2018_022_S4_R1_001)","q2Label":"fastqc (2018_053_S8_R1_001)","q3Label":"fastqc (2018_039_S3_R1_001)"},"cpuUsage":{"mean":166.56,"min":129.7,"q1":147.38,"q2":155.7,"q3":173.13,"max":313.9,"minLabel":"fastqc (2018_051_S5_R1_001)","maxLabel":"fastqc (2018_046_S1_R1_001)","q1Label":"fastqc (2018_034_S7_R1_001)","q2Label":"fastqc (2018_024_S6_R1_001)","q3Label":"fastqc (2018_042_S9_R1_001)"},"memUsage":null,"timeUsage":null},"bwa":{"cpu":{"mean":289.15,"min":258.5,"q1":284.13,"q2":289.55,"q3":298.6,"max":303.4,"minLabel":"bwa (2018_037_S1_R1_001)","maxLabel":"bwa (2018_036_S9_R1_001)","q1Label":"bwa (2018_041_S5_R1_001)","q2Label":"bwa (2018_053_S8_R1_001)","q3Label":"bwa (2018_040_S4_R1_001)"},"mem":{"mean":5981120827.08,"min":5913477120,"q1":5959307264,"q2":5984344064,"q3":6026416128,"max":6026420224,"minLabel":"bwa (2018_042_S9_R1_001)","maxLabel":"bwa (2018_032_S5_R1_001)","q1Label":"bwa (2018_046_S1_R1_001)","q2Label":"bwa (2018_044_S7_R1_001)","q3Label":"bwa (2018_033_S6_R1_001)"},"time":{"mean":10968067.62,"min":3412936,"q1":5003361.75,"q2":12097131.5,"q3":15049303.75,"max":17984680,"minLabel":"bwa (2018_037_S1_R1_001)","maxLabel":"bwa (2018_050_S4_R1_001)","q1Label":"bwa (2018_038_S2_R1_001)","q2Label":"bwa (2018_030_S3_R1_001)","q3Label":"bwa (2018_048_S9_R1_001)"},"reads":{"mean":1703030153.85,"min":0,"q1":0,"q2":1168095232,"q3":2455827456,"max":5421043712,"minLabel":"bwa (2018_050_S4_R1_001)","maxLabel":"bwa (2018_046_S1_R1_001)","q1Label":"bwa (2018_035_S8_R1_001)","q2Label":"bwa (2018_040_S4_R1_001)","q3Label":"bwa (2018_053_S8_R1_001)"},"writes":null,"cpuUsage":{"mean":289.15,"min":258.5,"q1":284.13,"q2":289.55,"q3":298.6,"max":303.4,"minLabel":"bwa (2018_037_S1_R1_001)","maxLabel":"bwa (2018_036_S9_R1_001)","q1Label":"bwa (2018_041_S5_R1_001)","q2Label":"bwa (2018_053_S8_R1_001)","q3Label":"bwa (2018_040_S4_R1_001)"},"memUsage":null,"timeUsage":null},"picard":{"cpu":{"mean":10135.21,"min":4592.1,"q1":5701.5,"q2":8484.2,"q3":14484.2,"max":19126,"minLabel":"picard (2018_038_S2_R1_001)","maxLabel":"picard (2018_032_S5_R1_001)","q1Label":"picard (2018_040_S4_R1_001)","q2Label":"picard (2018_045_S8_R1_001)","q3Label":"picard (2018_047_S2_R1_001)"},"mem":{"mean":14146164053.33,"min":14145347584,"q1":14145347584,"q2":14145347584,"q3":14147796992,"max":14147796992,"minLabel":"picard (2018_042_S9_R1_001)","maxLabel":"picard (2018_036_S9_R1_001)","q1Label":"picard (2018_043_S6_R1_001)","q2Label":"picard (2018_039_S3_R1_001)","q3Label":"picard (2018_029_S2_R1_001)"},"time":{"mean":1173738.38,"min":343956,"q1":583586,"q2":870862,"q3":1849688,"max":2473377,"minLabel":"picard (2018_039_S3_R1_001)","maxLabel":"picard (2018_030_S3_R1_001)","q1Label":"picard (2018_043_S6_R1_001)","q2Label":"picard (2018_045_S8_R1_001)","q3Label":"picard (2018_029_S2_R1_001)"},"reads":{"mean":3955370.67,"min":0,"q1":0,"q2":114688,"q3":9592832,"max":13942784,"minLabel":"picard (2018_042_S9_R1_001)","maxLabel":"picard (2018_041_S5_R1_001)","q1Label":"picard (2018_044_S7_R1_001)","q2Label":"picard (2018_053_S8_R1_001)","q3Label":"picard (2018_029_S2_R1_001)"},"writes":{"mean":3644983198.48,"min":658079744,"q1":1397493760,"q2":2149326848,"q3":6666387456,"max":8624631808,"minLabel":"picard (2018_039_S3_R1_001)","maxLabel":"picard (2018_030_S3_R1_001)","q1Label":"picard (2018_043_S6_R1_001)","q2Label":"picard (2018_049_S3_R1_001)","q3Label":"picard 
(2018_029_S2_R1_001)"},"cpuUsage":{"mean":10135.21,"min":4592.1,"q1":5701.5,"q2":8484.2,"q3":14484.2,"max":19126,"minLabel":"picard (2018_038_S2_R1_001)","maxLabel":"picard (2018_032_S5_R1_001)","q1Label":"picard (2018_040_S4_R1_001)","q2Label":"picard (2018_045_S8_R1_001)","q3Label":"picard (2018_047_S2_R1_001)"},"memUsage":null,"timeUsage":null},"samtools":{"cpu":{"mean":12803.65,"min":3095.1,"q1":8623.3,"q2":14986.2,"q3":16612.1,"max":21462.6,"minLabel":"samtools (2018_038_S2_R1_001)","maxLabel":"samtools (2018_033_S6_R1_001)","q1Label":"samtools (2018_046_S1_R1_001)","q2Label":"samtools (2018_035_S8_R1_001)","q3Label":"samtools (2018_034_S7_R1_001)"},"mem":{"mean":1116490956.8,"min":1093816320,"q1":1094643712,"q2":1096290304,"q3":1140183040,"max":1140293632,"minLabel":"samtools (2018_038_S2_R1_001)","maxLabel":"samtools (2018_042_S9_R1_001)","q1Label":"samtools (2018_049_S3_R1_001)","q2Label":"samtools (2018_033_S6_R1_001)","q3Label":"samtools (2018_029_S2_R1_001)"},"time":{"mean":1313877.88,"min":376606,"q1":651645,"q2":1635513,"q3":1826202,"max":2492465,"minLabel":"samtools (2018_038_S2_R1_001)","maxLabel":"samtools (2018_033_S6_R1_001)","q1Label":"samtools (2018_044_S7_R1_001)","q2Label":"samtools (2018_052_S6_R1_001)","q3Label":"samtools (2018_036_S9_R1_001)"},"reads":{"mean":559841.28,"min":0,"q1":0,"q2":0,"q3":0,"max":6692864,"minLabel":"samtools (2018_042_S9_R1_001)","maxLabel":"samtools (2018_046_S1_R1_001)","q1Label":"samtools (2018_041_S5_R1_001)","q2Label":"samtools (2018_031_S4_R1_001)","q3Label":"samtools (2018_032_S5_R1_001)"},"writes":{"mean":5336826183.68,"min":1203060736,"q1":2080759808,"q2":6091927552,"q3":7916625920,"max":10131423232,"minLabel":"samtools (2018_038_S2_R1_001)","maxLabel":"samtools (2018_033_S6_R1_001)","q1Label":"samtools (2018_040_S4_R1_001)","q2Label":"samtools (2018_053_S8_R1_001)","q3Label":"samtools (2018_031_S4_R1_001)"},"cpuUsage":{"mean":12803.65,"min":3095.1,"q1":8623.3,"q2":14986.2,"q3":16612.1,"max":21462.6,"minLabel":"samtools (2018_038_S2_R1_001)","maxLabel":"samtools (2018_033_S6_R1_001)","q1Label":"samtools (2018_046_S1_R1_001)","q2Label":"samtools (2018_035_S8_R1_001)","q3Label":"samtools (2018_034_S7_R1_001)"},"memUsage":null,"timeUsage":null},"trim_galore":{"cpu":{"mean":69418.63,"min":232.7,"q1":33089.3,"q2":64586.8,"q3":85154,"max":143488.1,"minLabel":"trim_galore (2018_023_S5_R1_001)","maxLabel":"trim_galore (2018_025_S7_R1_001)","q1Label":"trim_galore (2018_042_S9_R1_001)","q2Label":"trim_galore (2018_034_S7_R1_001)","q3Label":"trim_galore (2018_054_S7_R1_001)"},"mem":{"mean":2918290090.67,"min":2860482560,"q1":2911741952,"q2":2938286080,"q3":2938286080,"max":2942504960,"minLabel":"trim_galore (2018_040_S4_R1_001)","maxLabel":"trim_galore (2018_024_S6_R1_001)","q1Label":"trim_galore (2018_032_S5_R1_001)","q2Label":"trim_galore (2018_037_S1_R1_001)","q3Label":"trim_galore (2018_019_S1_R1_001)"},"time":{"mean":2921424.61,"min":947308,"q1":2066396.75,"q2":2594724,"q3":3789641.5,"max":5637355,"minLabel":"trim_galore (2018_043_S6_R1_001)","maxLabel":"trim_galore (2018_025_S7_R1_001)","q1Label":"trim_galore (2018_039_S3_R1_001)","q2Label":"trim_galore (2018_036_S9_R1_001)","q3Label":"trim_galore (2018_054_S7_R1_001)"},"reads":{"mean":731363.56,"min":0,"q1":0,"q2":0,"q3":964608,"max":5824512,"minLabel":"trim_galore (2018_050_S4_R1_001)","maxLabel":"trim_galore (2018_031_S4_R1_001)","q1Label":"trim_galore (2018_035_S8_R1_001)","q2Label":"trim_galore (2018_019_S1_R1_001)","q3Label":"trim_galore 
(2018_037_S1_R1_001)"},"writes":{"mean":524401.78,"min":135168,"q1":318464,"q2":495616,"q3":722944,"max":1032192,"minLabel":"trim_galore (2018_040_S4_R1_001)","maxLabel":"trim_galore (2018_024_S6_R1_001)","q1Label":"trim_galore (2018_036_S9_R1_001)","q2Label":"trim_galore (2018_031_S4_R1_001)","q3Label":"trim_galore (2018_019_S1_R1_001)"},"cpuUsage":{"mean":69418.63,"min":232.7,"q1":33089.3,"q2":64586.8,"q3":85154,"max":143488.1,"minLabel":"trim_galore (2018_023_S5_R1_001)","maxLabel":"trim_galore (2018_025_S7_R1_001)","q1Label":"trim_galore (2018_042_S9_R1_001)","q2Label":"trim_galore (2018_034_S7_R1_001)","q3Label":"trim_galore (2018_054_S7_R1_001)"},"memUsage":null,"timeUsage":null}}
Mar-08 14:24:17.433 [main] DEBUG nextflow.trace.TimelineObserver - Flow completing -- rendering html timeline
Mar-08 14:24:17.564 [main] DEBUG nextflow.CacheDB - Closing CacheDB done
Mar-08 14:24:17.615 [main] DEBUG nextflow.script.ScriptRunner - > Execution complete -- Goodbye
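
For what it's worth, exit status 137 in the log above means the task was killed with SIGKILL, which on a cluster usually means it exceeded its memory allocation. A hedged sketch of a custom config that gives just this process more headroom (the limits are illustrative, not tested recommendations; pass the file with -c):

// custom.config -- hedged sketch; exit 137 is typically the scheduler
// enforcing the memory limit, so raise it for the one failing process.
process {
    withName: phantompeakqualtools {
        memory = 32.GB
        time   = 12.h
    }
}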

What does the error mean when checking design.csv

Error message:

Command wrapper:
  ERROR: Please check the design file: Replicate IDs must start with 1..<num_replicates>
  Group: IMR90-prol-5mC-2, Replicate IDs: [2]

And here is my design.csv; I am not sure why the error was thrown.

group,replicate,fastq_1,fastq_2,antibody,control
IMR90-prol-5mC-1,1,AS-355559-LR-44293_R1.fastq.gz,,5mC,IMR90-prol-input
IMR90-prol-5mC-2,2,AS-355574-LR-44294_R1.fastq.gz,,5mC,IMR90-prol-input
IMR90-prol-5mC-3,3,AS-355589-LR-44295_R1.fastq.gz,,5mC,IMR90-prol-input
IMR90-prol-6mA-1,1,AS-355561-LR-44293_R1.fastq.gz,,6mA,IMR90-prol-input
IMR90-prol-6mA-2,2,AS-355576-LR-44294_R1.fastq.gz,,6mA,IMR90-prol-input
IMR90-prol-6mA-3,3,AS-355591-LR-44295_R1.fastq.gz,,6mA,IMR90-prol-input
IMR90-prol-input,1,AS-355536-LR-44292_R1.fastq.gz,,,
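
A hedged reading of the check: replicate IDs are validated within each group, so a group whose only row has replicate 2 or 3 can never satisfy "must start with 1". A minimal sketch of a layout that should pass, assuming the -1/-2/-3 suffixes were meant to mark replicates of a single group:

group,replicate,fastq_1,fastq_2,antibody,control
IMR90-prol-5mC,1,AS-355559-LR-44293_R1.fastq.gz,,5mC,IMR90-prol-input
IMR90-prol-5mC,2,AS-355574-LR-44294_R1.fastq.gz,,5mC,IMR90-prol-input
IMR90-prol-5mC,3,AS-355589-LR-44295_R1.fastq.gz,,5mC,IMR90-prol-input
IMR90-prol-6mA,1,AS-355561-LR-44293_R1.fastq.gz,,6mA,IMR90-prol-input
IMR90-prol-6mA,2,AS-355576-LR-44294_R1.fastq.gz,,6mA,IMR90-prol-input
IMR90-prol-6mA,3,AS-355591-LR-44295_R1.fastq.gz,,6mA,IMR90-prol-input
IMR90-prol-input,1,AS-355536-LR-44292_R1.fastq.gz,,,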

Any suggestions?
Thanks!

plotFingerprint should only run on the bams that are used in the macs process

Ideally, plotFingerprint would be run only on the IP and input bams that are used in the macs process, as opposed to all of them. This will take some rejigging of the code though...

chipseq/main.nf

Lines 871 to 878 in b4b75ae

if(!params.skipDupRemoval){
    chip = "-t ${chip_sample_id}.dedup.sorted.bam"
    ctrl = ctrl_sample_id == '' ? '' : "-c ${ctrl_sample_id}.dedup.sorted.bam"
}
else {
    chip = "-t ${chip_sample_id}.sorted.bam"
    ctrl = ctrl_sample_id == '' ? '' : "-c ${ctrl_sample_id}.sorted.bam"
}

Originally posted by @drpatelh in #71 (comment)

Reduce size of container

The container for this pipeline is kind of huge. It would be great if we could reduce the number of conda packages somehow and trim it down a bit. I'm pretty sure that there's a lot of stuff there that's not being used.

Picard doesn't work with modules on rackham

Running the picard step

picard MarkDuplicates \
    INPUT=ENCFF341BPJ_chr22.sorted.bam \
    OUTPUT=ENCFF341BPJ_chr22.dedup.bam \
    ASSUME_SORTED=true \
    REMOVE_DUPLICATES=true \
    METRICS_FILE=ENCFF341BPJ_chr22.picardDupMetrics.txt \
    VALIDATION_STRINGENCY=LENIENT \
    PROGRAM_RECORD_ID='null'

gives a "picard: command not found" error (all modules, picard/2.10.3, samtools/1.5 and BEDTools/2.26.0, seem to load ok). Changing the picard call to this command works:

 java -jar $PICARD_HOME/picard.jar MarkDuplicates \
    INPUT=ENCFF341BPJ_chr22.sorted.bam \
    OUTPUT=ENCFF341BPJ_chr22.dedup.bam \
    ASSUME_SORTED=true \
    REMOVE_DUPLICATES=true \
    METRICS_FILE=ENCFF341BPJ_chr22.picardDupMetrics.txt \
    VALIDATION_STRINGENCY=LENIENT \
    PROGRAM_RECORD_ID='null'

Not sure if we will have the same problem when running from the Docker container...

Time out error running BWA

I'm getting an error running BWA: the job seems to finish ok, with output files being written and log files indicating no error. But somehow the job is not recognized as done, so it keeps on running until it's killed due to the time limit.

This was run on Uppmax/bianca, using singularity and nextflow v 0.30.1 and v0.31.0.

The last lines in .command.log

[main] Version: 0.7.17-r1188
[main] CMD: bwa mem -M BWAIndex/genome.fa BEA17P061a_103_trimmed.fq.gz
[main] Real time: 219866.716 sec; CPU: 220966.991 sec
slurmstepd: error: *** JOB 7269 ON sens2017533-b107 CANCELLED AT 2018-09-30T16:10:57 DUE TO TIME LIMIT ***

Here's the corresponding row in the trace file:
440 e6/1e6649 7269 bwa (BEA17P061a_103) FAILED - 2018-09-26 16:10:18.603 4d 15m 52s 4d 15m 47s 276.9% 6.2 GB 6.4 GB 47.4 GB 41.9 GB

This looks very similar to this issue in the RNA-seq pipeline: nf-core/rnaseq#81

Can't find reference genomes for ngsplot

When I run ngs.plot.r (using the hg19 genome), I get the following error:

 Error in CheckRegionAllowed(reg2plot, default.tbl) :
    Unknown region specified. Must be one of: bed
  Execution halted

It's the command

 ngs.plot.r \
      -G hg19 \
      -R genebody \
      -C ngsplot_config \
      -O Genebody \
      -D ensembl \
      -FL 300

throwing the error.

Some poking around from within the singularity container:

Running ngsplotdb.py shows that no genomes are installed for ngs.plot:

> ngsplotdb.py list
ID   Assembly Species  EnsVer   NPVer    InstalledFeatures    
>

But loading the nfcore-chipseq-1.0dev environment doesn't solve this, even though the hg19 genome is included in the environment:

> source activate nfcore-chipseq-1.0dev
(nfcore-chipseq-1.0dev) > ngsplotdb.py list
ID   Assembly Species  EnsVer   NPVer    InstalledFeatures    
(nfcore-chipseq-1.0dev) > conda list
...
r-ngsplot                 2.63                          1    bioconda
r-ngsplotdb-hg19          3.00                          2    bioconda
r-ngsplotdb-hg38          3.00                          2    bioconda
r-ngsplotdb-mm10          3.00                          1    bioconda
...

Improve tests

Currently, because we're not using human data, the tests don't run MACS2, the blacklisting, or several of the other pipeline processes.

Replace this data with something small from Human.

nf-core/chipseq blacklist error

Hi all,
The pipeline run with "--genome GRCh37" (or GRCh38) and "-profile docker" options exits with the error message "The requested file (.../GRCh37-blacklist.bed) could not be opened. Error message: (No such file or directory). Exiting!". Using the --blacklist option with a correct custom path raises the same error. Paths and the blacklist BED file format have of course been double-checked.
Could anyone help or point to a fix? Thanks a lot in advance.

Nextflow always runs standard profile

Let me start off by saying I'm using Nextflow 0.28.2, it's what is available on my HPC environment. If that's the issue, then so be it. I'm attempting to use the chipseq pipeline on a slurm cluster with singularity. The command I'm running (paths have been anonymized, but they are correct):

nextflow run ../chipseq-master/ \
  --singleEnd --reads '/../../sample.fastq.gz' \
  --macsconfig '../chipseq-master/conf/macsconfig' \
  --fasta '../gatk-resource/genome.fa' \
  --gtf '../gatk-resource/refGene.gtf' \
  --broad --saturation \
  --outdir '../nf-core-chip-seq/' \
  --email '...' \
  -with-singularity '/../../nfcore-chipseq.img' \
  -profile my_profile

This returns:

N E X T F L O W  ~  version 0.28.2
Launching `../chipseq-master/main.nf` [gigantic_avogadro] - revision: b11db350eb
ERROR ~ Unknown config attribute: container -- check config file: ../chipseq-master/nextflow.config

null

 -- Check '.nextflow.log' file for details

And the log file:

Jul-10 14:49:27.510 [main] DEBUG nextflow.cli.Launcher - Setting http proxy: [dtn07-e0, 3128]
Jul-10 14:49:27.562 [main] DEBUG nextflow.cli.Launcher - Setting https proxy: [dtn07-e0, 3128]
Jul-10 14:49:27.562 [main] DEBUG nextflow.cli.Launcher - $> /usr/local/apps/nextflow/0.28.2/bin/nextflow run ../chipseq-master/ --singleEnd --reads /../sample.fastq.gz --macsconfig ../chipseq-master/conf/macsconfig
Jul-10 14:49:27.642 [main] INFO  nextflow.cli.CmdRun - N E X T F L O W  ~  version 0.28.2
Jul-10 14:49:28.351 [main] INFO  nextflow.cli.CmdRun - Launching `/home/capaldobj/nf-core/chipseq-master/main.nf` [crazy_keller] - revision: b11db350eb
Jul-10 14:49:28.369 [main] DEBUG nextflow.config.ConfigBuilder - Found config base: /home/capaldobj/nf-core/chipseq-master/nextflow.config
Jul-10 14:49:28.371 [main] DEBUG nextflow.config.ConfigBuilder - Parsing config file: /home/capaldobj/nf-core/chipseq-master/nextflow.config
Jul-10 14:49:28.379 [main] DEBUG nextflow.config.ConfigBuilder - Applying config profile: `standard`
Jul-10 14:49:28.651 [main] DEBUG nextflow.config.ConfigBuilder - In the following config object the attribute `container` is empty:
  version='1.4dev'
  nf_required_version='0.30.1'
  reads='/../sample.fastq.gz'
  macsconfig='../chipseq-master/conf/macsconfig'
  multiqc_config=../chipseq-master/conf/multiqc_config.yaml
  extendReadsLen=100
  notrim=false
  allow_multi_align=false
  singleEnd=true
  saveReference=false
  saveTrimmed=false
  saveAlignedIntermediates=false
  saturation=false
  broad=false
  blacklist_filtering=false
  outdir='./results'
  igenomes_base='s3://ngi-igenomes/igenomes/'
  email=false
  plaintext_email=false
  max_memory=128 GB
  max_cpus=16
  max_time=10d

Jul-10 14:49:28.658 [main] ERROR nextflow.cli.Launcher - Unknown config attribute: container -- check config file: ../chipseq-master/nextflow.config

null
nextflow.exception.ConfigParseException: Unknown config attribute: container -- check config file: ../chipseq-master/nextflow.config
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:83)
	at org.codehaus.groovy.reflection.CachedConstructor.doConstructorInvoke(CachedConstructor.java:77)
	at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrap.callConstructor(ConstructorSite.java:84)
	at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:60)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:235)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:247)
	at nextflow.config.ConfigBuilder$_validate_closure7.doCall(ConfigBuilder.groovy:354)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
	at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
	at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:294)
	at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
	at groovy.lang.Closure.call(Closure.java:414)
	at org.codehaus.groovy.runtime.DefaultGroovyMethods.callClosureForMapEntry(DefaultGroovyMethods.java:5276)
	at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2117)
	at org.codehaus.groovy.runtime.dgm$164.invoke(Unknown Source)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoMetaMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:251)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.call(PogoMetaMethodSite.java:71)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
	at nextflow.config.ConfigBuilder.validate(ConfigBuilder.groovy:350)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
	at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
	at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:384)
	at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
	at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.callCurrent(PogoMetaClassSite.java:69)
	at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:174)
	at nextflow.config.ConfigBuilder$_validate_closure7.doCall(ConfigBuilder.groovy:356)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
	at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
	at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:294)
	at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
	at groovy.lang.Closure.call(Closure.java:414)
	at org.codehaus.groovy.runtime.DefaultGroovyMethods.callClosureForMapEntry(DefaultGroovyMethods.java:5276)
	at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2117)
	at org.codehaus.groovy.runtime.dgm$164.invoke(Unknown Source)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoMetaMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:251)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.call(PogoMetaMethodSite.java:71)
	at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
	at nextflow.config.ConfigBuilder.validate(ConfigBuilder.groovy:350)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:210)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:59)
	at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:174)
	at nextflow.config.ConfigBuilder.merge0(ConfigBuilder.groovy:324)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:210)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:59)
	at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:182)
	at nextflow.config.ConfigBuilder.buildConfig0(ConfigBuilder.groovy:273)
	at nextflow.config.ConfigBuilder$buildConfig0$2.callCurrent(Unknown Source)
	at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:174)
	at nextflow.config.ConfigBuilder.buildConfig(ConfigBuilder.groovy:243)
	at nextflow.config.ConfigBuilder$buildConfig$1.callCurrent(Unknown Source)
	at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:166)
	at nextflow.config.ConfigBuilder.configObject(ConfigBuilder.groovy:570)
	at nextflow.config.ConfigBuilder.build(ConfigBuilder.groovy:583)
	at nextflow.script.ScriptRunner.<init>(ScriptRunner.groovy:116)
	at nextflow.cli.CmdRun.run(CmdRun.groovy:207)
	at nextflow.cli.Launcher.run(Launcher.groovy:428)
	at nextflow.cli.Launcher.main(Launcher.groovy:582)

No matter what I do with the nextflow.config or with the -profile option, nextflow always runs the standard profile. I've even replaced the config in the standard profile with my configs, and it still doesn't work. Additionally, I changed the container param to point to my local singularity image. Am I doing something wrong in terms of specifying a profile?

Unknown config attribute: wf_container

Hello,

I'm trying to run the workflow using the Docker image, but I got the following error:
Unknown config attribute: wf_container.

Could you help me please?

.nextflow.log

mai-16 15:25:05.866 [main] INFO  nextflow.cli.CmdRun - N E X T F L O W  ~  version 0.28.2
mai-16 15:25:06.206 [main] INFO  nextflow.cli.CmdRun - Launching `/home/houtan/my-pipelines/chipseq/main.nf` [spontaneous_kilby] - revision: 571a81
ab77
mai-16 15:25:06.215 [main] DEBUG nextflow.config.ConfigBuilder - Found config base: /home/houtan/my-pipelines/chipseq/nextflow.config
mai-16 15:25:06.216 [main] DEBUG nextflow.config.ConfigBuilder - Parsing config file: /home/houtan/my-pipelines/chipseq/nextflow.config
mai-16 15:25:06.223 [main] DEBUG nextflow.config.ConfigBuilder - Applying config profile: `docker`
mai-16 15:25:06.310 [main] DEBUG nextflow.config.ConfigBuilder - In the following config object the attribute `wf_container` is empty:
  shell=['/bin/bash', '-euo', 'pipefail']
  executor='local'
  $executor {
  }
  time=2d

mai-16 15:25:06.316 [main] ERROR nextflow.cli.Launcher - Unknown config attribute: wf_container -- check config file: /home/houtan/my-pipelines/chi
pseq/nextflow.config

script

nextflow run  ~/my-pipelines/chipseq \
     -profile docker \
     --macsconfig 'macssetup.config' \
     --bwa_index "~/genome/hg38/bwa_index/" \
     --fasta "/home/houtan/genome/hg38/GRCh38.primary_assembly.genome.fa" \
     --gtf "/home/houtan/genome/hg38/gencode.v28.primary_assembly.annotation.gtf" \
     --max_cpus 1 \
     --reads 'data/*{1,2}.fastq.gz' \
     --outdir 'results_hg38_gencodev28/'

GTF and BED files into Channels

I have tested putting the GTF and BED files into channels. However, it somehow affected the channel of input FastQ files, so that no file names could be returned. This failure breaks the validation of samples against the MACS config file. When I commented out the validation scripts, the pipeline could still run without problems, meaning the input FastQ files were still there but somehow no file names could be obtained.

When I reverted the code and declared the GTF and BED as plain file objects, there was no more problem.

Update documentation

There is a lot of old outdated documentation in this pipeline. This needs resolving before the v1.0 release.

All MACS jobs get the same job name

This is just a small thing, but currently all MACS jobs get the same job name. It would be good to give each job a unique name (from the macs config file).
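
A minimal sketch of the idea, reusing the analysis_id that the macs process already receives from the MACS config channel (DSL1 syntax, as in the current main.nf; the script body is a placeholder):

process macs {
    tag "$analysis_id"

    input:
    set chip_sample_id, ctrl_sample_id, analysis_id from macs_para

    script:
    """
    echo "calling peaks for ${analysis_id}"
    """
}

With this, each task shows up in the logs as e.g. macs (my_analysis) instead of all sharing one name.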

error when running with profile test

I just tried running the test profile:

nextflow run nf-core/chipseq -profile test

and received this error:

Command executed:

  check_design.py design.csv design_reads.csv design_controls.csv

Command exit status:
  1

Command output:
  (empty)

Command error:
    File "/home/ryan/.nextflow/assets/nf-core/chipseq/bin/check_design.py", line 46
      print "{} header: {} != {}".format(ERROR_STR,','.join(header),','.join(HEADER))
                                ^
  SyntaxError: invalid syntax
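
For reference, line 46 as quoted is Python 2 print-statement syntax being executed by a Python 3 interpreter; as a Python 3 function call it would read:

print("{} header: {} != {}".format(ERROR_STR, ','.join(header), ','.join(HEADER)))

so the mismatch is between the script and the interpreter version, not the inputs.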

Do I need to provide a design file, or does the test download one?

Thanks!

plotFingerprint fails with too many bam files

With an experiment of 36 samples, plotFingerprint fails. Scaling back plotFingerprint to 3 bam files removed the error. This should be tested more rigorously to figure out the maximum number of bam files, or whether one of my bam files was corrupted. Ideally, the pipeline would use the macs.config file to run plotFingerprint in groups, or just each sample against its respective input.

Example error section:

ERROR ~ Error executing process > 'deepTools (2018_051_S5_R1_001.dedup.sorted)'
Caused by:
  Process `deepTools (2018_051_S5_R1_001.dedup.sorted)` terminated with an error exit status (1)
Command executed:
  plotFingerprint \
      -b 2018_051_S5_R1_001.dedup.sorted.bam 2018_054_S7_R1_001.dedup.sorted.bam 2018_046_S1_R1_001.dedup.sorted.bam 2018_037_S1_R1_001.dedup.sorted.bam 2018_042_S9_R1_001.dedup.sorted.bam 2018_043_S6_R1_001.dedup.sorted.bam 2018_040_S4_R1_001.dedup.sorted.bam 2018_038_S2_R1_001.dedup.sorted.bam 2018_053_S8_R1_001.dedup.sorted.bam 2018_048_S9_R1_001.dedup.sorted.bam 2018_044_S7_R1_001.dedup.sorted.bam 2018_039_S3_R1_001.dedup.sorted.bam 2018_052_S6_R1_001.dedup.sorted.bam 2018_045_S8_R1_001.dedup.sorted.bam 2018_050_S4_R1_001.dedup.sorted.bam 2018_041_S5_R1_001.dedup.sorted.bam 2018_047_S2_R1_001.dedup.sorted.bam 2018_049_S3_R1_001.dedup.sorted.bam 2018_030_S3_R1_001.dedup.sorted.bam 2018_034_S7_R1_001.dedup.sorted.bam 2018_033_S6_R1_001.dedup.sorted.bam 2018_031_S4_R1_001.dedup.sorted.bam 2018_035_S8_R1_001.dedup.sorted.bam 2018_029_S2_R1_001.dedup.sorted.bam 2018_036_S9_R1_001.dedup.sorted.bam 2018_032_S5_R1_001.dedup.sorted.bam 2018_028_S1_R1_001.dedup.sorted.bam 2018_022_S4_R1_001.dedup.sorted.bam 2018_027_S9_R1_001.dedup.sorted.bam 2018_021_S3_R1_001.dedup.sorted.bam 2018_024_S6_R1_001.dedup.sorted.bam 2018_020_S2_R1_001.dedup.sorted.bam 2018_023_S5_R1_001.dedup.sorted.bam 2018_019_S1_R1_001.dedup.sorted.bam 2018_026_S8_R1_001.dedup.sorted.bam 2018_025_S7_R1_001.dedup.sorted.bam \
      --plotFile fingerprints.pdf \
      --outRawCounts fingerprint.txt \
      --extendReads 100 \
      --skipZeros \
      --ignoreDuplicates \
      --numberOfSamples 50000 \
      --binSize 500 \
      --plotFileFormat pdf \
      --plotTitle "Fingerprints"
  
  for bamfile in 2018_051_S5_R1_001.dedup.sorted.bam 2018_054_S7_R1_001.dedup.sorted.bam 2018_046_S1_R1_001.dedup.sorted.bam 2018_037_S1_R1_001.dedup.sorted.bam 2018_042_S9_R1_001.dedup.sorted.bam 2018_043_S6_R1_001.dedup.sorted.bam 2018_040_S4_R1_001.dedup.sorted.bam 2018_038_S2_R1_001.dedup.sorted.bam 2018_053_S8_R1_001.dedup.sorted.bam 2018_048_S9_R1_001.dedup.sorted.bam 2018_044_S7_R1_001.dedup.sorted.bam 2018_039_S3_R1_001.dedup.sorted.bam 2018_052_S6_R1_001.dedup.sorted.bam 2018_045_S8_R1_001.dedup.sorted.bam 2018_050_S4_R1_001.dedup.sorted.bam 2018_041_S5_R1_001.dedup.sorted.bam 2018_047_S2_R1_001.dedup.sorted.bam 2018_049_S3_R1_001.dedup.sorted.bam 2018_030_S3_R1_001.dedup.sorted.bam 2018_034_S7_R1_001.dedup.sorted.bam 2018_033_S6_R1_001.dedup.sorted.bam 2018_031_S4_R1_001.dedup.sorted.bam 2018_035_S8_R1_001.dedup.sorted.bam 2018_029_S2_R1_001.dedup.sorted.bam 2018_036_S9_R1_001.dedup.sorted.bam 2018_032_S5_R1_001.dedup.sorted.bam 2018_028_S1_R1_001.dedup.sorted.bam 2018_022_S4_R1_001.dedup.sorted.bam 2018_027_S9_R1_001.dedup.sorted.bam 2018_021_S3_R1_001.dedup.sorted.bam 2018_024_S6_R1_001.dedup.sorted.bam 2018_020_S2_R1_001.dedup.sorted.bam 2018_023_S5_R1_001.dedup.sorted.bam 2018_019_S1_R1_001.dedup.sorted.bam 2018_026_S8_R1_001.dedup.sorted.bam 2018_025_S7_R1_001.dedup.sorted.bam
  do
      bamCoverage \
        -b $bamfile \
        --extendReads 100 \
        --normalizeUsing RPKM \
        -o ${bamfile}.bw
  done
  
  multiBamSummary \
      bins \
      --binSize 10000 \
      --bamfiles 2018_051_S5_R1_001.dedup.sorted.bam 2018_054_S7_R1_001.dedup.sorted.bam 2018_046_S1_R1_001.dedup.sorted.bam 2018_037_S1_R1_001.dedup.sorted.bam 2018_042_S9_R1_001.dedup.sorted.bam 2018_043_S6_R1_001.dedup.sorted.bam 2018_040_S4_R1_001.dedup.sorted.bam 2018_038_S2_R1_001.dedup.sorted.bam 2018_053_S8_R1_001.dedup.sorted.bam 2018_048_S9_R1_001.dedup.sorted.bam 2018_044_S7_R1_001.dedup.sorted.bam 2018_039_S3_R1_001.dedup.sorted.bam 2018_052_S6_R1_001.dedup.sorted.bam 2018_045_S8_R1_001.dedup.sorted.bam 2018_050_S4_R1_001.dedup.sorted.bam 2018_041_S5_R1_001.dedup.sorted.bam 2018_047_S2_R1_001.dedup.sorted.bam 2018_049_S3_R1_001.dedup.sorted.bam 2018_030_S3_R1_001.dedup.sorted.bam 2018_034_S7_R1_001.dedup.sorted.bam 2018_033_S6_R1_001.dedup.sorted.bam 2018_031_S4_R1_001.dedup.sorted.bam 2018_035_S8_R1_001.dedup.sorted.bam 2018_029_S2_R1_001.dedup.sorted.bam 2018_036_S9_R1_001.dedup.sorted.bam 2018_032_S5_R1_001.dedup.sorted.bam 2018_028_S1_R1_001.dedup.sorted.bam 2018_022_S4_R1_001.dedup.sorted.bam 2018_027_S9_R1_001.dedup.sorted.bam 2018_021_S3_R1_001.dedup.sorted.bam 2018_024_S6_R1_001.dedup.sorted.bam 2018_020_S2_R1_001.dedup.sorted.bam 2018_023_S5_R1_001.dedup.sorted.bam 2018_019_S1_R1_001.dedup.sorted.bam 2018_026_S8_R1_001.dedup.sorted.bam 2018_025_S7_R1_001.dedup.sorted.bam \
      -out multiBamSummary.npz \
      --extendReads 100 \
      --ignoreDuplicates \
      --centerReads
  
  plotCorrelation \
      -in multiBamSummary.npz \
      -o scatterplot_PearsonCorr_multiBamSummary.png \
      --outFileCorMatrix scatterplot_PearsonCorr_multiBamSummary.txt \
      --corMethod pearson \
      --skipZeros \
      --removeOutliers \
      --plotTitle "Pearson Correlation of Read Counts" \
      --whatToPlot scatterplot
  
  plotCorrelation \
      -in multiBamSummary.npz \
      -o heatmap_SpearmanCorr_multiBamSummary.png \
      --outFileCorMatrix heatmap_SpearmanCorr_multiBamSummary.txt \
      --corMethod spearman \
      --skipZeros \
      --plotTitle "Spearman Correlation of Read Counts" \
      --whatToPlot heatmap \
      --colorMap RdYlBu \
      --plotNumbers
  
  plotPCA \
      -in multiBamSummary.npz \
      -o pcaplot_multiBamSummary.png \
      --plotTitle "Principal Component Analysis Plot" \
      --outFileNameData pcaplot_multiBamSummary.txt
Command exit status:
  1
Command output:
  (empty)
Command error:
  Traceback (most recent call last):
    File "/usr/local/apps/deeptools/3.1.3/bin/plotFingerprint", line 12, in <module>
      main(args)
    File "/usr/local/Anaconda/envs_app/deeptools/3.1.3/lib/python3.6/site-packages/deeptools/plotFingerprint.py", line 421, in main
      plt.plot(x, count, label=args.labels[i], linestyle=pyplot_line_styles[j])
  IndexError: list index out of range

markdown_to_html.r: markdown built with a different version, using local lib?

I have an error trying to run the chipseq pipeline via Singularity. It complains that the markdown library has been built using a different version of R (which happens to be the same as what is in my local user library). It then tries to install it into my local library, but (thankfully) make is not on the path.

'nfcore-chipseq-1.0.0.img'

ERROR ~ Error executing process > 'output_documentation'

Caused by:
  Process `output_documentation` terminated with an error exit status (1)

Command executed:

  markdown_to_html.r output.md results_description.html

Command exit status:
  1

Command output:
  (empty)

Command error:
  Loading required package: markdown
  Failed with error:  ‘package ‘markdown’ was installed by an R version with different internals; it needs to be reinstalled for use with this R version’
  In addition: Warning message:
  package ‘markdown’ was built under R version 3.5.3
  Installing package into #<local path>#   R/x86_64-pc-linux-gnu-library/3.5’
  (as ‘lib’ is unspecified)
  trying URL 'http://cloud.r-project.org/src/contrib/markdown_1.0.tar.gz'
  Content type 'application/x-gzip' length 80843 bytes (78 KB)
  ==================================================
  downloaded 78 KB

  * installing *source* package ‘markdown’ ...
  ** package ‘markdown’ successfully unpacked and MD5 sums checked
  ** libs
  sh: 1: make: not found
  ERROR: compilation failed for package ‘markdown’
  * removing #<local path>#   R/x86_64-pc-linux-gnu-library/3.5/markdown’
  * restoring previous ‘#<local path>#   R/x86_64-pc-linux-gnu-library/3.5/markdown’

  The downloaded source packages are in
        ‘/tmp/311578/RtmptEzm3X/downloaded_packages’
  Error: package ‘markdown’ was installed by an R version with different internals; it needs to be reinstalled for use with this R version
  In addition: Warning messages:
  1: In install.packages("markdown", dependencies = TRUE, repos = "http://cloud.r-project.org/") :
    installation of package ‘markdown’ had non-zero exit status
  2: package ‘markdown’ was built under R version 3.5.3
  Execution halted
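
A hedged guess from the log: R inside the container is finding a stale markdown build in the host's per-user library (the "Installing package into ... R/x86_64-pc-linux-gnu-library/3.5" lines), which is visible because Singularity bind-mounts $HOME by default. One sketch of a workaround, using real Nextflow/Singularity options but untested here, is to stop mounting the home directory:

// custom.config, passed with -c
singularity {
    enabled    = true
    runOptions = '--no-home'
}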

Filter out MACS results with zero peaks

I'm getting this error when running ChIP-seq Best Practice v1.0 with Nextflow v0.31.1 on Uppmax/bianca.

ERROR ~ Error executing process > 'chippeakanno (ALBB13_137_H3K27me3_peaks)'

Caused by:
  Process `chippeakanno (ALBB13_137_H3K27me3_peaks)` terminated with an error exit status (1)

Command executed:

  post_peak_calling_processing.r /usr/local/lib/R/library /home/jacke/bin/chipseq/blacklists/hg19-blacklist.bed genes.gtf <a lot of data sets here..>

Command exit status:
  1

Command output:
  (empty)

Command error:
      parLapplyLB, parRapply, parSapply, parSapplyLB
  
  The following objects are masked from ‘package:stats’:
  
      IQR, mad, sd, var, xtabs
  
  The following objects are masked from ‘package:base’:
  
      anyDuplicated, append, as.data.frame, cbind, colMeans, colnames,
      colSums, do.call, duplicated, eval, evalq, Filter, Find, get, grep,
      grepl, intersect, is.unsorted, lapply, lengths, Map, mapply, match,
      mget, order, paste, pmax, pmax.int, pmin, pmin.int, Position, rank,
      rbind, Reduce, rowMeans, rownames, rowSums, sapply, setdiff, sort,
      table, tapply, union, unique, unsplit, which, which.max, which.min
  
  Loading required package: S4Vectors
  
  Attaching package: ‘S4Vectors’
  
  The following object is masked from ‘package:base’:
  
      expand.grid
  
  Loading required package: IRanges
  Loading required package: GenomeInfoDb
  Loading required package: ChIPpeakAnno
  Loading required package: grid
  Loading required package: Biostrings
  Loading required package: XVector
  
  Attaching package: ‘Biostrings’
  
  The following object is masked from ‘package:base’:
  
      strsplit
  
  Loading required package: VennDiagram
  Loading required package: futile.logger
  
  Warning message:
  In read.dcf(con) :
    URL 'http://bioconductor.org/BiocInstaller.dcf': status was 'Couldn't resolve host name'
  Loading required package: rtracklayer
  Loading required package: doParallel
  Loading required package: foreach
  Loading required package: iterators
  Error in S4Vectors:::normalize_names_replacement_value(value, x) : 
    attempt to set too many names (2) on IRanges object of length 0
  Calls: annotatePeakInBatch ... names<- -> names<- -> names<- -> names<- -> <Anonymous>
  Execution halted

I used the following command to run:

nohup nextflow run /home/jacke/bin/chipseq \
    --singleEnd \
     --reads '*.fastq.gz' \
     --genome GRCh37 \
     --project sens2017533 \
     --macsconfig 'macssetup.config' \
     -resume \
     -with-trace \
     -with-dag flowchart.pdf \
      --broad \
     --blacklist_filtering \
     -profile uppmax \
     -with-singularity /home/jacke/bin/singularity_images/nfcore-chipseq.img &

And most, but not all, MACS 2 jobs have finished ok.

EDIT: Two of the peak files from MACS contain no peaks. That is probably what causes the error.

Validate macsconfig inputs before starting

Moved from SciLifeLab#65:

Currently, we check that all FastQ input files exist (--reads) and that the macsconfig file exists (--macsconfig). However, we don't check that the sample names defined in the macsconfig file correspond to the FastQ files submitted. If wrong, this results in the pipeline running nearly to completion and then failing with a slightly obscure error message in the MACS peak calling step.

Instead, it would be good to validate that these two correspond with one another before any pipeline tasks are started, exiting with an error message immediately if something is wrong.
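
A minimal sketch of such an up-front check at the top of main.nf (the chip_sample_id,ctrl_sample_id,analysis_id line format matches the existing macs_para channel; the prefix-matching rule is an assumption):

// Fail fast if a sample named in the MACS config has no matching FastQ file.
def read_names = file(params.reads).collect { it.name }
file(params.macsconfig).eachLine { line ->
    if( !line.trim() || line.startsWith('#') ) return
    def (chip_id, ctrl_id, analysis_id) = line.split(',')
    [chip_id, ctrl_id].findAll { it }.each { id ->
        if( !read_names.any { it.startsWith(id) } ) {
            exit 1, "Sample '${id}' in ${params.macsconfig} has no matching FastQ file"
        }
    }
}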

Add IDR analysis

Would be nice to add the IDR analysis that is currently carried out by the ENCODE pipelines:
https://github.com/kundajelab/atac-seq-pipeline
https://docs.google.com/document/d/1f0Cm4vRyDQDu0bMehHD7P7KOMxTOP-HiNoIvL1VcBt8/edit

Another implementation in NF:
https://github.com/DoaneAS/atacflow

ENCODE ATAC-seq Guidelines:
https://www.encodeproject.org/atac-seq/

Amongst other things, this will probably also involve creating pseudo-replicates by merging and equally sampling alignments across replicates.

See: nf-core/atacseq#36

Catch MACS error

Sometimes MACS fails because the data are so bad that it can't build a model of the peaks. The error message "Too few paired peaks (0) so I can not build the model!" is given. This causes the entire pipeline to crash, even if there is only one bad data set.

A better solution would perhaps be to report this error, but then continue?
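
One hedged option is Nextflow's per-process errorStrategy, so a single bad dataset is reported but doesn't abort the run. A toy DSL1 sketch of the behaviour (not the pipeline's actual macs process):

process may_fail {
    errorStrategy 'ignore'

    input:
    val x from Channel.from(1, 2, 3)

    script:
    """
    if [ ${x} -eq 2 ]; then echo 'Too few paired peaks (0)' >&2; exit 1; fi
    echo 'peaks called for sample ${x}'
    """
}

The failed task is logged and downstream steps simply receive one fewer result; whether silently dropping a sample is acceptable is the real design question.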

Issue with bwa index building

The default "-a is" option only works with databases no larger than 2 GB. For the whole human genome, "-a bwtsw" is needed. We can add a clarification in the README.
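
A hedged sketch of how the index-building process could pick the algorithm automatically (process and channel names are illustrative, not the pipeline's code):

process makeBWAindex {
    input:
    file fasta from ch_fasta

    output:
    file "${fasta}.*" into ch_bwa_index

    script:
    // bwtsw handles genomes larger than ~2 GB; 'is' is fine below that
    def algo = fasta.size() > 2_000_000_000 ? 'bwtsw' : 'is'
    """
    bwa index -a ${algo} ${fasta}
    """
}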

Problems building singularity image: Authentication error.

I'm having problems building the singularity image for the ChIP-seq pipeline. I get the same error both on my macbook and running on Uppmax/Rackham.

$ singularity pull --name nfcore-chipseq.img docker://nf-core/ChIPseq 
WARNING: pull for Docker Hub is not guaranteed to produce the
WARNING: same image on repeated pull. Use Singularity Registry
WARNING: (shub://) to pull exactly equivalent images.
ERROR Authentication error, exiting.
Cleaning up...
ERROR: pulling container failed!

$ singularity pull --name nfcore-chipseq.img shub://nf-core/ChIPseq 
ERROR Cannot find image. Is your capitalization correct?

Looking at this page, perhaps the name should be docker://nfcore/ChIPseq (without the "-"): https://hub.docker.com/r/nfcore/chipseq/. But this gives the same results as above. (Docker repository names are all lowercase, so the exact form from that URL would be docker://nfcore/chipseq.)

Building the old singularity image docker://scilifelab/ngi-chipseq still works ok.

Remove uppmax modules profile

We should not encourage users to use the uppmax modules configuration. Once we have the singularity installation errors fixed and the singularity profile works, remove the uppmax-modules profile.

Make conda environment.yml

It would be great if we could manage the pipeline software requirements with a single conda environment.yml file. Then we can build the Docker and Singularity containers from this.

I have started a branch to work on this together: https://github.com/nf-core/ChIPseq/tree/bioconda

Note that not all packages are available on conda yet, so we may need to package some ourselves.

  • Find all packages in conda
  • Complete environment.yml file and test it installs
  • Check whether the Docker image builds properly
  • Check whether the pipeline still works (almost certainly not)
    • Adjust pipeline commands / scripts as necessary
  • Strip out all previous custom R installation stuff / environment module stuff
  • Think more about whether we want to be installing reference genomes into the containers
    • Hint: I would prefer not to. So we need to look into how we can use regular references (eg. iGenomes) and adjust scripts accordingly.
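
As a concrete starting point, a sketch of what the file might contain, listing only tools the pipeline already calls (versions unpinned here on purpose; package names are the usual bioconda ones, and phantompeakqualtools plus the ngsplot databases may need extra work):

name: nfcore-chipseq
channels:
  - bioconda
  - conda-forge
  - defaults
dependencies:
  - fastqc
  - trim-galore
  - bwa
  - samtools
  - picard
  - phantompeakqualtools
  - deeptools
  - macs2
  - multiqc
  - r-ngsplot
  - r-ngsplotdb-hg19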

Refactor MACS channel handling

At the moment, the macs process links in all BAM files and then runs on just a subset:

chipseq/main.nf

Lines 731 to 738 in 9ecdcf4

process macs {
    tag "${bam_for_macs[0].baseName}"
    publishDir "${params.outdir}/macs", mode: 'copy'

    input:
    file bam_for_macs from bam_dedup_macs.collect()
    file bai_for_macs from bai_dedup_macs.collect()
    set chip_sample_id, ctrl_sample_id, analysis_id from macs_para

This is not very good practice. It would be better to filter the bam channel according to the macs config and then run each MACS task with just the BAM files that will be analysed.
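
A hedged sketch of the direction this could take (DSL1 operators; the sample-ID derivation is guessed from the BAM names, and runs without a control would need an extra branch not shown):

// Key every BAM by its sample ID, once for ChIP and once for control lookups.
bam_dedup_macs
    .map { bam -> tuple(bam.baseName.replaceAll(/\.dedup\.sorted$/, ''), bam) }
    .into { chip_bams; ctrl_bams }

// macs_para emits: chip_sample_id, ctrl_sample_id, analysis_id
macs_para
    .combine(chip_bams, by: 0)                 // attach the ChIP BAM
    .map { chip, ctrl, analysis, chip_bam ->
        tuple(ctrl, chip, analysis, chip_bam)
    }
    .combine(ctrl_bams, by: 0)                 // attach the control BAM
    .set { macs_input }

Each macs task would then declare just its own two BAM files in the input block instead of collecting everything.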

BWA can't find bwa index or fasta file

Hi,

I'm trying to run the chipseq pipeline on a personal computer. The pipeline runs ok until it reaches the BWA alignment step. I get this error:

Caused by:
  Process `bwa (H3K27ac_DMSO)` terminated with an error exit status (1)
Command executed:
  bwa mem -M hg19/genome.fa H3K27ac_DMSO_trimmed.fq.gz | samtools view -bT hg19 - | samtools view -b -q 1 -F 4 -F 256 > H3K27ac_DMSO.bam
Command exit status:
  1
Command output:
  (empty)
Command error:
  WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
  [E::bwa_idx_load_from_disk] fail to locate the index files
  [samfaipath] build FASTA index...
  [E::fai_build3] Failed to open the FASTA file hg19
  [samfaipath] fail to build FASTA index.

This is my code:

nextflow run nf-core/chipseq \
    -profile standard,docker \
    --max_memory '30.GB' \
    --singleEnd \
    --bwa_index '/media/gema/1C2AD20F2AD1E5B4/NGS_analysis/hg19/' \
    --blacklist_filtering \
    --blacklist '/home/gema/.nextflow/assets/nf-core/chipseq/blacklists' \
    --genome GRCh37 \
    --reads '/media/gema/1C2AD20F2AD1E5B4/NGS_analysis/GSE58740/FASTQ_files/chip-seq/*.fastq' \
    --macsconfig '/media/gema/1C2AD20F2AD1E5B4/NGS_analysis/GSE58740/FASTQ_files/chip-seq/macssetup.config'

I also added this to the nextflow.config file:

genomes {
    'GRCh37' {
        bwa   = '/media/gema/1C2AD20F2AD1E5B4/NGS_analysis/hg19'
        fasta = '/media/gema/1C2AD20F2AD1E5B4/NGS_analysis/hg19/GRCh37.fa' // used if bwa index not given
    }
    // Any number of additional genomes, key is used with --genome
}

Why is BWA running "bwa mem -M hg19/genome.fa" if I indicated the path to the file everywhere?

Thank you very much in advance

Best
Gema

Update pipeline versions

There are multiple versions used for this pipeline, plus the old changelog. I suggest that we draw a line under the previous history, rename everything to version 1.0 and clear the old changelog. We can point to the previous repository before that.

Issue with --notrim and permissions violation.

Error 1 - unsolved
When setting --notrim, I get the error

ERROR ~ No such variable: read_files_trimming

 -- Check script 'main.nf' at line: 342 or see '.nextflow.log' file for more details

Error 2 - solved
Reporting this error as it may help others when getting started with this. Not sure where to add it in the docs, so I'll leave it here. I had an error when running nextflow run nf-core/chipseq or nextflow pull nfcore/chipseq on Uppmax (module load bioinfo-tools Nextflow):
nextflow pull nf-core/chipseq throws the same error: ERROR ~ Creating directories for /sw/apps/bioinfo/Nextflow/18.10.1/rackham/nxf_home/assets/nf-core/chipseq/.git failed

This was solved, with help from @alneberg and @maxulysse, by setting

export NXF_LAUNCHER=$SNIC_TMP
export NXF_TEMP=$SNIC_TMP
export NXF_HOME=/my/own/path/in/rackham/ 

ERROR ~ No such variable: GRCh37

I'm getting this problem with the latest versions of both the workflow and Nextflow:
ERROR ~ No such variable: GRCh37

N E X T F L O W  ~  version 0.30.2
Launching `./chipseq/main.nf` [kickass_ptolemy] - revision: b11db350eb
ERROR ~ No such variable: GRCh37
nextflow run  ./chipseq/ \
     -profile docker \
     --max_memory '31.GB' \
     --macsconfig 'macssetup.config' \
     --genome GRCh37 \
     --reads 'data/*{1,2}.fastq.gz' \
     -resume \
     --email  '[email protected]' \
     --outdir 'results_hg19/'

Incorrect default blacklist files

Blacklist files are provided by ENCODE for a selection of organisms:
https://sites.google.com/site/anshulkundaje/projects/blacklists/

These rely on the fact that the assembly for the blacklist matches the one you are using with the --genome parameter. This is not the case for:

'WBcel235' {
    bwa       = "${params.igenomes_base}/Caenorhabditis_elegans/Ensembl/WBcel235/Sequence/BWAIndex/"
    blacklist = "${baseDir}/blacklists/ce10-blacklist.bed"
    gtf       = "${params.igenomes_base}/Caenorhabditis_elegans/Ensembl/WBcel235/Annotation/Genes/genes.gtf"
    bed       = "${params.igenomes_base}/Caenorhabditis_elegans/Ensembl/WBcel235/Annotation/Genes/genes.bed"
    macsgsize = "9e7"
}
'BDGP6' {
    bwa       = "${params.igenomes_base}/Drosophila_melanogaster/Ensembl/BDGP6/Sequence/BWAIndex/"
    blacklist = "${baseDir}/blacklists/dm3-blacklist.bed"
    gtf       = "${params.igenomes_base}/Drosophila_melanogaster/Ensembl/BDGP6/Annotation/Genes/genes.gtf"
    bed       = "${params.igenomes_base}/Drosophila_melanogaster/Ensembl/BDGP6/Annotation/Genes/genes.bed"
    macsgsize = "1.2e8"
}

i.e. WBcel235 != ce10 and BDGP6 != dm3.

These will either have to be removed, or maybe they can be used in conjunction with liftOver to generate the appropriate coordinates.

MACS effective genome size

For the moment, the workflow sets the -g parameter for macs2 only for GRCh37 and GRCm38:
https://github.com/nf-core/chipseq/blob/master/main.nf#L197-L198.

If you read the MACS manual (https://github.com/taoliu/MACS), this parameter has some precompiled values. But there are also ways to calculate this genome size, as described here: https://deeptools.readthedocs.io/en/develop/content/feature/effectiveGenomeSize.html.

Since this -g is a number that could either be given by the user or somehow calculated, the pipeline should give more freedom, so users can provide their own values, or even add more options to the workflow.

I understand that ngsplot depends on an external database (hg19, hg38 and mm10 were included in the bioconda recipe), and its use would be dependent on that. But I don't see any reason not to allow users to use macs2 if they give their own effective genome size.
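
A minimal sketch of that freedom (the --macs_gsize parameter name is an assumption; the macsgsize key matches the existing igenomes config entries):

params.macs_gsize = false

def macs_gsize = params.macs_gsize ?: params.genomes[ params.genome ]?.macsgsize
if( !macs_gsize ) {
    exit 1, "No effective genome size defined for '${params.genome}'; supply one with --macs_gsize"
}

macs2 would then always take its -g value from macs_gsize, whatever the genome.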

Update config file parsing

The pipeline needs updating to use the new nf-core/configs repository and handling of default / igenomes configs.

Currently igenomes can't be used without using one of the predefined profiles, which is silly.

Update core pipeline code

The pipeline has fallen behind some of the more recent updates to the nf-core template. Apply updates until the linting tests are passing again.
