hic-bench's People

Contributors

gongyixiao, hc27oclock, igordot, javrodriguez, sofnom, stevekm, teosakel, tsirigos

hic-bench's Issues

how to set the params.tcsh file

Dear professor,
When I set up my params.tcsh file, the error below occurred. I do not know how to write the module load lines.
########################################
$cat inputs/params/params.tcsh
#!/bin/tcsh

#unload
module unload samtools
module unload java
module unload gcc
module unload python
module unload r

#load basic tools
#module load python/2.7.3
module load /share/nas2/genome/biosoft/samtools/1.3.1/
module load /share/nas2/genome/biosoft/bedtools/2.17.0/
module load /share/nas2/genome/biosoft/java/jdk/1.8.0_31/
module load /share/nas2/genome/biosoft/R/3.2.2/

######################################
Note: on my machine the software is installed in /share/nas2/genome/biosoft/samtools/1.3.1/,
/share/nas2/genome/biosoft/bedtools/2.17.0/, /share/nas2/genome/biosoft/java/jdk/1.8.0_31/, and /share/nas2/genome/biosoft/R/3.2.2/.

Error message:
ModuleCmd_Load.c(208):ERROR:105: Unable to locate a modulefile for '/share/nas2/genome/biosoft/samtools/1.3.1'
ModuleCmd_Load.c(208):ERROR:105: Unable to locate a modulefile for '/share/nas2/genome/biosoft/bedtools/2.17.0'
ModuleCmd_Load.c(208):ERROR:105: Unable to locate a modulefile for '/share/nas2/genome/biosoft/java/jdk/1.8.0_31'
ModuleCmd_Load.c(208):ERROR:105: Unable to locate a modulefile for '/share/nas2/genome/biosoft/R/3.2.2'
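
From the errors it seems that module load expects a module name registered under MODULEPATH, not an installation directory. Is the right fix simply to skip the module calls and prepend each install's bin directory to PATH, something like the sketch below? (The bin/ sub-directories are my assumption about how these packages are laid out.)

#!/bin/tcsh
# Alternative params.tcsh sketch: no environment modules, just prepend each
# local install to PATH. The bin/ sub-directories are an assumption about the
# install layout; adjust them to the real directory tree.
setenv PATH /share/nas2/genome/biosoft/samtools/1.3.1/bin:${PATH}
setenv PATH /share/nas2/genome/biosoft/bedtools/2.17.0/bin:${PATH}
setenv PATH /share/nas2/genome/biosoft/java/jdk/1.8.0_31/bin:${PATH}
setenv PATH /share/nas2/genome/biosoft/R/3.2.2/bin:${PATH}
# Or, if real modulefiles do exist somewhere, register their directory first:
#   module use /path/to/modulefiles
#   module load samtools/1.3.1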

Dependencies: matlab and ghmm

When installing HiC-bench, I could not obtain MATLAB R2013a. If I do not install it, which sections of the pipeline will be broken?
Also, could you give me the download site for ghmm 0.9?

count_peaks bug

There is a problem with the count_peaks step that you added. At the beginning, you are reading the input variables:
source ./code/code.main/scripts-read-job-vars $branch "$objects" "genome genome_dir"

But then you are just scanning the directory manually, so you are ignoring them.

Simple example where this makes a difference:
All the peak counts are currently reported, including inputs, which doesn’t really make sense. Inputs should be ignored. It looks like you used the chipseq-diffbind step as an example (there is still chipseq-diffbind code in there). If you check the chipseq-diffbind step, it ignores the inputs.

I think the relevant part is the foreach obj ($objects) loop.
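
Something along these lines would skip them (hypothetical sketch only; detecting inputs by a leading "input" in the object name is just for illustration, the real check should mirror whatever chipseq-diffbind does):

# Hypothetical sketch: skip objects that are inputs before counting peaks.
# The naming test below is an assumption for illustration, not the actual
# hic-bench/chipseq convention.
foreach obj ($objects)
  if ("$obj" =~ input*) then
    echo "skipping input object $obj"
    continue
  endif
  # ... existing peak-counting code for $obj ...
end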

./results/.db/run: No such file or directory

Hi,

I set up my pipeline and installed all the dependencies correctly, but an error popped up when I ran ./run in the __01a-align step:
[ydw529@qnode4144 _01a-align]$ ./run
=== Operation = align =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
cat: ./results/.db/run: No such file or directory
cat: ./results/.db/run: No such file or directory

I then tried ./run.dry, and it shows the following message:

[ydw529@qnode4144 project_test]$ ./run.dry
Validating sample sheet...
PIPELINE STARTING: Tue May 24 13:50:33 CDT 2016
=== Operation = align =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = filter =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = filter-stats =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = tracks =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = matrix-filtered =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = matrix-prep =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = matrix-ic =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = matrix-hicnorm =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = matrix-stats =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = compare-matrices =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = compare-matrices-stats =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = boundary-scores =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = boundary-scores-pca =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = domains =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = domains-stats =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = compare-boundaries =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = compare-boundaries-stats =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = hicplotter =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = interactions =============
USAGE: pipeline-master-explorer.r [OPTIONS] SCRIPT OUTDIR-PREFIX PARAM-SCRIPTS INPUT-BRANCHES SPLIT-VARIABLE OUTPUT-OBJECT-VARIABLE TUPLES
=== Operation = annotations =============
Warning: .db directory exists, removing!
Error: parameter file list is empty!

What do you think could be the problem here?

TAD activity

Hi, I have some questions:

  1. I cannot find how you define TAD activity or how you calculate TAD activity expression. There is no RNA-seq expression data.
  2. find-consistent-domains.pl can generate sample1-specific, sample2-specific, and common TAD files. Do you then use these different TAD sets to calculate TAD activity and TAD activity expression?

Thank you.

qsub doesn't work

Hi

When we ran the pipeline-execute command, it produced the following error. Our cluster does not use OGS/Grid Engine 2011.11 as its scheduler; we use PBS to schedule jobs and msub to submit them. Would this be the reason? If so, how would you recommend dealing with this situation?

Thanks!

[ydw529@quser12 project2]$ ./code.main/pipeline-execute project [email protected]
[qsub: illegal -c value]

usage: qsub [-a date_time] [-A account_string] [-b secs]
    [-c [ none | { enabled | periodic | shutdown |
    depth=<int> | dir=<path> | interval=<minutes>}... ]
    [-C directive_prefix] [-d path] [-D path]
    [-e path] [-h] [-I] [-j oe|eo|n] [-k {oe}] [-l resource_list] [-m n|{abe}]
    [-M user_list] [-N jobname] [-o path] [-p priority] [-P proxy_user [-J <jobid]]
    [-q queue] [-r y|n] [-S path] [-t number_to_submit] [-T type]  [-u user_list]
    [-w] path
      [-W additional_attributes] [-v variable_list] [-V ] [-x] [-X] [-z] [script]

[ydw529@quser12 project2]$ pwd
/home/ydw529/panos/HiC/hic-bench/project2
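
One workaround we are considering is a small translating shim placed ahead of the real submission command in PATH. The option mapping below is a rough guess at common SGE flags, not taken from the hic-bench code, so it would need adjusting to whatever the qsub wrappers in code.main actually pass:

#!/bin/sh
# Hypothetical qsub shim: translate or drop SGE-style options before handing
# the job to Moab/PBS via msub. Deliberately simplistic: assumes well-formed
# arguments and option values without spaces.
args=""
while [ $# -gt 0 ]; do
  case "$1" in
    -c)   shift ;;                                   # SGE checkpoint flag: drop it and its value
    -cwd) args="$args -d $PWD" ;;                    # run in submission directory (check msub's -d on your site)
    -pe)  shift; shift
          args="$args -l nodes=1:ppn=$1" ;;          # rough translation of an SGE parallel environment
    *)    args="$args $1" ;;
  esac
  shift
done
exec msub $args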

compile error

Hi.

I got the following error when running make in code/src:
How do I resolve this?

/usr/lib/gcc/x86_64-linux-gnu/4.8/../../../../lib/libgsl.so: undefined reference to `cblas_dtrsv'
/usr/lib/gcc/x86_64-linux-gnu/4.8/../../../../lib/libgsl.so: undefined reference to `cblas_zher2'
/usr/lib/gcc/x86_64-linux-gnu/4.8/../../../../lib/libgsl.so: undefined reference to `cblas_sasum'
/usr/lib/gcc/x86_64-linux-gnu/4.8/../../../../lib/libgsl.so: undefined reference to `cblas_zaxpy'
collect2: error: ld returned 1 exit status
make: *** [gtools-hic] Error 1
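
In case it helps anyone else hitting this: these are unresolved CBLAS symbols, i.e. libgsl is being linked without any CBLAS implementation. GSL ships a reference one (libgslcblas), so adding -lgslcblas after -lgsl on the link line is the usual fix. Where exactly that goes in the code/src Makefile is an assumption on my part, not a verified patch:

# Illustrative link command; object names other than the gtools-hic target are
# placeholders. Only the library order matters: -lgsl before -lgslcblas, then -lm.
gcc -o gtools-hic gtools-hic.o -lgsl -lgslcblas -lm
# If the Makefile collects libraries in a variable (name assumed), the same idea is:
#   LDLIBS = -lgsl -lgslcblas -lm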

SLURM Support?

Hi,

This pipeline looks great for running Hi-C analyses. Just wondering: do you currently support, or do you expect to support in the future, other HPC cluster workload managers? My cluster uses SLURM and Scientific Linux, and I've been having a difficult time getting hic-bench to work there. Would I just (hopefully) need to modify all the qsub options and commands in the qsub files in /code.main to their SLURM equivalents? The other big difference I'd see is in the loading of modulefiles and what is available, but I think that's much easier to work around.
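
For concreteness, the kind of translation I have in mind looks roughly like this (the SGE options shown are just the usual ones; I have not checked which flags the code.main qsub files actually use):

# SGE-style submission (illustrative):
#   qsub -N myjob -cwd -o job.out -e job.err -pe threaded 8 -l mem_free=16G job.sh
# Approximate SLURM counterpart (assumed mapping; resource names vary by site):
sbatch --job-name=myjob \
       -D "$PWD" \
       --output=job.out --error=job.err \
       --cpus-per-task=8 \
       --mem=16G \
       job.sh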

Thanks,
Ittai

Compartment and subcompartment analysis

Dear professor,
Could you provide the compartment and subcompartment analysis tools?
Also, there are no loop analysis tools.
If the pipeline had both, it would be perfect.

'run' scripts do not handle 0 byte files properly

If a pipeline step tries to run and touches its output files, or otherwise creates a 0-byte output file before breaking, subsequent re-runs of the pipeline step will see this output file and consider it 'up-to-date'. It might be useful to have the pipeline-step output checker detect this and delete 0-byte files; see the sketch below.
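
A minimal sketch of such a cleanup, assuming outputs live under a results/ tree (and noting that some legitimately empty files, e.g. empty error logs, may need to be excluded):

# Sketch: remove zero-byte files left behind by a failed step so that the next
# run does not treat them as up-to-date outputs. The ./results path and the lack
# of any exclusion pattern are assumptions; some empty files may be legitimate.
find ./results -type f -size 0 -print -delete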

Genome location error in Loops

Hello,

I believe there's a bug in the loops step: I get the following error:
Error: Can't open genome file */__14a-loops/inputs/genomes/dm6/genome/bowtie2.index/genome.fa.fai Exiting...

The correct path is */__14a-loops/inputs/genomes/dm6/bowtie2.index/genome.fa.fai, so I'm assuming the code points to an incorrect genome file location. I tried to quickly find where, but could not.
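
In the meantime, a symlink workaround based on the two paths above seems to do the trick (run from inside the affected __14a-loops results branch):

# Make the path the code expects resolve to the real bowtie2 index directory.
mkdir -p inputs/genomes/dm6/genome
ln -s ../bowtie2.index inputs/genomes/dm6/genome/bowtie2.index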

Thanks

Docker Image?

Hello NYU-BFX team,

Do you have any plans to make a Docker image for hic-bench?
Thanks for taking the time to read this request!

Sincerely,
Jaclyn
