bio-ontology-research-group / phenomenet-vp
A phenotype-based tool for variant prioritization in WES and WGS data
Hi, I'm trying to use phenomenet-vp, however I'm running into an error.
I extracted the folder data-v2.1.tar.gz inside the folder phenomenet-vp-2.1/, but when I try running the command:
/<path>/phenomenet-vp-2.1/bin/phenomenet-vp -f /<path>/phenomenet-vp-2.1/data/Miller.vcf -o OMIM:263750 --human --digenic
I get this error:
INFO: Initializing the model
Apr 29, 2022 12:13:57 PM sa.edu.kaust.Main runTool
SEVERE: java.nio.file.NoSuchFileException: data/omim_mode.txt
I also tried moving the file omim_mode.txt to each of the following locations:
/<path>/phenomenet-vp-2.1/data/omim_mode.txt
/<path>/phenomenet-vp-2.1/data-v2.1/omim_mode.txt
/<path>/phenomenet-vp-2.1/data-v2.1/data/omim_mode.txt
/<path>/phenomenet-vp-2.1/omim_mode.txt
However, I get the same error each time.
What is the precise folder structure I should use?
Is there a way to specify the path to the data/ folder?
Thanks.
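The stack trace reports the relative path data/omim_mode.txt, which suggests the tool resolves data/ against the current working directory rather than against its install folder. A minimal sketch of that assumed behaviour, using a throwaway directory:

```shell
# Sketch (assumption): a relative path like data/omim_mode.txt is opened
# against the process's current working directory.
workdir=$(mktemp -d)
mkdir -p "$workdir/data"
touch "$workdir/data/omim_mode.txt"
cd "$workdir"
# A relative lookup of data/omim_mode.txt now succeeds:
test -f data/omim_mode.txt && echo "found"
```

If that assumption holds, the fix is not to move omim_mode.txt around but to `cd` into phenomenet-vp-2.1 (the directory containing the extracted data/ folder) before invoking bin/phenomenet-vp.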
When running DeepPVP, the program stalls at the annotation step. Any idea what may be causing this? It just keeps printing the message below (about 100 times) and runs until the walltime is exhausted. Thanks in advance!
INFO: Starting annotation
java.lang.NumberFormatException: For input string: "NA"
	at sun.misc.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:2043)
	at sun.misc.FloatingDecimal.parseDouble(FloatingDecimal.java:110)
	at java.lang.Double.parseDouble(Double.java:538)
	at sa.edu.kaust.Annotations$1.apply(Annotations.java:206)
	at sa.edu.kaust.Annotations$1.apply(Annotations.java:121)
	at java.util.Arrays.lambda$parallelSetAll$0(Arrays.java:4718)
	at java.util.stream.ForEachOps$ForEachOp$OfInt.accept(ForEachOps.java:205)
	at java.util.stream.Streams$RangeIntSpliterator.forEachRemaining(Streams.java:110)
	at java.util.Spliterator$OfInt.forEachRemaining(Spliterator.java:693)
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
	at java.util.stream.ForEachOps$ForEachTask.compute(ForEachOps.java:291)
	at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:731)
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
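The trace shows Double.parseDouble failing on the literal string "NA": some score the annotation step reads is "NA" rather than a number. The trace does not identify which file or field is at fault, but a tab-separated annotation source can be scanned for such values; a sketch with illustrative demo data and a hypothetical score column:

```shell
# Sketch: flag rows whose score column (column 3 here, purely
# illustrative) holds the literal string "NA", which a Java
# Double.parseDouble cannot handle.
printf '20\t1234\t0.52\n20\t5678\tNA\n' > /tmp/scores.tsv
awk -F'\t' '$3 == "NA" { print "NA score at " $1 ":" $2 }' /tmp/scores.tsv
# -> NA score at 20:5678
```

The same check, pointed at the real annotation files, would tell you whether non-numeric placeholders are present before the tool parses them.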
Hi,
When I run the tool with command
bin/phenomenet-vp -f data/Miller.vcf -o OMIM:263750
I get the following error:
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.regex.Pattern.compile(Pattern.java:1688)
at java.util.regex.Pattern.&lt;init&gt;(Pattern.java:1351)
at java.util.regex.Pattern.compile(Pattern.java:1028)
at java.lang.String.replaceAll(String.java:2223)
at sa.edu.kaust.Annotations.getAnnotations(Annotations.java:68)
at sa.edu.kaust.Main.runTool(Main.java:302)
at sa.edu.kaust.Main.run(Main.java:225)
at sa.edu.kaust.Main.main(Main.java:343)
My computer meets the hardware requirements, so I can't understand this error.
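"GC overhead limit exceeded" means the JVM ran out of heap during annotation, so one thing to try is raising the maximum heap. A sketch, assuming bin/phenomenet-vp is a standard Gradle-generated start script (such scripts read extra JVM options from the JAVA_OPTS environment variable); the 32g value is illustrative:

```shell
# Raise the maximum JVM heap before launching the tool (value illustrative).
export JAVA_OPTS="-Xmx32g"
echo "$JAVA_OPTS"
# then run: bin/phenomenet-vp -f data/Miller.vcf -o OMIM:263750
```

If the launcher is not a Gradle start script, the equivalent is whatever mechanism it provides for passing -Xmx to the JVM.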
Docker build fails
root@ar-builder:/home/debian/phenomenet-vp# docker build -t phenomenet-vp .
[+] Building 0.8s (8/11)
 => [internal] load build definition from Dockerfile                                  0.0s
 => => transferring dockerfile: 779B                                                  0.0s
 => [internal] load .dockerignore                                                     0.0s
 => => transferring context: 67B                                                      0.0s
 => [internal] load metadata for docker.io/omahoco1/alpine-java-python:latest         0.3s
 => [1/8] FROM docker.io/omahoco1/alpine-java-python@sha256:fc6c9b9f8742e5da5aa1206abbdd3825fbfb52473adfddfb985822e3d89ebb87  0.0s
 => [internal] load build context                                                     0.0s
 => => transferring context: 4.20kB                                                   0.0s
 => CACHED [2/8] RUN mkdir /tmp/phenomenet-vp                                         0.0s
 => CACHED [3/8] WORKDIR /tmp/phenomenet-vp                                           0.0s
 => ERROR [4/8] RUN curl -L https://downloads.gradle.org/distributions/gradle-4.10.2-bin.zip -o gradle.zip && mkdir /opt/gradle && unzip -d /opt/gradle gradle.zip && rm -rf *  0.5s
------
 > [4/8] RUN curl -L https://downloads.gradle.org/distributions/gradle-4.10.2-bin.zip -o gradle.zip && mkdir /opt/gradle && unzip -d /opt/gradle gradle.zip && rm -rf *:
#0 0.287 /bin/sh: curl: not found
------
Dockerfile:11
--------------------
  10 |     # Install gradle
  11 | >>> RUN curl -L https://downloads.gradle.org/distributions/gradle-4.10.2-bin.zip -o gradle.zip && \
  12 | >>>     mkdir /opt/gradle && \
  13 | >>>     unzip -d /opt/gradle gradle.zip && \
  14 | >>>     rm -rf *
  15 |
--------------------
ERROR: failed to solve: process "/bin/sh -c curl -L https://downloads.gradle.org/distributions/gradle-4.10.2-bin.zip -o gradle.zip && mkdir /opt/gradle && unzip -d /opt/gradle gradle.zip && rm -rf *" did not complete successfully: exit code: 127
root@ar-builder:/home/debian/phenomenet-vp#
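The failing step is `/bin/sh: curl: not found` (exit code 127): the Alpine-based base image ships without curl. A sketch of a fix, assuming apk works inside omahoco1/alpine-java-python (unzip is installed too, since the same RUN uses it):

```dockerfile
# Install the tools the download step needs; the base image lacks curl.
RUN apk add --no-cache curl unzip

# Install gradle (the original step, unchanged)
RUN curl -L https://downloads.gradle.org/distributions/gradle-4.10.2-bin.zip -o gradle.zip && \
    mkdir /opt/gradle && \
    unzip -d /opt/gradle gradle.zip && \
    rm -rf *
```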
I am getting an error using the NA12878 1000 Genomes VCF file. The tool works with the provided example file data/Miller.vcf but throws the error below otherwise. Any suggestions?
ftp://ftp.1000genomes.ebi.ac.uk/vol1/ftp/data_collections/1000_genomes_project/release/20190312_biallelic_SNV_and_INDEL/ALL.chr20.shapeit2_integrated_snvindels_v2a_27022019.GRCh38.phased.vcf.gz
---------------------------------------------------------------
Inferences
---------------------------------------------------------------
Inferring ancestors...
Checking Treatment coherency
Incoherencies : 0
Inferring descendants...
Checking Treatment coherency
Incoherencies : 0
Inferring Conceptual Leaves...
---------------------------------------------------------------
Engine initialized
================================================================
Feb 10, 2020 7:45:52 PM sa.edu.kaust.Main runTool
INFO: Computing similarities
---------------------------------------------------------------
computing IC ResnikIC
---------------------------------------------------------------
Class name slib.sml.sm.core.metrics.ic.annot.IC_annot_resnik_1995_Normalized
Checking null or infinite in the ICs computed
ic ResnikIC computed
---------------------------------------------------------------
Feb 10, 2020 7:46:17 PM sa.edu.kaust.Main runTool
INFO: Getting top level phenotypes
Feb 10, 2020 7:46:17 PM sa.edu.kaust.Main runTool
INFO: Starting annotation
java.lang.NumberFormatException: For input string: "NA"
at sun.misc.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:2043)
at sun.misc.FloatingDecimal.parseDouble(FloatingDecimal.java:110)
at java.lang.Double.parseDouble(Double.java:538)
at sa.edu.kaust.Annotations$1.apply(Annotations.java:206)
at sa.edu.kaust.Annotations$1.apply(Annotations.java:121)
at java.util.Arrays.lambda$parallelSetAll$0(Arrays.java:4718)
at java.util.stream.ForEachOps$ForEachOp$OfInt.accept(ForEachOps.java:204)
at java.util.stream.Streams$RangeIntSpliterator.forEachRemaining(Streams.java:110)
at java.util.Spliterator$OfInt.forEachRemaining(Spliterator.java:693)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
at java.util.stream.ForEachOps$ForEachTask.compute(ForEachOps.java:290)
at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:731)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
The website hosting the CADD database file is obsolete.
Could you please update it, or tell me which file should be downloaded from the official CADD website?
I installed the tool as described in the GitHub README, and running the samples gave me this error:
SEVERE: java.lang.ArrayIndexOutOfBoundsException: 9
at sa.edu.kaust.Classification.toCSV(Classification.java:577)
at sa.edu.kaust.Main.runTool(Main.java:227)
at sa.edu.kaust.Main.run(Main.java:152)
at sa.edu.kaust.Main.main(Main.java:258)
It was caused by a carriage return before the last column on some lines of the processed Pfeiffer.vcf.out file. I found that the root cause was an incorrectly escaped "\r" in Annotations.java.
When compiling, I also had problems with missing dependencies; I found versions that compiled and added them to a patch.
Also, the Gradle version number is inconsistent with the Git tag: v2.0 has v1.1 in build.gradle. I did not change it, and it is not in the attached patch file.
Best wishes,
Victor
Patch:
patch_final.txt
I also found other cases involving "\"; I am not sure whether they affect the results elsewhere.
Thank you.
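As a stopgap for the carriage-return problem described above, stray CR bytes can be stripped from the generated .out file before any downstream parsing. A sketch, with a demo file standing in for Pfeiffer.vcf.out:

```shell
# tr -d '\r' deletes every carriage-return byte from the stream.
printf 'col1\tcol2\r\tcol3\n' > /tmp/demo.out
tr -d '\r' < /tmp/demo.out
# -> col1	col2	col3
```

This only masks the symptom; the proper fix is the escaping correction in Annotations.java.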
Hi,
I was trying to run the tool with the command
/bin/phenomenet-vp -f Swena_das_hardfiltered.vcf -p HP:0000821,HP:0000851,HP:0002925,HP:0005990,HP:0011791 -h -of
but I get the following error:
Missing dependencies required to run the tool. Please refer to the README file for more information.
I tried installing the requirements with "pip install -r requirements.txt", but it did not help. I need guidance on this installation: is it a separate package, or are there specific requirements I am missing?
Hi there,
I am getting this message while running phenomenet-vp. I have all the files listed on the requirements page, but when I try to run the tool, I get the error below. The file it complains about is already there:
user@pc:/media/daruma/sea6/pheno$ /media/daruma/sea6/pheno/phenomenet-vp-2.1/bin/phenomenet-vp -f /media/daruma/sea6/pheno/TSVC_variants_IonXpress_003.vcf --outfile 26682.vcf
Aug 02, 2018 1:00:34 PM sa.edu.kaust.Main runTool
INFO: Initializing the model
/media/daruma/sea6/pheno/phenomenet-vp-2.1/
Aug 02, 2018 1:00:34 PM sa.edu.kaust.Main runTool
SEVERE: java.nio.file.NoSuchFileException: data/diseasephenotypes.txt
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
at java.nio.file.Files.newByteChannel(Files.java:361)
at java.nio.file.Files.newByteChannel(Files.java:407)
at java.nio.file.spi.FileSystemProvider.newInputStream(FileSystemProvider.java:384)
at java.nio.file.Files.newInputStream(Files.java:152)
at java.nio.file.Files.newBufferedReader(Files.java:2784)
at java.nio.file.Files.newBufferedReader(Files.java:2816)
at sa.edu.kaust.Main.loadDiseasePhenotypes(Main.java:100)
at sa.edu.kaust.Main.runTool(Main.java:213)
at sa.edu.kaust.Main.run(Main.java:160)
at sa.edu.kaust.Main.main(Main.java:275)
Hi,
I had some problems unzipping the file whole_genome_SNVs_inclAnno.tsv.gz for use in a later step of generate.sh. I suggest skipping the intermediate file by using a pipe.
generate.sh lines 1, 2:
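The actual generate.sh lines are not reproduced above; as a hedged sketch of the pipe idea (the downstream command is a stand-in, not the script's real one):

```shell
# Stream the decompressed annotation file straight into the next step
# instead of writing the huge intermediate TSV to disk, e.g.:
#   gzip -dc whole_genome_SNVs_inclAnno.tsv.gz | <downstream command>
# Demo of the pattern with a small gzipped file:
printf 'chrom\tpos\tscore\n' | gzip > /tmp/demo.tsv.gz
gzip -dc /tmp/demo.tsv.gz | head -n 1
# -> chrom	pos	score
```

This avoids needing free disk space for the fully decompressed TSV, which for whole-genome CADD annotations is very large.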
Building the Dockerfile fails due to missing dependencies: curl and gcc are absent from the base image.
Hello,
The tool performs pretty well (except for the "SEVERE: java.nio.file.NoSuchFileException: data/omim_mode.txt" error, which I solved by looking at the other issues), but once "Finished Annotation" appears, it gets stuck on the same sample for more than 12 hours without doing anything (a Java process holding 40 GB of RAM keeps running the whole time).
cmd line is:
for F in *.vcf ; do phenomenet-vp -f "$F" -p HP:"1",etc... -i recessive -d -og --outfile "$F".out ; done
System: Ubuntu 14, Java 8, Python 2.7, 48 GB RAM.
When I kill the process, I can see that it generated populated .digenic, .top, and .res files, and an empty .oligogenic file.
Is this normal?