bio-ontology-research-group / phenomenet-vp

PhenomeNet Variant Predictor (PVP) - User Guide

A phenotype-based tool to annotate and prioritize disease variants in WES and WGS data

This user guide has been tested on Ubuntu 16.04.

For details regarding model training and evaluation, please refer to the dev/ directory above.

Hardware requirements

  • At least 32 GB RAM.
  • At least 1 TB of free disk space to process and accommodate the necessary annotation databases.

Software requirements (for native installation)

  • Any Unix-based operating system
  • Java 8
  • Python 2.7 (as the system default version), with its dependencies installed via:
   pip install -r requirements.txt
  • Run the script test.py (available above) with Python 2 to verify that the Python dependencies are installed. If the script fails, try reinstalling the required dependencies (use "pip2" instead of "pip", check for permissions), or use the Docker image instead.
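
The interpreter check above can be sketched as a small shell snippet; this is a non-authoritative sketch that only inspects the version string before you run test.py:

```shell
# Check whether `python` on PATH is the expected 2.7; Python 2 prints
# its version to stderr, hence the 2>&1 redirection.
ver=$(python --version 2>&1 || true)
case "$ver" in
  "Python 2.7"*) echo "OK: $ver" ;;
  *)             echo "Not Python 2.7 (got: ${ver:-none}); try python2/pip2 or the Docker image" ;;
esac
```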

Native Installation

  1. Download the distribution file phenomenet-vp-2.1.zip
  2. Download the data files phenomenet-vp-2.1-data.zip
  3. Extract the distribution files phenomenet-vp-2.1.zip
  4. Extract the data files data.tar.gz inside the directory phenomenet-vp-2.1
  5. cd phenomenet-vp-2.1
  6. Run the command: bin/phenomenet-vp to display help and parameters.

Database requirements

  1. Download CADD database file.
  2. Download and run the script generate.sh (Requires TABIX).
  3. Copy the generated files cadd.txt.gz and cadd.txt.gz.tbi to directory phenomenet-vp-2.1/data/db.
  4. Download DANN database file and its indexed file to directory phenomenet-vp-2.1/data/db.
  5. Rename the DANN files as dann.txt.gz and dann.txt.gz.tbi respectively.
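
Steps 4-5 amount to placing the two DANN files under data/db and renaming them. A sketch follows; the touch lines create placeholder files standing in for the real downloads, whose names may differ:

```shell
# Sketch of DANN placement (steps 4-5); placeholder filenames only.
mkdir -p phenomenet-vp-2.1/data/db
cd phenomenet-vp-2.1/data/db
touch DANN_download.tsv.gz DANN_download.tsv.gz.tbi   # placeholders for the downloads
mv DANN_download.tsv.gz dann.txt.gz
mv DANN_download.tsv.gz.tbi dann.txt.gz.tbi
ls   # dann.txt.gz  dann.txt.gz.tbi
```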

Docker Container

  1. Install Docker
  2. Download the data files phenomenet-vp-2.1-data.zip and database requirements
  3. Build phenomenet-vp docker image:
   docker build -t phenomenet-vp .
  4. Run phenomenet-vp:
   docker run -v $(pwd)/data:/data phenomenet-vp -f data/Miller.vcf -o OMIM:263750 

Parameters

--file, -f
   Path to VCF file
--outfile, -of
   Path to results file
--inh, -i
   Mode of inheritance
   Default: unknown
--json, -j
   Path to PhenoTips JSON file containing phenotypes
--omim, -o
   OMIM ID
--phenotypes, -p
   List of phenotype ids separated by commas
--human, -h
   Propagate human disease phenotypes to genes only
   Default: false
--sp, -s
   Propagate mouse and fish disease phenotypes to genes only
   Default: false
--digenic, -d
   Rank digenic combinations
   Default: false
--trigenic, -t
   Rank trigenic combinations
   Default: false
--combination, -c
   Maximum number of variant combinations to prioritize (for digenic and
   trigenic cases only)
   Default: 1000
--ngenes, -n
   Number of genes in oligogenic combinations (more than three)
   Default: 4
--oligogenic, -og
   Rank oligogenic combinations
   Default: false
--python, -y
   Path to Python executable (ex. /usr/bin/python)
   Default: python
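
The --phenotypes flag expects a single comma-separated string. A small shell sketch (with hypothetical HPO IDs) for assembling it from a whitespace-separated list:

```shell
# Join space-separated phenotype IDs into the comma-separated form -p expects
phenos="HP:0000007 HP:0000028 HP:0000054"
parg=$(printf '%s' "$phenos" | tr ' ' ',')
echo "$parg"    # HP:0000007,HP:0000028,HP:0000054
# usage: bin/phenomenet-vp -f input.vcf -p "$parg"
```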

Usage:

To run the tool, the user needs to provide a VCF file along with either an OMIM ID of the disease or a list of phenotypes (HPO or MPO terms).

a) Prioritize disease-causing variants using an OMIM ID:

bin/phenomenet-vp -f data/Miller.vcf -o OMIM:263750

b) Prioritize digenic disease-causing variants using an OMIM ID and gene-to-phenotype data from human studies only:

bin/phenomenet-vp -f data/Miller.vcf -o OMIM:263750 --human --digenic

c) Prioritize disease-causing variants using a set of phenotypes and a recessive inheritance mode:

bin/phenomenet-vp -f data/Miller.vcf -p HP:0000007,HP:0000028,HP:0000054,HP:0000077,HP:0000175 -i recessive 

The result file will be written to the directory containing the input file. The output file has the same name as the input file with a .res extension. For digenic, trigenic, or oligogenic prioritization, the result file will have a .digenic, .trigenic, or .oligogenic extension, respectively.
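
One reading of that naming convention, sketched in shell (assuming the new extension is appended to the full input filename; paths are examples):

```shell
# Results are written next to the input, with an extra extension appended
input=data/Miller.vcf
echo "${input}.res"        # e.g. single-variant results: data/Miller.vcf.res
echo "${input}.digenic"    # e.g. digenic ranking:        data/Miller.vcf.digenic
```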

Analysis of Rare Variants:

To analyze rare variants effectively, it is strongly recommended to filter the input VCF file by minor allele frequency (MAF) before running phenomenet-vp on it. To do so, follow the instructions below:

a) Install VCFtools.

b) Run the following command using VCFtools on your input VCF file to filter out variants with MAF > 1%:

vcftools --vcf input_file.vcf --recode --max-maf 0.01 --out filtered

c) Run PVP on the output file filtered.recode.vcf generated from the command above.

PVP 1.0

The original random-forest-based PVP tool is available to download here along with its required data files here. The prepared set of exomes and genomes used for the analysis and results are provided here.

DeepPVP

The updated neural-network model, DeepPVP is available to download here along with its required data files here. The prepared set of exomes used for the analysis and comparative results are provided here. The comparison with PVP is based on PVP-1.1 available here along with its required data files here.

OligoPVP

OligoPVP is provided as part of the DeepPVP tool, using the parameters --digenic, --trigenic, and --oligogenic for ranking candidate disease-causing variant pairs and triples. Our prepared set of synthetic genomes with digenic combinations is available here, using data from the DIgenic diseases DAtabase (DIDA). The comparison results with other methods are also provided. Results were obtained using DeepPVP v2.0.

People

PVP is jointly developed by researchers at the University of Birmingham (Prof George Gkoutos and his team), University of Cambridge (Dr Paul Schofield and his team), and King Abdullah University of Science and Technology (Prof Vladimir Bajic, Robert Hoehndorf, and teams).

Publications

[1] Boudellioua I, Mahamad Razali RB, Kulmanov M, Hashish Y, Bajic VB, Goncalves-Serra E, Schoenmakers N, Gkoutos GV, Schofield PN, and Hoehndorf R. (2017) Semantic prioritization of novel causative genomic variants. PLOS Computational Biology. https://doi.org/10.1371/journal.pcbi.1005500

[2] Boudellioua I, Kulmanov M, Schofield PN, Gkoutos GV, and Hoehndorf R. (2018) OligoPVP: Phenotype-driven analysis of individual genomic information to prioritize oligogenic disease variants. Scientific Reports. https://doi.org/10.1038/s41598-018-32876-3

[3] Boudellioua I, Kulmanov M, Schofield PN, Gkoutos GV, and Hoehndorf R. (2019) DeepPVP: phenotype-based prioritization of causative variants using deep learning. BMC Bioinformatics. https://doi.org/10.1186/s12859-019-2633-8

License

Copyright (c) 2016-2018, King Abdullah University of Science and Technology
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright
   notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
   notice, this list of conditions and the following disclaimer in the
   documentation and/or other materials provided with the distribution.
3. All advertising materials mentioning features or use of this software
   must display the following acknowledgment:
   This product includes software developed by the King Abdullah University
   of Science and Technology.
4. Neither the name of the King Abdullah University of Science and Technology
   nor the names of its contributors may be used to endorse or promote products
   derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY KING ABDULLAH UNIVERSITY OF SCIENCE AND TECHNOLOGY
''AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, 
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL KING ABDULLAH UNIVERSITY OF SCIENCE AND TECHNOLOGY 
BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL 
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

phenomenet-vp's People

Contributors

coolmaksat, dependabot[bot], imane-boudell, imene88, leechuck


phenomenet-vp's Issues

Building docker fails

Docker build fails

root@ar-builder:/home/debian/phenomenet-vp# docker build -t phenomenet-vp .
[+] Building 0.8s (8/11)
 => [internal] load build definition from Dockerfile    0.0s
 => => transferring dockerfile: 779B                    0.0s
 => [internal] load .dockerignore                       0.0s
 => => transferring context: 67B                        0.0s
 => [internal] load metadata for docker.io/omahoco1/alpine-java-python:latest    0.3s
 => [1/8] FROM docker.io/omahoco1/alpine-java-python@sha256:fc6c9b9f8742e5da5aa1206abbdd3825fbfb52473adfddfb985822e3d89ebb87    0.0s
 => [internal] load build context                       0.0s
 => => transferring context: 4.20kB                     0.0s
 => CACHED [2/8] RUN mkdir /tmp/phenomenet-vp           0.0s
 => CACHED [3/8] WORKDIR /tmp/phenomenet-vp             0.0s
 => ERROR [4/8] RUN curl -L https://downloads.gradle.org/distributions/gradle-4.10.2-bin.zip -o gradle.zip &&   mkdir /opt/gradle &&   unzip -d /opt/gradle gradle.zip &&   rm -rf *    0.5s
------
 > [4/8] RUN curl -L https://downloads.gradle.org/distributions/gradle-4.10.2-bin.zip -o gradle.zip &&   mkdir /opt/gradle &&   unzip -d /opt/gradle gradle.zip &&   rm -rf *:
#0 0.287 /bin/sh: curl: not found
------
Dockerfile:11
--------------------
  10 |     # Install gradle
  11 | >>> RUN curl -L https://downloads.gradle.org/distributions/gradle-4.10.2-bin.zip -o gradle.zip && \
  12 | >>>   mkdir /opt/gradle && \
  13 | >>>   unzip -d /opt/gradle gradle.zip && \
  14 | >>>   rm -rf *
  15 |
--------------------
ERROR: failed to solve: process "/bin/sh -c curl -L https://downloads.gradle.org/distributions/gradle-4.10.2-bin.zip -o gradle.zip &&   mkdir /opt/gradle &&   unzip -d /opt/gradle gradle.zip &&   rm -rf *" did not complete successfully: exit code: 127
root@ar-builder:/home/debian/phenomenet-vp#

Stuck when almost done: it blocks when "Finished Annotation" appears

Hello,

The tool performs pretty well (except for the "SEVERE: java.nio.file.NoSuchFileException: data/omim_mode.txt" error, which I solved by looking at the other issues), but when "Finished Annotation" appears it gets stuck for more than 12 hours on the same sample without doing anything (a Java process holding 40 GB of RAM keeps running the whole time).
cmd line is:
for F in *.vcf ; do phenomenet-vp -f "$F" -p HP:"1",etc... -i recessive -d -og --outfile "$F".out ; done
System is: Ubuntu 14, Java 8, Python 2.7, 48 GB RAM.
When I kill the process, I can see that it generated populated .digenic, .top, and .res files and an empty .oligogenic file.
Is this normal?

Error running the sample

I was running the samples after installing as described in the GitHub readme and I had this error:
SEVERE: java.lang.ArrayIndexOutOfBoundsException: 9
at sa.edu.kaust.Classification.toCSV(Classification.java:577)
at sa.edu.kaust.Main.runTool(Main.java:227)
at sa.edu.kaust.Main.run(Main.java:152)
at sa.edu.kaust.Main.main(Main.java:258)

It was caused by a carriage return before the last column on some of the lines in the processed Pfeiffer.vcf.out file; the root cause is an incorrectly escaped "\r" in Annotations.java.

Then, while compiling, I had some problems because of missing dependencies; I found versions that compiled and added them to a patch.

Also, the Gradle versioning is not coherent with the tag in Git: v2.0 has v1.1 in build.gradle. I did not change it, and it is not in the patch file attached.

Best wishes,
Victor

Patch:
patch_final.txt

I also found other cases with \; I am not sure whether it might be affecting the results somewhere else.
Thank you.

Giving error related to addition tools

Hi,

was trying to run the tool with the command
/bin/phenomenet-vp -f Swena_das_hardfiltered.vcf -p HP:0000821,HP:0000851,HP:0002925,HP:0005990,HP:0011791 -h -of

but the error is showing the following

Missing dependencies required to run the tool. Please refer to the README file for more information.

I have tried installing the dependencies with "pip install -r requirements.txt", but it is not working. I need help with this installation: is it a separate package, or are there specific requirements?

SEVERE: java.nio.file.NoSuchFileException: data/diseasephenotypes.txt

Hi there,

I am getting this message while running phenomenet. I have all the files listed on the requirements page, but when I try to run it, I get the error below. The file it is complaining about is already there.

user@pc:/media/daruma/sea6/pheno$ /media/daruma/sea6/pheno/phenomenet-vp-2.1/bin/phenomenet-vp -f /media/daruma/sea6/pheno/TSVC_variants_IonXpress_003.vcf --outfile 26682.vcf
Aug 02, 2018 1:00:34 PM sa.edu.kaust.Main runTool
INFO: Initializing the model
/media/daruma/sea6/pheno/phenomenet-vp-2.1/
Aug 02, 2018 1:00:34 PM sa.edu.kaust.Main runTool
SEVERE: java.nio.file.NoSuchFileException: data/diseasephenotypes.txt
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
	at java.nio.file.Files.newByteChannel(Files.java:361)
	at java.nio.file.Files.newByteChannel(Files.java:407)
	at java.nio.file.spi.FileSystemProvider.newInputStream(FileSystemProvider.java:384)
	at java.nio.file.Files.newInputStream(Files.java:152)
	at java.nio.file.Files.newBufferedReader(Files.java:2784)
	at java.nio.file.Files.newBufferedReader(Files.java:2816)
	at sa.edu.kaust.Main.loadDiseasePhenotypes(Main.java:100)
	at sa.edu.kaust.Main.runTool(Main.java:213)
	at sa.edu.kaust.Main.run(Main.java:160)
	at sa.edu.kaust.Main.main(Main.java:275)

Get a OutOfMemoryError

Hi,

When I run the tool with command
bin/phenomenet-vp -f data/Miller.vcf -o OMIM:263750

I get this error the following

Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.regex.Pattern.compile(Pattern.java:1688)
at java.util.regex.Pattern.(Pattern.java:1351)
at java.util.regex.Pattern.compile(Pattern.java:1028)
at java.lang.String.replaceAll(String.java:2223)
at sa.edu.kaust.Annotations.getAnnotations(Annotations.java:68)
at sa.edu.kaust.Main.runTool(Main.java:302)
at sa.edu.kaust.Main.run(Main.java:225)
at sa.edu.kaust.Main.main(Main.java:343)

My computer has met the hardware requirements. I can't understand this error.

DeepPVP stalls on annotation step

When running DeepPVP, the program stalls at the annotation step. Any ideas what may be causing this? The program just continues to output the message listed below about 100 times and runs until walltime runs out. Thanks in advance!

INFO: Starting annotation
java.lang.NumberFormatException: For input string: "NA"
        at sun.misc.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:2043)
        at sun.misc.FloatingDecimal.parseDouble(FloatingDecimal.java:110)
        at java.lang.Double.parseDouble(Double.java:538)
        at sa.edu.kaust.Annotations$1.apply(Annotations.java:206)
        at sa.edu.kaust.Annotations$1.apply(Annotations.java:121)
        at java.util.Arrays.lambda$parallelSetAll$0(Arrays.java:4718)
        at java.util.stream.ForEachOps$ForEachOp$OfInt.accept(ForEachOps.java:205)
        at java.util.stream.Streams$RangeIntSpliterator.forEachRemaining(Streams.java:110)
        at java.util.Spliterator$OfInt.forEachRemaining(Spliterator.java:693)
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
        at java.util.stream.ForEachOps$ForEachTask.compute(ForEachOps.java:291)
        at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:731)
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
        at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
        at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)

Cannot find files in data/ folder

Hi, I'm trying to use phenomenet-vp, however I'm running into an error.

I extracted the archive data-v2.1.tar.gz inside the folder phenomenet-vp-2.1/, but when I try running the command:

/<path>/phenomenet-vp-2.1/bin/phenomenet-vp -f /<path>/phenomenet-vp-2.1/data/Miller.vcf -o OMIM:263750 --human --digenic

I get this error:

INFO: Initializing the model
Apr 29, 2022 12:13:57 PM sa.edu.kaust.Main runTool
SEVERE: java.nio.file.NoSuchFileException: data/omim_mode.txt

I also tried to move the file omim_mode.txt in the following folders:

/<path>/phenomenet-vp-2.1/data/omim_mode.txt
/<path>/phenomenet-vp-2.1/data-v2.1/omim_mode.txt
/<path>/phenomenet-vp-2.1/data-v2.1/data/omim_mode.txt
/<path>/phenomenet-vp-2.1/omim_mode.txt

However I get the same error.
What would be the precise folder architecture I should use?
Is there a way to specify the path to the data/ folder?

Thanks.

generate.sh: possibly improvement

Hi,
I had some problems unzipping the file whole_genome_SNVs_inclAnno.tsv.gz to be used in a later step of the generate.sh script. I suggest skipping the intermediate file by using a pipe.

generate.sh lines 1-2 (current):

  • gunzip whole_genome_SNVs_inclAnno.tsv.gz
  • awk -v OFS='\t' '{print $1,$2,$3,$5,$10,$11,$96,$116}' whole_genome_SNVs_inclAnno.tsv > cadd.txt

Suggested replacement:

  • gunzip -c whole_genome_SNVs_inclAnno.tsv.gz | awk -v OFS='\t' '{print $1,$2,$3,$5,$10,$11,$96,$116}' > cadd.txt
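
The equivalence of the piped form can be checked on a toy file (toy column indices here; the real script selects columns up to $116):

```shell
# Verify on toy data that gunzip -c | awk matches decompress-then-awk
printf '1\t2\t3\t4\t5\n6\t7\t8\t9\t10\n' > toy.tsv
gzip -c toy.tsv > toy.tsv.gz
gunzip -c toy.tsv.gz | awk -v OFS='\t' '{print $1,$3,$5}' > piped.out
awk -v OFS='\t' '{print $1,$3,$5}' toy.tsv > direct.out
cmp -s piped.out direct.out && echo "identical"
```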

Error on run using NA12878 1000 genomes vcf

Getting an error using the NA12878 1000 Genomes VCF file. The tool works with the provided example file data/Miller.vcf but throws an error otherwise. Any suggestions?

ftp://ftp.1000genomes.ebi.ac.uk/vol1/ftp/data_collections/1000_genomes_project/release/20190312_biallelic_SNV_and_INDEL/ALL.chr20.shapeit2_integrated_snvindels_v2a_27022019.GRCh38.phased.vcf.gz

---------------------------------------------------------------
Inferences
---------------------------------------------------------------
Inferring ancestors...
Checking Treatment coherency
Incoherencies : 0
Inferring descendants...
Checking Treatment coherency
Incoherencies : 0
Inferring Conceptual Leaves...
---------------------------------------------------------------
Engine initialized
================================================================
Feb 10, 2020 7:45:52 PM sa.edu.kaust.Main runTool
INFO: Computing similarities
---------------------------------------------------------------
computing IC ResnikIC
---------------------------------------------------------------
Class name slib.sml.sm.core.metrics.ic.annot.IC_annot_resnik_1995_Normalized
Checking null or infinite in the ICs computed
ic ResnikIC computed
---------------------------------------------------------------
Feb 10, 2020 7:46:17 PM sa.edu.kaust.Main runTool                                                                                                                                                                  
INFO: Getting top level phenotypes                                                                                                                                                                                 
Feb 10, 2020 7:46:17 PM sa.edu.kaust.Main runTool
INFO: Starting annotation
java.lang.NumberFormatException: For input string: "NA"
        at sun.misc.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:2043)
        at sun.misc.FloatingDecimal.parseDouble(FloatingDecimal.java:110)
        at java.lang.Double.parseDouble(Double.java:538)
        at sa.edu.kaust.Annotations$1.apply(Annotations.java:206)
        at sa.edu.kaust.Annotations$1.apply(Annotations.java:121)
        at java.util.Arrays.lambda$parallelSetAll$0(Arrays.java:4718)
        at java.util.stream.ForEachOps$ForEachOp$OfInt.accept(ForEachOps.java:204)
        at java.util.stream.Streams$RangeIntSpliterator.forEachRemaining(Streams.java:110)
        at java.util.Spliterator$OfInt.forEachRemaining(Spliterator.java:693)
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
        at java.util.stream.ForEachOps$ForEachTask.compute(ForEachOps.java:290)
        at java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:731)
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
        at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
        at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
