
datalad-datasets / human-connectome-project-openaccess


WU-Minn HCP1200 Data: 3T/7T MR scans from healthy young adult twins and non-twin siblings (ages 22-35) [T1w, T2w, resting-state and task fMRI, high angular resolution dMRI]

Home Page: https://db.humanconnectome.org/data/projects/HCP_1200


human-connectome-project-openaccess's Issues

missing data in annex?

Hi -- thank you for this awesome resource! I've downloaded a bunch of the data successfully, but I'm running into one participant that may not be in the annex (sub-171532). I get the same errors below for all of the GAMBLING and SOCIAL data. Thanks for any help!

tug87422@cla18994 /data/projects/human-connectome-project-openaccess $ datalad get HCP1200/171532/MNINonLinear/Results/tfMRI_GAMBLING_LR/tfMRI_GAMBLING_LR.nii.gz
[ERROR  ] Failed to clone from any candidate source URL. Encountered errors per each url were:                                                                              
| - http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear
  CommandError: 'git clone --progress http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear /data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear' failed with exitcode 128 [err: 'Cloning into '/data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear'...
fatal: repository 'http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear/' not found']
- http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear/.git
  CommandError: 'git clone --progress http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear/.git /data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear' failed with exitcode 128 [err: 'Cloning into '/data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear'...
fatal: repository 'http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear/.git/' not found'] [install(/data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear)] 
install(error): /data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear (dataset) [Failed to clone from any candidate source URL. Encountered errors per each url were:
- http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear
  CommandError: 'git clone --progress http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear /data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear' failed with exitcode 128 [err: 'Cloning into '/data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear'...
fatal: repository 'http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear/' not found']
- http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear/.git
  CommandError: 'git clone --progress http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear/.git /data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear' failed with exitcode 128 [err: 'Cloning into '/data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear'...
fatal: repository 'http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear/.git/' not found']]
get(impossible): /data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear/Results/tfMRI_GAMBLING_LR/tfMRI_GAMBLING_LR.nii.gz [path does not exist]
[ERROR  ] Failed to clone from any candidate source URL. Encountered errors per each url were:                                                                              
| - http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear
  CommandError: 'git clone --progress http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear /data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear' failed with exitcode 128 [err: 'Cloning into '/data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear'...
fatal: repository 'http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear/' not found']
- http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear/.git
  CommandError: 'git clone --progress http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear/.git /data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear' failed with exitcode 128 [err: 'Cloning into '/data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear'...
fatal: repository 'http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear/.git/' not found'] [install(/data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear)] 
install(error): /data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear (dataset) [Failed to clone from any candidate source URL. Encountered errors per each url were:
- http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear
  CommandError: 'git clone --progress http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear /data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear' failed with exitcode 128 [err: 'Cloning into '/data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear'...
fatal: repository 'http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear/' not found']
- http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear/.git
  CommandError: 'git clone --progress http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear/.git /data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear' failed with exitcode 128 [err: 'Cloning into '/data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear'...
fatal: repository 'http://store.datalad.org/ee9/c7ade-2870-11ea-9248-0025904abcb0/MNINonLinear/.git/' not found']]
get(impossible): /data/projects/human-connectome-project-openaccess/HCP1200/171532/MNINonLinear/Results/tfMRI_GAMBLING_LR/tfMRI_GAMBLING_LR.nii.gz [path does not exist]
action summary:
  get (impossible: 2)
  install (error: 2)
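
For anyone debugging a failure like this, one minimal first step (not from the thread; a hedged sketch) is to inspect the clone URL the superdataset has recorded for the failing subdataset and compare it against the URL in the error:

  # Run from inside HCP1200/171532; shows what is registered for the
  # MNINonLinear subdataset in .gitmodules
  git config -f .gitmodules --get-regexp '^submodule\.MNINonLinear\.'

If the recorded URL matches the one in the error, the repository is genuinely absent (or moved) on the store side rather than misconfigured locally.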

datalad get keeps asking for keyring password

When I run datalad get for the first time, it asks me to enter the AWS id/key as well as my python-keyring password.

brlife@wrangler2:/mnt/datalad/datasets.datalad.org/hcp-openaccess/HCP1200/100206/T1w$ datalad get ribbon.nii.gz
You need to authenticate with 'hcp-s3' credentials. https://wiki.humanconnectome.org/display/PublicData/Connecting+to+Connectome+Data+via+AWS provides information on how to gain access
key_id: XXXXXXXXXXXXXXX

Please set a password for your new keyring: 
Please confirm the password: 
You need to authenticate with 'hcp-s3' credentials. https://wiki.humanconnectome.org/display/PublicData/Connecting+to+Connectome+Data+via+AWS provides information on how to gain access
secret_id: 
get(ok): ribbon.nii.gz (file) [from datalad...]   

When I download another file, it doesn't ask me for AWS id/key, but it asks me for the python keyring password.

brlife@wrangler2:/mnt/datalad/datasets.datalad.org/hcp-openaccess/HCP1200/100206/T1w$ datalad get T2w_acpc_dc.nii.gz
Please enter password for encrypted keyring: 
get(ok): T2w_acpc_dc.nii.gz (file) [from datalad...]     

Basically, I need to enter the keyring password every time I run datalad get. I am running datalad get as part of a backend service for brainlife (to cache the data on behalf of all of our users), so I need to provide the password via a non-interactive method. Is there a way to do that?

I see this in the README.

you will be asked to supply your AWS credentials the first time you use datalad get to retrieve file content of your choice from the HCP Open Access dataset. You should only need to provide credentials once, and all subsequent datalad get commands will retrieve data without asking for them again.

Maybe there is a way to disable the passphrase for the keyring? The server on which I run datalad get can only be accessed by our backend service, so there is "some" level of trust; not having a keyring password is probably not the worst thing I am doing.

Do you have any suggestion?
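
One hedged workaround sketch: DataLad can read credentials from environment variables (the same mechanism tried in a later issue below), which bypasses the keyring prompt entirely:

  # Placeholder values; export once in the service environment
  export DATALAD_hcp_s3_key_id=<your-key-id>
  export DATALAD_hcp_s3_secret_id=<your-secret>
  datalad get ribbon.nii.gz

  # Alternative assumption: if the keyrings.alt package is installed, a
  # plaintext keyring backend avoids the passphrase prompt (credentials
  # are then stored unencrypted, so only do this on a trusted host):
  # export PYTHON_KEYRING_BACKEND=keyrings.alt.file.PlaintextKeyring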

Make a release and thus Zenodo record?

I think this DataLad dataset is worth its own citeable reference. I have already turned on the Zenodo integration for it, but did not dare to mint any release. I guess the version tag could either be completely unrelated to HCP (e.g. 1.0.0, or date-based, e.g. 0.20210105.0), or be based on the HCP release it incorporates (e.g. 1200.0.0). WDYT?
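
A minimal sketch of what minting such a release could look like, using the date-based scheme as an example (note: it is the GitHub release created from the tag, not the tag push itself, that triggers the Zenodo archiving):

  # Tag the current state and push the tag; then create a GitHub
  # release from it so the Zenodo integration picks it up
  git tag -a 0.20210105.0 -m "HCP1200 DataLad dataset, 2021-01-05 snapshot"
  git push origin 0.20210105.0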

Ask for the preprocessed 7T dataset

Dear experts,

There is only the unpreprocessed 7T dataset at datalad-datasets/hcp-movies and datalad-datasets/human-connectome-project-openaccess; is it possible to get the preprocessed 7T dataset with DataLad?

Best wishes,
Peng

FYI: some stats after extracting metadata (no content metadata)

D'oh (for myself): this repo (dataset) is pure git, not git-annex

I extracted all metadata (without any content metadata - just the base core extractors, DataLad 0.12.5.dev19), and it all went into git.

FWIW some stats

  • totals: 5.8 GB of .datalad/metadata
  • .git/objects is 1.6 GB
  • the majority is ds-* files (~4.9 GB); they didn't get xz'ed
  • the majority of the ds-* size seems to be due to annex.filename, which is a list of lists of filenames
  • those ds-* files compress ~8x if we do xz them
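
A rough sketch of how numbers like these can be reproduced, assuming the ds-* files live under .datalad/metadata:

  # Sizes of the metadata tree and the git object store
  du -sh .datalad/metadata .git/objects
  # Compress copies of the ds-* files (-k keeps the originals) and
  # compare sizes to estimate the ~8x ratio
  find .datalad/metadata -name 'ds-*' -exec xz -9 -k {} +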

Rename to hcp-openaccess

To stay consistent with the bucket name, and to be considerably shorter. HCP is a known acronym in the field anyway.
I am not sure whether -openaccess is of any value; it may actually be misleading, since according to the OSI definition of open, this dataset is not open. But for consistency's sake it could be kept.

HCP1200 SmoothedMyelin*gii

Hi all,

this is a directory request in datalad-datasets :)

Could you please make a dir for the smoothed myelin maps from HCP1200 for me?

This is what I would like to get per subject:

<SUBJID>/MNINonLinear/fsaverage_LR32k/<SUBJID>.L.SmoothedMyelinMap.32k_fs_LR.func.gii
<SUBJID>/MNINonLinear/fsaverage_LR32k/<SUBJID>.R.SmoothedMyelinMap.32k_fs_LR.func.gii

Alternatively, feel free to point me to how to do it myself.

Thanks in advance,
Şeyma.
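
For reference, a minimal do-it-yourself sketch (subject 100206 is just an example ID): datalad get installs the intermediate subdatasets on demand, so the two files can be requested directly from a clone of the superdataset:

  datalad clone https://github.com/datalad-datasets/human-connectome-project-openaccess.git
  cd human-connectome-project-openaccess
  datalad get HCP1200/100206/MNINonLinear/fsaverage_LR32k/100206.L.SmoothedMyelinMap.32k_fs_LR.func.gii
  datalad get HCP1200/100206/MNINonLinear/fsaverage_LR32k/100206.R.SmoothedMyelinMap.32k_fs_LR.func.gii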

Datalad dataset of preprocessed ER and ToM

Hello you great Datalad-magicians ✨,

I would very much appreciate one Datalad dataset each of the preprocessed Emotion Recognition & Theory of Mind/Social Cognition task data from the WU-Minn HCP1200 dataset.

The paths would be the following:
Directories for Emotion Recognition

  • /MNINonLinear/Results/tfMRI_EMOTION_*/EVs/*.txt
  • /MNINonLinear/Results/tfMRI_EMOTION_*/Movement_Regressors.txt
  • /MNINonLinear/Results/tfMRI_EMOTION_*/tfMRI_EMOTION_??.nii.gz
  • /MNINonLinear/Results/tfMRI_EMOTION_*/tfMRI_EMOTION_??_SBRef.nii.gz

Directories for Theory of Mind / Social Cognition

  • /MNINonLinear/Results/tfMRI_SOCIAL_*/EVs/*.txt
  • /MNINonLinear/Results/tfMRI_SOCIAL_*/Movement_Regressors.txt
  • /MNINonLinear/Results/tfMRI_SOCIAL_*/tfMRI_SOCIAL_??.nii.gz
  • /MNINonLinear/Results/tfMRI_SOCIAL_*/tfMRI_SOCIAL_??_SBRef.nii.gz

Thank you very much 💗
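
A hedged sketch for fetching these per subject (100206 as an example ID); the shell globs only expand once the subject's subdatasets are installed, hence the install step first:

  # Install the subject's subdatasets without downloading file content
  datalad get -n -r HCP1200/100206
  # Annexed files now exist as placeholder symlinks, so the globs expand
  datalad get HCP1200/100206/MNINonLinear/Results/tfMRI_EMOTION_*/EVs/*.txt
  datalad get HCP1200/100206/MNINonLinear/Results/tfMRI_SOCIAL_*/tfMRI_SOCIAL_??.nii.gz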

Missing data from 3 subjects

Hi! Similar to #8, I found that the denoised volumetric rsfMRI files are missing for 3 subjects here, which are available directly from ConnectomeDB. These are:

193239/MNINonLinear/Results/rfMRI_REST2_RL/rfMRI_REST2_LR_hp2000_clean.nii.gz
467351/MNINonLinear/Results/rfMRI_REST1_RL/rfMRI_REST1_RL_hp2000_clean.nii.gz
705341/MNINonLinear/Results/rfMRI_REST1_LR/rfMRI_REST1_LR_hp2000_clean.nii.gz

Thanks!

HCP 7T data now available on S3

The full up-to-date 7T data was recently added to their S3 bucket:

News from the Human Connectome Project (HCP)
November 17, 2021
We are pleased to announce the availability of all 7T data for 184 HCP-Young Adult (HCP-YA) subjects on Amazon Web Services (AWS) Simple Storage Service (S3). HCP has continued our partnership with the AWS Open Data Sponsorship Program to offer storage and access to all HCP imaging data on Amazon S3.

Just wanted to note this update somewhere in case not everyone knows - it would be wonderful to be able to access it via DataLad with the rest of the dataset in the future!

Missing data 2 subjects (200614, 205119: MNINonLinear/Results/rfMRI_REST1_LR)

It seems that the rfMRI_REST1_LR.nii.gz files (in the /human-connectome-project-openaccess/HCP1200/*/MNINonLinear/Results/rfMRI_REST1_LR folder) are missing for 2 subjects in my sample. According to the HCP website, this data should be available for these subjects. Is there a way to add this data? Subject IDs are 200614 and 205119. Thanks a lot!

datalad get "Unable to access these remotes"

Hello,

I can't access the data with datalad get. I get either a "Failed to download from any of 2 locations" or "Unable to access these remotes" message.

I'm on Ubuntu and first tried the datalad version available via apt. Here are my output and my datalad wtf:

datalad-wtf-ubuntu-prompt.txt
datalad-wtf-ubuntu.txt

I also tried to install datalad in a virtual environment to have the latest version of datalad.

datalad-wtf-venv-prompt.txt
datalad-wtf-venv.txt

Since datalad get never asked for my credentials, I also tried to put them directly on the command line, but I get the same messages:

DATALAD_hcp_s3_key_id=<key> DATALAD_hcp_s3_secret_id=<secret> datalad get ...

Potentially unavailable files after November 2021 update

I have been updating the subsampled datasets that derive from this large dataset and can also be found under this organization. This brought to light a number of files in the dataset that can't be retrieved. The number is quite small compared to the overall number of files, but worth investigating. We should make sure that these files were indeed removed from the bucket and remove them from the datasets too, or, if they actually are available, figure out what went wrong with adding their URLs.

There is a single one in datalad-datasets/hcp_smoothedmyelin#2 (this is a newly added file):

549757/MNINonLinear/fsaverage_LR32k/549757.R.SmoothedMyelinMap.32k_fs_LR.func.gii

In https://github.com/datalad-datasets/hcp-functional-connectivity there seem to be some systematic failures:

118225/MNINonLinear/Results/rfMRI_REST1_7T_PA/Movement_AbsoluteRMS.txt
118225/MNINonLinear/Results/rfMRI_REST1_7T_PA/Movement_AbsoluteRMS_mean.txt 
118225/MNINonLinear/Results/rfMRI_REST1_7T_PA/Movement_Regressors.txt 
118225/MNINonLinear/Results/rfMRI_REST1_7T_PA/Movement_Regressors_dt.txt 
118225/MNINonLinear/Results/rfMRI_REST1_7T_PA/Movement_RelativeRMS.txt 
118225/MNINonLinear/Results/rfMRI_REST1_7T_PA/Movement_RelativeRMS_mean.txt 
118225/MNINonLinear/Results/rfMRI_REST1_7T_PA/rfMRI_REST1_7T_PA_Atlas_stats.txt 
118225/MNINonLinear/Results/rfMRI_REST1_7T_PA/rfMRI_REST1_7T_PA_CSF.txt 
118225/MNINonLinear/Results/rfMRI_REST1_7T_PA/rfMRI_REST1_7T_PA_WM.txt
118225/MNINonLinear/Results/rfMRI_REST2_7T_AP/Movement_AbsoluteRMS.txt 
118225/MNINonLinear/Results/rfMRI_REST2_7T_AP/Movement_AbsoluteRMS_mean.txt
118225/MNINonLinear/Results/rfMRI_REST2_7T_AP/Movement_Regressors.txt
118225/MNINonLinear/Results/rfMRI_REST2_7T_AP/Movement_Regressors_dt.txt 
118225/MNINonLinear/Results/rfMRI_REST2_7T_AP/Movement_RelativeRMS.txt
118225/MNINonLinear/Results/rfMRI_REST2_7T_AP/Movement_RelativeRMS_mean.txt
118225/MNINonLinear/Results/rfMRI_REST2_7T_AP/rfMRI_REST2_7T_AP_Atlas_stats.txt 
118225/MNINonLinear/Results/rfMRI_REST2_7T_AP/rfMRI_REST2_7T_AP_CSF.txt
118225/MNINonLinear/Results/rfMRI_REST2_7T_AP/rfMRI_REST2_7T_AP_WM.txt
118225/MNINonLinear/Results/rfMRI_REST3_7T_PA/Movement_AbsoluteRMS.txt
118225/MNINonLinear/Results/rfMRI_REST3_7T_PA/Movement_AbsoluteRMS_mean.txt 
118225/MNINonLinear/Results/rfMRI_REST3_7T_PA/Movement_Regressors.txt
118225/MNINonLinear/Results/rfMRI_REST3_7T_PA/Movement_Regressors_dt.txt
118225/MNINonLinear/Results/rfMRI_REST3_7T_PA/Movement_RelativeRMS.txt 
118225/MNINonLinear/Results/rfMRI_REST3_7T_PA/Movement_RelativeRMS_mean.txt
118225/MNINonLinear/Results/rfMRI_REST3_7T_PA/rfMRI_REST3_7T_PA_Atlas_stats.txt 
118225/MNINonLinear/Results/rfMRI_REST3_7T_PA/rfMRI_REST3_7T_PA_CSF.txt
118225/MNINonLinear/Results/rfMRI_REST3_7T_PA/rfMRI_REST3_7T_PA_WM.txt 
118225/MNINonLinear/Results/rfMRI_REST4_7T_AP/Movement_AbsoluteRMS.txt 
118225/MNINonLinear/Results/rfMRI_REST4_7T_AP/Movement_AbsoluteRMS_mean.txt 
118225/MNINonLinear/Results/rfMRI_REST4_7T_AP/Movement_Regressors.txt
118225/MNINonLinear/Results/rfMRI_REST4_7T_AP/Movement_Regressors_dt.txt
118225/MNINonLinear/Results/rfMRI_REST4_7T_AP/Movement_RelativeRMS.txt 
118225/MNINonLinear/Results/rfMRI_REST4_7T_AP/Movement_RelativeRMS_mean.txt 
118225/MNINonLinear/Results/rfMRI_REST4_7T_AP/rfMRI_REST4_7T_AP_Atlas_stats.txt 
118225/MNINonLinear/Results/rfMRI_REST4_7T_AP/rfMRI_REST4_7T_AP_CSF.txt
118225/MNINonLinear/Results/rfMRI_REST4_7T_AP/rfMRI_REST4_7T_AP_WM.txt 

I see this pattern of files without a known copy for a few, but not all subjects in the dataset. A few example subjects where the failure occurs are 118225 and 782561. The subjects where I don't see it do not seem to contain these files in the first place. One example is subject 987074. Does any of this ring a bell? Ping for awareness @mih @loj
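
One way to verify (hedged sketch): from inside an affected subdataset, ask git-annex which remotes claim a copy of one of these files; entries that were truly removed from the bucket report no known copies:

  cd 118225/MNINonLinear
  git annex whereis Results/rfMRI_REST1_7T_PA/Movement_Regressors.txt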

HCPS1200 surface based myelin maps

Hello hello,

would it be possible to get the hcp myelin maps in a new repo?

It should be something similar to the path below in HCP:

<SUBJID>/MNINonLinear/fsaverage_LR32k/<SUBJID>.MyelinMap_BC.32k_fs_LR.dscalar.nii

Thanks in advance!
Şeyma.

Few participant datasets broken

A previously undiscovered download error for 7 participants will require an update to the dataset for those. Affected:

  • 675661
  • 680250
  • 680452
  • 683256
  • 685058
  • 686969
  • 687163

Incomplete data for 3 subjects

Specifically 150928, 186444, 188347. These subjects each have only one subdataset (MNINonLinear), but T1w and unprocessed should also be present (as checked with a local datalad ls -Lr).
I'm currently regenerating the tables and hope to add the missing subjects soon.
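
A hedged sketch of the check for one affected subject, listing which subdatasets are actually registered:

  datalad subdatasets -d HCP1200/150928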

Link training materials

There is a plan to create slides for working with the HCP data on HPC systems. Once they exist, we should link them here.

Failed to clone from all attempted sources

I might be missing some step, but I just datalad install-ed this repo and tried to datalad install / get one subject. I couldn't get very far, as I am seeing these error messages.

brlife@wrangler:/mnt/datalad/human-connectome-project-openaccess$ datalad get -r HCP1200
[INFO   ] Installing <Dataset path=/mnt/datalad/human-connectome-project-openaccess> underneath /mnt/datalad/human-connectome-project-openaccess/HCP1200 recursively 
[ERROR  ] Failed to clone from all attempted sources: ['https://github.com/datalad-datasets/human-connectome-project-openaccess.git/HCP1200/100206', 'https://github.com/datalad-datasets/human-connectome-project-openaccess.git/HCP1200/100206/.git', 'ria+http://store.datalad.org#346a3ae0-2c2e-11ea-a27d-002590496000'] [install(/mnt/datalad/human-connectome-project-openaccess/HCP1200/100206)] 
install(error): /mnt/datalad/human-connectome-project-openaccess/HCP1200/100206 (dataset) [Failed to clone from all attempted sources: ['https://github.com/datalad-datasets/human-connectome-project-openaccess.git/HCP1200/100206', 'https://github.com/datalad-datasets/human-connectome-project-openaccess.git/HCP1200/100206/.git', 'ria+http://store.datalad.org#346a3ae0-2c2e-11ea-a27d-002590496000']]
[ERROR  ] Failed to clone from all attempted sources: ['https://github.com/datalad-datasets/human-connectome-project-openaccess.git/HCP1200/100307', 'https://github.com/datalad-datasets/human-connectome-project-openaccess.git/HCP1200/100307/.git', 'ria+http://store.datalad.org#a51b84fc-2c2d-11ea-9359-0025904abcb0'] [install(/mnt/datalad/human-connectome-project-openaccess/HCP1200/100307)] 
install(error): /mnt/datalad/human-connectome-project-openaccess/HCP1200/100307 (dataset) [Failed to clone from all attempted sources: ['https://github.com/datalad-datasets/human-connectome-project-openaccess.git/HCP1200/100307', 'https://github.com/datalad-datasets/human-connectome-project-openaccess.git/HCP1200/100307/.git', 'ria+http://store.datalad.org#a51b84fc-2c2d-11ea-9359-0025904abcb0']]
[ERROR  ] Failed to clone from all attempted sources: ['https://github.com/datalad-datasets/human-connectome-project-openaccess.git/HCP1200/100408', 'https://github.com/datalad-datasets/human-connectome-project-openaccess.git/HCP1200/100408/.git', 'ria+http://store.datalad.org#d3fa72e4-2c2b-11ea-948f-0025904abcb0'] [install(/mnt/datalad/human-connectome-project-openaccess/HCP1200/100408)] 
...

Do I need to set up the AWS credentials manually?
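
One hedged debugging sketch: try cloning a single failing subdataset directly from the RIA URL shown in the error (this requires a DataLad version recent enough to understand ria+http:// sources); AWS credentials are only needed later, at datalad get time, not for cloning:

  datalad clone 'ria+http://store.datalad.org#346a3ae0-2c2e-11ea-a27d-002590496000' HCP1200/100206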
