axiezai / pipetography
Nipype and tractography!
Home Page: https://axiezai.github.io/pipetography/
License: Apache License 2.0
Change the neurodocker link for mrtrix3.
Insert the following into each notebook when recreating docs:

```python
import warnings
warnings.filterwarnings('ignore')
```

This will clean up all the meaningless warning messages in the HTML output.
If debug mode is on, save intermediate files; otherwise, delete them.
Because whole-brain seeding may have advantages, and GMWMI generation may fail when data/image problems prevent finding particular GM regions, make GMWMI generation False by default. Whole-brain masks will then be used for tractography seeding.
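As a minimal sketch of what that default could look like (the `gmwmi` parameter and function below are illustrative, not the actual pipetography API):

```python
# Hypothetical sketch: default to whole-brain seeding instead of GMWMI.
# Parameter and file names are illustrative only.
def select_seed_image(gmwmi=False, gmwmi_image="gmwmi.mif",
                      brain_mask="brain_mask.mif"):
    """Return the seeding image for tckgen: GMWMI only when requested."""
    return gmwmi_image if gmwmi else brain_mask

# With the proposed default, whole-brain seeding is used:
print(select_seed_image())            # brain_mask.mif
print(select_seed_image(gmwmi=True))  # gmwmi.mif
```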
Install python and pip without conda for faster build time.
This produces an asymmetric segmentation, resulting in asymmetric connectomes across hemispheres.
pipetography/pipetography/nodes.py
Lines 550 to 572 in 1cfff62
interpolation defaults to Linear and is active even if args has an interpolation input; interpolation is not hard-coded to predefined cases. So interpolation = genericLabel should work. Test it out by printing the cmdline first.
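A minimal sketch (not the real nipype interface) of assembling an antsApplyTransforms-style command string, showing where the interpolation flag lands; in the actual pipeline you would print the node's `interface.cmdline` instead. Note ANTs spells the option `GenericLabel`:

```python
# Sketch only: mimic how the command line is assembled so we can
# inspect where the interpolation flag ends up. In nipype you would
# print node.interface.cmdline rather than build the string by hand.
def build_cmdline(input_image, reference, interpolation="Linear"):
    args = [
        "antsApplyTransforms",
        "-i", input_image,
        "-r", reference,
        "-n", interpolation,  # not restricted to predefined cases
    ]
    return " ".join(args)

cmd = build_cmdline("atlas.nii.gz", "t1.nii.gz", interpolation="GenericLabel")
print(cmd)
```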
All commands should have adequate inputs now, but some are missing full functionality
Inside Singularity we have:
To run interactively or as a job execution, you will need a few flags:
-e -c for a clean environment upon container start
-B to bind your data & code directories.
Example:

```
singularity shell -e -c -B <BIDS_DIR>:<SINGULARITY_BIDS_DIR> {Path to singularity .sif}
```
We should make sure this works and add a new section outside of the install instructions. Right now, activating the tracts environment is required to run any code; this shouldn't be necessary in the container.
Docstring tests and sample pipeline testing?
I was using pipetography.connectome, which crashed on the SIFT2 node. The following is the error message.
SLURM log file:
```
220124-01:02:20,949 nipype.workflow INFO:
[Node] Finished "LinearRegistration", elapsed time 2063.624523s.
220124-01:17:09,232 nipype.workflow INFO:
[Node] Finished "NonLinearRegistration", elapsed time 1052.248168s.
Traceback (most recent call last):
File "/test_BIDS_dwi2run/code/pipetography/postprocess_bids.py", line 21, in <module>
main()
File "/test_BIDS_dwi2run/code/pipetography/postprocess_bids.py", line 18, in main
connectomes.run_pipeline(parallel=4)
File "/opt/pipetography/pipetography/connectomes.py", line 128, in run_pipeline
self.workflow.run('MultiProc', plugin_args = {'n_procs': parallel})
File "/opt/nipype/nipype/pipeline/engine/workflows.py", line 638, in run
runner.run(execgraph, updatehash=updatehash, config=self.config)
File "/opt/nipype/nipype/pipeline/plugins/base.py", line 166, in run
self._clean_queue(jobid, graph, result=result)
File "/opt/nipype/nipype/pipeline/plugins/base.py", line 244, in _clean_queue
raise RuntimeError("".join(result["traceback"]))
RuntimeError: Traceback (most recent call last):
File "/opt/nipype/nipype/pipeline/plugins/multiproc.py", line 67, in run_node
result["result"] = node.run(updatehash=updatehash)
File "/opt/nipype/nipype/pipeline/engine/nodes.py", line 524, in run
result = self._run_interface(execute=True)
File "/opt/nipype/nipype/pipeline/engine/nodes.py", line 642, in _run_interface
return self._run_command(execute)
File "/opt/nipype/nipype/pipeline/engine/nodes.py", line 751, in _run_command
f"Exception raised while executing Node {self.name}.\n\n{result.runtime.traceback}"
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node SIFT2.
RuntimeError: subprocess exited with code 1.
```
And the crash file message:
```
Node inputs:
act =
args =
environ = {}
fd_scale_gm =
force =
in_file =
in_fod =
nthreads =
out_file = sift2.txt
proc_mask =
Traceback:
Traceback (most recent call last):
File "/opt/nipype/nipype/pipeline/plugins/multiproc.py", line 67, in run_node
result["result"] = node.run(updatehash=updatehash)
File "/opt/nipype/nipype/pipeline/engine/nodes.py", line 524, in run
result = self._run_interface(execute=True)
File "/opt/nipype/nipype/pipeline/engine/nodes.py", line 642, in _run_interface
return self._run_command(execute)
File "/opt/nipype/nipype/pipeline/engine/nodes.py", line 751, in _run_command
f"Exception raised while executing Node {self.name}.\n\n{result.runtime.traceback}"
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node SIFT2.
RuntimeError: subprocess exited with code 1.
```
Then I took the command.txt in the SIFT2 working directory and ran the command script; the following error occurred:
```
Singularity> tcksift2 /test_BIDS_dwi2run/derivatives/streamlines/sub-UCSF002/ses-01/sub-UCSF002_ses-01_gmwmi2wm.tck /test_BIDS_dwi2run/code/pipetography/workingdir/connectomes/_session_id_01_subject_id_UCSF002/dwiFOD/wm.mif sift2.txt
tcksift2: [100%] Creating homogeneous processing mask
tcksift2: [100%] segmenting FODs
tcksift2: [ERROR] invalid first line for key/value file "/test_BIDS_dwi2run/derivatives/streamlines/sub-UCSF002/ses-01/sub-UCSF002_ses-01_gmwmi2wm.tck" (expected "mrtrix tracks")
```
The file sub-UCSF002_ses-01_gmwmi2wm.tck exists, but the command still reported this error. Note that the same error occurred across different datasets.
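A valid MRtrix .tck file begins with the line `mrtrix tracks`, which is exactly what tcksift2 complains is missing. A quick standard-library check (a sketch, not part of pipetography) can confirm whether the file was fully written or was truncated/corrupted by an earlier step:

```python
import os
import tempfile

def is_valid_tck(path):
    """Check that a file starts with the 'mrtrix tracks' magic line."""
    with open(path, "rb") as f:
        return f.readline().strip() == b"mrtrix tracks"

# Demo on a throwaway file with a minimal valid header:
demo = tempfile.NamedTemporaryFile(suffix=".tck", delete=False)
demo.write(b"mrtrix tracks\ncount: 0\nEND\n")
demo.close()
ok = is_valid_tck(demo.name)
print(ok)  # True
os.remove(demo.name)
```

A False result here would point at the tckgen step (or an interrupted write) rather than at tcksift2 itself.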
```
eddy_openmp: error while loading shared libraries: libopenblas.so.0: cannot open shared object file: No such file or directory
dwifslpreproc:
dwifslpreproc: [ERROR] For debugging, inspect contents of scratch directory: /dataset/derivatives/pipetography/_session_id_01_subject_id_20/dwifslpreproc/dwifslpreproc-tmp-N0KGM9/
Return code: 127
```
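Exit code 127 conventionally means the shell or dynamic loader could not find something it needed; here it is the libopenblas shared library rather than the eddy_openmp binary itself. A small standard-library sketch (not part of the pipeline) for pre-flight checking that an executable at least resolves on PATH:

```python
import shutil

def check_on_path(binary):
    """Return the resolved path of `binary`, or None if it is not on PATH.

    A missing binary is one common cause of shell exit code 127; a
    missing shared library (as with libopenblas.so.0 here) is another,
    and needs the library installed or LD_LIBRARY_PATH adjusted.
    """
    return shutil.which(binary)

# A name that certainly does not exist resolves to None:
print(check_on_path("definitely_not_a_real_binary_xyz"))  # None
```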
List the pip-installable libraries in settings.ini.
Missing filtering for the full BIDS layout, like in preprocessing.
```
/opt/fsl-6.0.3/etc/fslconf/fslpython_install.sh: line 206: 1436 Killed
FSLDIR=$fsl_dir "${miniconda_bin_dir}/conda" env create -f "${script_dir}/fslpython_environment.yml" 2>> "${miniconda_install_log}"
1437 Done | ${script_dir}/progress.sh 133 ${quiet} >> "${miniconda_install_log}"
Failed to create FSL Python environment - see /tmp/fslpythonCHpj/fslpython_miniconda_installer.log for details
```
Something to replace nodes, like ...
pipetography.preprocessing.anat
pipetography.preprocessing.dwi
Maybe...
Change the default freesurfer recon-all to off, and use a 5ttgen white matter mask option for streamline generation. But the whole-brain mask should apply as well.
pipetography/pipetography/connectomes.py
Lines 91 to 94 in ed1f7fb
Missing list.
Change the base_dir default to a destination separate from the datasink destination, so the final outputs and intermediate files are kept apart.
```python
self.workflow = Workflow(
    name=wf_name,
    base_dir=os.path.join(self.bids_dir, "derivatives"),
)
```
pipetography/pipetography/pipeline.py
Lines 826 to 838 in ed1f7fb
```python
if self.debug_mode:
    self.workflow.config["execution"] = {
        "use_relative_paths": "True",
        "hash_method": "content",
        "stop_on_first_crash": "True",
    }
else:
    self.workflow.config["execution"] = {
        "use_relative_paths": "True",
        "hash_method": "content",
        "stop_on_first_crash": "True",
        "remove_node_directories": "True",
    }
```
Make this change for both pipeline and connectomes.
Since atlas registration occurs without changing labels, the rounding step is now redundant.
Change hippocampal subfield inputs from 6.0 to 7.1 format, or disable completely.
Possible improvements:
Create substitutions for datasink so the output paths are consistent with BIDS.
Use Docker because we need dependencies that can't be pip/brew installed.
Update index.py/html with Docker Hub pull, Singularity build, and job execution instructions. Also update the markdown formatting so the GitHub page has a consistent structure.
Either as user input or a separate directory that's not part of the BIDS root.
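Datasink-style substitutions can be sketched as ordered (find, replace) pairs applied to each output path; the pairs below are illustrative examples, not pipetography's actual mapping:

```python
# Sketch of datasink-style substitutions: ordered (find, replace)
# pairs applied to each output path so Nipype's working-directory
# naming is rewritten into BIDS-like entities. Example pairs only.
def apply_substitutions(path, substitutions):
    for find, replace in substitutions:
        path = path.replace(find, replace)
    return path

subs = [
    ("_subject_id_", "sub-"),
    ("_session_id_", "ses-"),
]
print(apply_substitutions("derivatives/_subject_id_01/_session_id_01/dwi.mif", subs))
# derivatives/sub-01/ses-01/dwi.mif
```

In nipype, the equivalent pairs would be assigned to the DataSink node's `substitutions` input.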