
mutex-watershed's Issues

How to generate 'labels/gt_segmentation' from 'labels/membranes'?

Hi, thank you for sharing. I opened isbi_train_volume.h5 and found that it contains ['affinities', 'labels', 'raw'], and that ['labels'] in turn contains 'gt_segmentation' and 'membranes'. In a previous issue, I learned that 'gt_segmentation' is a 3D segmentation derived from 'membranes'.

  • I wonder how you derived 'gt_segmentation' from 'membranes'. Is there an algorithm used in the process?

I would be very grateful if you could elaborate on this. Thanks!

(Two pictures omitted: the first shows 'membranes', the second 'gt_segmentation'.)
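For reference, one common way to obtain such a labeling (I don't know whether it matches the authors' actual procedure) is to threshold the membrane map and run connected components slice by slice, e.g. with scipy; the function name and threshold here are mine:

```python
import numpy as np
from scipy import ndimage

def membranes_to_segmentation(membranes, threshold=0.5):
    """Per-slice connected components of the non-membrane mask.

    `membranes` is assumed to be a (z, y, x) volume with values in [0, 1],
    high where a membrane is present. Returns an integer label volume
    with ids unique across slices.
    """
    segmentation = np.zeros(membranes.shape, dtype="uint32")
    offset = 0
    for z in range(membranes.shape[0]):
        foreground = membranes[z] < threshold  # non-membrane pixels
        labels, n_components = ndimage.label(foreground)
        labels[labels > 0] += offset  # keep ids unique across slices
        segmentation[z] = labels
        offset += n_components
    return segmentation
```

A 3D segmentation like 'gt_segmentation' could then additionally require merging the per-slice components across z, which is a separate step not sketched here.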

Adaptation to EM samples with extracellular space

Hi,

I wonder whether there are any ideas around for dealing with 'non-dense' EM datasets. In the ISBI dataset (left), the objects directly touch each other; in my dataset (right), there is quite a lot of extracellular space separating the single objects.

(Comparison figure omitted: ISBI 2012 dataset vs. a dataset with extracellular space.)

In the first (prototype) pipeline I could not see any option to mark a certain label (say, 0) as background. This leads to an MWS segmentation in which the background is split into multiple objects. The only way I can think of would require post-processing to discriminate between foreground and background. Anything (even just a pointer) on how I can deal with such (partly) separated objects would be super helpful ...

Best wishes,
Eric
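One post-processing sketch along those lines, assuming a separate per-pixel background probability is available (e.g. from an extra network channel; that input is hypothetical here, not part of the repo), would be to relabel all MWS fragments that fall mostly into predicted background:

```python
import numpy as np

def merge_background(mws_seg, background_prob, threshold=0.5):
    """Merge MWS fragments that lie mostly in predicted background.

    `background_prob` is a hypothetical per-pixel background probability
    with the same shape as `mws_seg`; fragments whose mean probability
    exceeds `threshold` are relabeled to a single background id 0.
    """
    seg = mws_seg.copy()
    for label in np.unique(seg):
        mask = seg == label
        if background_prob[mask].mean() > threshold:
            seg[mask] = 0
    return seg
```

This only fixes the over-split background after the fact; it does not prevent the MWS itself from cutting through the extracellular space.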

unable to install

When I run python setup.py install, I get:
running install
running bdist_egg
running egg_info
writing mutex_watershed.egg-info/PKG-INFO
writing dependency_links to mutex_watershed.egg-info/dependency_links.txt
writing requirements to mutex_watershed.egg-info/requires.txt
writing top-level names to mutex_watershed.egg-info/top_level.txt
reading manifest file 'mutex_watershed.egg-info/SOURCES.txt'
writing manifest file 'mutex_watershed.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_ext
gcc -pthread -B /opt/anaconda3/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/opt/anaconda3/include/python3.7m -c /tmp/tmph205w3zf.cpp -o tmp/tmph205w3zf.o -std=c++14
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
gcc -pthread -B /opt/anaconda3/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/opt/anaconda3/include/python3.7m -c /tmp/tmpmpj2kmxf.cpp -o tmp/tmpmpj2kmxf.o -fvisibility=hidden
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
building 'mutex_watershed' extension
creating build
creating build/temp.linux-x86_64-3.7
creating build/temp.linux-x86_64-3.7/src
gcc -pthread -B /opt/anaconda3/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/opt/anaconda3/include/python3.7m -I/home/jupyter/.local/include/python3.7m -I/opt/anaconda3/lib/python3.7/site-packages/numpy/core/include -I/opt/anaconda3/include -I/opt/anaconda3/Library/include -I/opt/anaconda3/include/python3.7m -c src/main.cpp -o build/temp.linux-x86_64-3.7/src/main.o -DVERSION_INFO="0.1.1" -std=c++14 -fvisibility=hidden
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
src/main.cpp:7:29: fatal error: xtensor/xmath.hpp: No such file or directory
#include "xtensor/xmath.hpp"
^
compilation terminated.
error: command 'gcc' failed with exit status 1

How can I fix it?
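The fatal error means the compiler cannot find the xtensor headers on its include path. Assuming a conda environment like the one in the log, installing the header-only dependencies from conda-forge usually resolves this (the package names below are the standard conda-forge ones, not taken from this repo's documentation):

```shell
# install the missing header-only dependencies into the active environment
conda install -c conda-forge xtensor xtensor-python pybind11
# then retry the build
python setup.py install
```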

How to create the affinity?

Hi, thank you for sharing. I wonder how you create the ground-truth affinities from the segmentation. I wrote a "segmentation_to_affinity" function on my own and used it to compute affinities for the ISBI ground-truth segmentation (which you provide in this repo). But when I ran the mutex watershed on these affinity graphs, I got some weird results, so I suspect my "segmentation_to_affinity" function differs from yours. Below are the results of using the predicted affinities you provided and of using my own affinities, respectively. Any comments would be helpful. Thanks!

(Screenshots omitted: the result with the provided predicted affinities, and the result with my own affinities.)
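For comparison, here is a minimal sketch of one common convention (affinity 1 where both pixels of a pair lie in the same segment, 0 otherwise). The sign convention is an assumption: some pipelines use the inverted 1 - aff convention instead, and a mismatch there would produce exactly this kind of broken result.

```python
import numpy as np

def segmentation_to_affinities(seg, offsets):
    """Ground-truth affinities: affs[c, p] = 1 if seg[p] == seg[p + offsets[c]],
    0 otherwise. Pairs reaching over the volume border are set to 0."""
    affs = np.zeros((len(offsets),) + seg.shape, dtype="float32")
    axes = tuple(range(seg.ndim))
    for c, offset in enumerate(offsets):
        # shift seg so that position p holds the value at p + offset
        shifted = np.roll(seg, tuple(-o for o in offset), axis=axes)
        affs[c] = (seg == shifted)
        # invalidate pairs whose partner wrapped around the border
        for axis, o in enumerate(offset):
            if o == 0:
                continue
            sl = [slice(None)] * seg.ndim
            sl[axis] = slice(-o, None) if o > 0 else slice(None, -o)
            affs[(c,) + tuple(sl)] = 0.0
    return affs
```

If your function matches this but the result is still broken, checking whether your attractive and repulsive channels follow the same order as the offsets list would be the next thing I'd try.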

How to set the stride in compute_mws_segmentation

Inside the code of experiments/isbi/isbi_experiments.py, there is a setting for the stride, which is set to [1, 10, 10]. I am wondering what this parameter controls, and how I should modify it when applying the method to different datasets. Thanks
    if mws:
        strides = np.array([1., 10., 10.])
        print("Computing mutex watershed segmentation ...")
        mws_seg, t_mws = mws_result(affs, offsets, strides, randomize_bounds=False)
        print("... finished in %f s" % t_mws)
        writeHDF5(mws_seg, os.path.join(result_folder, 'mws.h5'),
                  'data', compression='gzip')
        segmentations.append(mws)
        labels.append('MWS')
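As far as I understand the implementation, strides subsample the long-range (mutex) edges to reduce runtime, while the direct-neighbor attractive edges stay dense; [1, 10, 10] keeps every long-range edge along z but only every 10th along y and x. A small sketch of which voxels keep their long-range edges under a given stride (the function name is mine):

```python
import numpy as np

def stride_mask(shape, strides):
    """Boolean mask of voxels whose long-range edges are kept when
    subsampling with the given per-axis strides, e.g. strides=(1, 10, 10)
    keeps all z positions but only every 10th y and x position."""
    coords = np.meshgrid(*[np.arange(s) for s in shape], indexing="ij")
    mask = np.ones(shape, dtype=bool)
    for axis, stride in enumerate(strides):
        mask &= (coords[axis] % stride) == 0
    return mask
```

Under that reading, larger strides trade a little accuracy for speed, so for a new dataset the stride would mainly depend on how much runtime you can afford and on the anisotropy of your data.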

How did you generate the data in isbi_train_volume.h5 and isbi_test_volume.h5?

Thanks for sharing.
I tried to open isbi_train_volume.h5: it contains ['affinities', 'labels', 'raw'], and f['labels'] in turn contains 'gt_segmentation' and 'membranes'. Are those your pre-trained results, or something else? And does it seem that the mutex watershed alone cannot get good segmentation results?

How to substitute the input of mws with superpixels obtained by watershed?

Hi, thanks for sharing!
I'm wondering how I can use a watershed to preprocess the 17-channel affinity map, as mentioned in another of your excellent works, PlantSeg. I noticed that in PlantSeg a single-channel affinity is employed, without long-range information. More specifically, is there a method in nifty (which unfortunately has no documentation) or in plantseg that implements this?
Looking forward to your reply!
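Not the exact PlantSeg procedure, but a generic sketch of obtaining superpixels from a single boundary channel with a seeded watershed, using scikit-image (the threshold value is arbitrary):

```python
import numpy as np
from scipy import ndimage
from skimage.segmentation import watershed

def boundary_to_superpixels(boundaries, threshold=0.3):
    """Superpixels from a single boundary-probability channel.

    Seeds are the connected components of low-boundary regions; a seeded
    watershed on the boundary map then grows them into superpixels.
    """
    seeds, _ = ndimage.label(boundaries < threshold)
    return watershed(boundaries, markers=seeds)
```

Running the mutex watershed on a region graph built from such superpixels (instead of on single pixels) would then be a separate step, presumably via the graph data structures in nifty.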
