
Comments (3)

fedshyvana commented on August 17, 2024

@tmabraham Besides the hyperparameters you mentioned, the GitHub repo was updated just a couple of days ago, which gives you more options for tuning. For example, you can now try a single-attention-branch version of CLAM (--model_type clam_sb), which I recommend since it should run slightly faster while giving favorable performance; B can be set via --B, etc. Similarly, I have added the option to disable instance-level clustering via --no_inst_cluster. See the options in main.py.
Another thing you could try is adjusting the model architecture. For the problems I worked on, I had limited data for training, so the model is fairly lightweight: a couple of layers and under 1M parameters (model_clam.py). You could try stacking more layers to enlarge both the model itself and the attention module.
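For reference, the attention-pooling idea behind a single-attention-branch model can be sketched as follows. This is a minimal NumPy sketch, not the repo's actual PyTorch code; all dimensions and weight scales here are illustrative:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, V, w):
    """Attention-based MIL pooling: score each instance (patch), then
    take a weighted average of instance embeddings as the slide-level
    representation."""
    A = softmax(np.tanh(H @ V) @ w)  # (N,) attention weights over patches
    return A @ H, A                  # (D,) slide embedding, plus weights

rng = np.random.default_rng(0)
N, D, L = 100, 512, 256              # instances, embed dim, attention dim
H = rng.standard_normal((N, D))      # patch embeddings for one slide
V = rng.standard_normal((D, L)) * 0.05
w = rng.standard_normal(L) * 0.05
z, A = attention_pool(H, V, w)       # z: (512,) slide embedding
```

Enlarging the attention module, as suggested above, would amount to replacing the single tanh projection with a deeper scoring network.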
Also, this repo is specifically for weakly-supervised problems where only slide-level labels are provided. If your dataset has additional labels at the pixel/ROI level, you might want to consider whether other methods/modifications could take advantage of those labels for better performance.

@jjhbw Yes, in principle you can use different pre-trained models in extract_features.py with very slight changes to the code. Just make sure that if your embedding dimension changes, you also make the corresponding changes to the model.
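To illustrate the last point: if the new backbone produces features of a different dimension, the downstream model's first fully connected layer must be resized to match. A minimal sketch, where the backbone names and embedding sizes are illustrative (verify them against the model you actually load in extract_features.py):

```python
import numpy as np

# Illustrative embedding dimensions for a few hypothetical backbones;
# check the real output size of whichever model you use.
EMBED_DIM = {"resnet50_trunc": 1024, "resnet18": 512, "vit_b_16": 768}

def first_fc(backbone, hidden=512, seed=0):
    """Build the model's first FC layer sized to the backbone's features."""
    in_dim = EMBED_DIM[backbone]
    rng = np.random.default_rng(seed)
    return rng.standard_normal((in_dim, hidden)) * np.sqrt(2.0 / in_dim)

feats = np.zeros((100, EMBED_DIM["vit_b_16"]))  # bag of 100 patch features
h = feats @ first_fc("vit_b_16")                # shapes agree: (100, 512)
```

If the input dimension and the layer size disagree, the matrix product fails immediately, which is the mismatch the comment warns about.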

from clam.

jjhbw commented on August 17, 2024

I don't want to derail this discussion, as it seems focused on training hyperparameters, but I'm also eager to hear from people using different pretrained models for patch-level feature extraction (extract_features.py).

Intuitively, I would speculate that the ImageNet embeddings may leave some descriptive power on the table, as the ImageNet training images are very different from tissue slide images.


tmabraham commented on August 17, 2024

@fedshyvana Is the single-attention branch version from the paper? Or is this a new version of the model?

I have an older version of the repository that I have already modified for my custom task; is there a way I can go through the code and disable the mutual exclusivity assumption?

Also, I only have slide-level labels on the test set for the task I am looking at right now.

