
Comments (4)

Pseudomanifold commented on May 18, 2024

Dear Zexian,

Thanks for reaching out and for your kind words!

Given that my custom dataset is a 2D dataset with a non-fixed number of points (i.e., [n, d] where n is not fixed), I use torch_geometric to handle my data and batching. I follow the example in example/classification.py to build my forward function; however, the VietorisRipsComplex returns an empty tensor after the make_tensor function. Are empty tensors expected?

I'm not sure whether the VietorisRipsComplex is the right way to address this issue, since you are treating the input as kind of a point cloud (this is different from what we did in the TOGL paper). That being said, the complex should not return an empty tensor. Can you see what happens with a different dim parameter?
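A quick, untested sketch of what I mean; the attribute names (`dimension`, `diagram`) follow the PersistenceInformation interface as far as I recall, so treat this as an illustration rather than verified code:

```python
import torch

from torch_topological.nn import VietorisRipsComplex

points = torch.rand(64, 2)  # a single point cloud of shape (n, d)

# Check whether the empty output persists for different `dim` values.
for dim in (0, 1, 2):
    vr = VietorisRipsComplex(dim=dim)
    for pi in vr(points):
        # Each PersistenceInformation entry carries its homology dimension
        # and a diagram tensor of (birth, death) pairs.
        print(f"dim={dim}: H_{pi.dimension} diagram shape {tuple(pi.diagram.shape)}")
```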

The second question, which may not be relevant: how are the PH diagrams calculated in pytorch-topological guaranteed to be differentiable compared to other backends, such as giotto-tda? What are the advantages that pytorch-topological provides in terms of differentiability and integration into deep NN layers?

So pytorch-topological wraps around some persistent homology calculations; the main idea behind differentiability is the one we mentioned in our paper on topological autoencoders. That is, we take the generators of the persistent homology features and link them back to the data points. I would say the main advantage of the package is that we aim to provide documentation and examples for everything, providing a simple way to try out topological ideas in your own work. Always open for feedback here—we want to get better!
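As a rough, untested illustration of that idea, here is a sketch that uses total persistence as a loss and backpropagates it to the input coordinates; it assumes the diagram tensors returned by VietorisRipsComplex stay connected to the input in the autograd graph, which is precisely the linking of generators to data points described above:

```python
import torch

from torch_topological.nn import VietorisRipsComplex

points = torch.rand(100, 2, requires_grad=True)

vr = VietorisRipsComplex(dim=1)
pers_info = vr(points)

# Total persistence as a simple differentiable summary statistic; only
# finite pairs are used, since the essential connected component never dies.
loss = torch.tensor(0.0)
for pi in pers_info:
    diagram = pi.diagram
    finite = torch.isfinite(diagram).all(dim=-1)
    loss = loss + (diagram[finite, 1] - diagram[finite, 0]).sum()

loss.backward()
print(points.grad.shape)  # gradients flow back to the input coordinates
```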

Hope that helps :-)


zexhuang commented on May 18, 2024

Dear Bastian,

Thank you for replying. I am able to generate concrete outputs via the VietorisRipsComplex class by passing my custom dataset as a list of point clouds (i.e., tensors of shapes [(n1, d1), (n2, d2), ...]) to the forward function. This solves my problem.

However, when passing the samples in the batch one by one with a for loop, the forward function of VietorisRipsComplex gives empty results (i.e., shape (1, 0, 3), where 1 is the batch size and 0 is the number of birth-death tuples, hence empty).
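For reference, here is roughly what the two calling patterns look like on my side; the shapes are made up for illustration, and my guess about the cause of the empty output is not confirmed:

```python
import torch

from torch_topological.nn import VietorisRipsComplex

vr = VietorisRipsComplex(dim=0)

# Variable-sized point clouds, as in my dataset (shapes made up here).
clouds = [torch.rand(32, 2), torch.rand(57, 2), torch.rand(19, 2)]

# Passing the whole list at once works for me: one result per point cloud.
batched = vr(clouds)

# Looping over the batch sample by sample, keeping each input as (n_i, d).
# My unconfirmed guess is that a stray leading batch dimension of size 1
# was the reason for the empty (1, 0, 3) output mentioned above.
looped = [vr(cloud) for cloud in clouds]
```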

Anyway, I will close this issue soon. Thank you very much for the help!

Also, in your GFL paper, under Section 4.1, I quote "Specifically, the learnable vertex filter function, generically introduced in Definition 1, can be easily implemented by a neural network."

Here, Definition 1 refers to the formal definition of a learnable filtration function parameterised by $\theta$.

If I understand this correctly, can I say that, from the perspective of a message-passing GNN, the persistent homology pipeline serves as a topology-aware readout function for graph classification tasks, and that, from the perspective of persistent homology, parameterised NNs (such as a GNN or an MLP) become a learnable filtration function (as long as they are one-to-one mappings) for the subsequent persistence diagram or barcode?

Thanks,
Zexian.


Pseudomanifold commented on May 18, 2024

I am able to generate concrete outputs via the VietorisRipsComplex class by passing my custom dataset as a list of point clouds (i.e., tensors of shapes [(n1, d1), (n2, d2), ...]) to the forward function. This solves my problem.

Glad you found a workaround! But I think the original code that you used should also work. Could you try to isolate the problem and let me know what the shapes of the tensors involved in the operation are? Maybe it's a use case that I did not anticipate.

If I understand this correctly, can I say that, from the perspective of a message-passing GNN, the persistent homology pipeline serves as a topology-aware readout function for graph classification tasks, and that, from the perspective of persistent homology, parameterised NNs (such as a GNN or an MLP) become a learnable filtration function (as long as they are one-to-one mappings) for the subsequent persistence diagram or barcode?

Yes, that's a great summary! The only deviation from this is that in TOGL, we don't stop at the READOUT stage but feed more information to subsequent layers of the graph neural network.
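To make this reading concrete, here is a small, self-contained sketch of mine: a GCNConv layer plus a linear head play the role of the learnable vertex filtration from Definition 1, and a plain union-find computes the 0-dimensional persistence pairs, so everything stays differentiable with respect to the filtration values. This is illustrative code only, not the implementation from the GFL or TOGL papers, and not part of pytorch-topological's API:

```python
import torch
import torch.nn as nn

from torch_geometric.nn import GCNConv


def zero_dim_graph_persistence(f, edge_index):
    """0-dimensional persistence of a vertex-filtered graph (lower-star
    filtration), paired via the elder rule; essential features are omitted.
    The returned (m, 2) tensor of (birth, death) values stays differentiable
    with respect to the filtration values ``f``."""
    n = f.shape[0]
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # An edge (u, v) enters the filtration at max(f[u], f[v]).
    u, v = edge_index
    edge_val, edge_argmax = torch.max(torch.stack([f[u], f[v]]), dim=0)

    pairs = []  # (birth vertex index, death vertex index)
    for e in torch.argsort(edge_val).tolist():
        a, b = find(u[e].item()), find(v[e].item())
        if a == b:
            continue
        # Elder rule: the component with the larger birth value dies here.
        young, old = (a, b) if f[a] > f[b] else (b, a)
        death_vertex = (u[e] if edge_argmax[e] == 0 else v[e]).item()
        pairs.append((young, death_vertex))
        parent[young] = old

    if not pairs:
        return f.new_zeros(0, 2)
    idx = torch.as_tensor(pairs, device=f.device)
    return torch.stack([f[idx[:, 0]], f[idx[:, 1]]], dim=1)


class LearnableFiltrationReadout(nn.Module):
    """Sketch: a GNN provides a learnable vertex filtration, and persistent
    homology acts as a topology-aware readout for graph classification."""

    def __init__(self, in_channels, hidden_channels):
        super().__init__()
        self.conv = GCNConv(in_channels, hidden_channels)
        self.filtration = nn.Linear(hidden_channels, 1)

    def forward(self, x, edge_index):
        h = torch.relu(self.conv(x, edge_index))
        f = self.filtration(h).squeeze(-1)  # one filtration value per vertex
        # The resulting diagram would be vectorised and fed to a classifier;
        # in TOGL, it is instead passed on to subsequent GNN layers.
        return zero_dim_graph_persistence(f, edge_index)
```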

Hope that helps!


Pseudomanifold commented on May 18, 2024

Closing this due to lack of activity. Please reopen if you have more information about the involved shapes of tensors. I am very interested in tracking this problem.

