Comments (5)

JunweiLiang commented on May 26, 2024

Hi.
Yes. This repo uses a preprocessing protocol similar to the Next model's. For Social GAN, you can use the "obs_traj" and "pred_traj" arrays in the preprocessed files (see here), since Social GAN only needs trajectory inputs. Next-prediction needs more features beyond the scene semantic features; follow its preprocessing protocol here. If you keep the training/testing split the same, you can compare them.
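For reference, pulling just those two arrays out of a preprocessed file takes a few lines of NumPy. This is a minimal sketch: the key names "obs_traj" and "pred_traj" come from this thread, but the array shapes in the comments are assumptions, not confirmed by the repo.

```python
import numpy as np

def load_traj_arrays(npz_path):
    """Load only the trajectory inputs Social GAN needs from a preprocessed .npz file."""
    data = np.load(npz_path, allow_pickle=True)
    obs_traj = data["obs_traj"]    # observed trajectory points, e.g. [N, obs_len, 2] (assumed shape)
    pred_traj = data["pred_traj"]  # ground-truth future points, e.g. [N, pred_len, 2] (assumed shape)
    return obs_traj, pred_traj
```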

Sure! You can try adding more features to Multiverse and see whether it helps, but it will take more GPU memory and be harder to train.


Jacobieee commented on May 26, 2024

@JunweiLiang Thanks for your reply!
Just to make sure I understand it right: in preprocessing, the ActEV dataset and other features are processed to generate three .npz files in the actev_preprocess directory. Since the Next model is trained on the same dataset, I only need to load the pre-trained model on the Forking Paths dataset by running the multifuture inference here. Is that right?

And for Social GAN, I need to train it on the ActEV dataset first, then test it on "next_x_v1_dataset_prepared_data/obs_data/traj_2.5fps", which contains only the testing set, passing the trajectory inputs into the model to run the evaluation. Does it also need to be run with the multifuture inference?


JunweiLiang commented on May 26, 2024

To run Next-prediction on the Multiverse dataset, you need to follow its preprocessing protocol to get all the needed features into the testing .npz files. For Social GAN, yes, you only need to run it on the traj_2.5fps/ files; they are compatible with the SocialGAN repo.
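As a sanity check on those files, a small parser can be handy. Note this assumes the SGAN-style plain-text layout of one observation per line with tab-delimited frame_id, ped_id, x, y; that layout is an assumption based on the SocialGAN repo's bundled datasets, not something stated in this thread.

```python
def parse_traj_line(line, delim="\t"):
    """Parse one line of an SGAN-style trajectory file into (frame, ped_id, x, y)."""
    frame, ped, x, y = line.strip().split(delim)
    # frame and pedestrian ids are often stored as floats (e.g. "840.0"); normalize to int
    return int(float(frame)), int(float(ped)), float(x), float(y)
```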


Jacobieee commented on May 26, 2024

@JunweiLiang Hi, sorry for the delayed follow-up. I ran into an issue while testing the SGAN model: it saves its checkpoints as .pt files, which are incompatible with multifuture_inference.py. Could you show me how you evaluated it in your own experiments?
And we don't need scene features like --scene_feat_path, and should add --greedy instead, right?
Thanks a lot!


JunweiLiang commented on May 26, 2024

For the SGAN experiments, you should use their script evaluate_model.py. I think you need to modify it a bit to add a dataset_path argument so you can run it like this:

python sgan/scripts/evaluate_model.py --dataset_path traj_2.5fps/test/ --model_path sgan-20V-20/checkpoint_with_model.pt 
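One way to wire in that flag is sketched below. This is only an illustration: evaluate_model.py's actual argument setup may differ, the --model_path/--num_samples arguments shown are assumed, and resolve_dataset_path is a hypothetical helper mirroring the fallback-to-default behavior, not code from the SGAN repo.

```python
import argparse

# Hypothetical excerpt of sgan/scripts/evaluate_model.py's argument setup.
parser = argparse.ArgumentParser()
parser.add_argument("--model_path", type=str)
parser.add_argument("--num_samples", default=20, type=int)
# Added: point evaluation at an arbitrary directory of trajectory files
# instead of a dataset bundled with the repo.
parser.add_argument("--dataset_path", default=None, type=str)

def resolve_dataset_path(args, default_path="datasets/eth/test"):
    """Prefer the new --dataset_path when given; otherwise fall back to a repo default."""
    return args.dataset_path if args.dataset_path is not None else default_path
```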

