I saw someone in the PyG Slack channel recommending this repository. It looks quite attractive, so I decided to give it a try. I installed all the dependencies and tried to run `python run.py trainer.gpus=0`, but the following error occurred. Would you mind taking a look? Thanks!
```
[2022-05-08 16:44:10,655][src.train][INFO] - Instantiating datamodule <src.datamodules.ogbg_molhiv_datamodule.OGBGMolhiv>
[2022-05-08 16:44:10,949][numexpr.utils][INFO] - Note: detected 128 virtual cores but NumExpr set to maximum of 64, check "NUMEXPR_MAX_THREADS" environment variable.
[2022-05-08 16:44:10,949][numexpr.utils][INFO] - Note: NumExpr detected 128 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 8.
[2022-05-08 16:44:10,949][numexpr.utils][INFO] - NumExpr defaulting to 8 threads.
[2022-05-08 16:44:15,456][src.train][INFO] - Instantiating model <src.models.ogbg_molhiv_model.OGBGMolhivModel>
[2022-05-08 16:44:16,045][torch.distributed.nn.jit.instantiator][INFO] - Created a temporary directory at /tmp/tmp9_3wucw6
[2022-05-08 16:44:16,046][torch.distributed.nn.jit.instantiator][INFO] - Writing /tmp/tmp9_3wucw6/_remote_module_non_sriptable.py
[2022-05-08 16:44:16,073][src.train][INFO] - Instantiating trainer <pytorch_lightning.Trainer>
[2022-05-08 16:44:16,080][pytorch_lightning.utilities.rank_zero][INFO] - GPU available: True, used: True
[2022-05-08 16:44:16,081][pytorch_lightning.utilities.rank_zero][INFO] - TPU available: False, using: 0 TPU cores
[2022-05-08 16:44:16,081][pytorch_lightning.utilities.rank_zero][INFO] - IPU available: False, using: 0 IPUs
[2022-05-08 16:44:16,081][pytorch_lightning.utilities.rank_zero][INFO] - HPU available: False, using: 0 HPUs
[2022-05-08 16:44:16,081][src.train][INFO] - Logging hyperparameters!
Traceback (most recent call last):
  File "run.py", line 37, in main
    return train(config)
  File "/home/chen/fromosu/graph_classification/src/train.py", line 66, in train
    utils.log_hyperparameters(
  File "/home/chen/anaconda3/envs/graph_clf/lib/python3.8/site-packages/pytorch_lightning/utilities/rank_zero.py", line 32, in wrapped_fn
    return fn(*args, **kwargs)
  File "/home/chen/fromosu/graph_classification/src/utils/utils.py", line 153, in log_hyperparameters
    trainer.logger.log_hyperparams(hparams)
AttributeError: 'NoneType' object has no attribute 'log_hyperparams'
```
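From the traceback, `trainer.logger` is `None` — which usually means no logger was configured for this run — yet `log_hyperparameters` in `src/utils/utils.py` calls `trainer.logger.log_hyperparams(hparams)` unconditionally. A minimal defensive sketch of the idea (the function name matches the traceback, but the real helper in this repo surely takes more arguments, so this is illustrative only):

```python
# Hypothetical sketch, not the repo's actual helper: skip hyperparameter
# logging when the Trainer was built without a logger.
def log_hyperparameters(trainer, hparams):
    # trainer.logger is None when no logger is configured
    # (e.g. the run was launched with logging disabled).
    if trainer.logger is None:
        return
    trainer.logger.log_hyperparams(hparams)
```

Alternatively, the crash might be avoided by enabling a logger for the run — the exact override depends on how this repo's Hydra configs are laid out.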