Version: 1.1.3
Type: BUG
When I interrupt training in unconditional mode and then load the model again to resume, the following error occurs.
Traceback (most recent call last):
  File "D:\Integrated-Design-Diffusion-Model\tools\train.py", line 408, in <module>
    main(args)
  File "D:\Integrated-Design-Diffusion-Model\tools\train.py", line 293, in main
    train(args=args)
  File "D:\Integrated-Design-Diffusion-Model\tools\train.py", line 156, in train
    load_ckpt(ckpt_path=pretrain_path, model=model, device=device, is_pretrain=pretrain,
    load_model_ckpt(model=model, model_ckpt=ckpt_model, is_train=is_train, is_pretrain=is_pretrain,
  File "D:\Integrated-Design-Diffusion-Model\utils\checkpoint.py", line 115, in load_model_ckpt
    model_weights_dict = {k: v for k, v in model_weights_dict.items() if np.shape(model_dict[k]) == np.shape(v)}
  File "D:\Integrated-Design-Diffusion-Model\utils\checkpoint.py", line 115, in <dictcomp>
    model_weights_dict = {k: v for k, v in model_weights_dict.items() if np.shape(model_dict[k]) == np.shape(v)}
KeyError: 'label_emb.weight'
The weights are being loaded as if the model were in conditional mode, so I think there is a bug here. Should we add a conditional parameter to the load_model_ckpt() method in checkpoint.py?
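
For illustration only, here is a minimal sketch of what such a guard could look like. The signature and the added conditional parameter are assumptions on my side, not the project's actual API; only the shape-filtering comprehension at checkpoint.py line 115 is taken from the traceback above.

```python
import numpy as np

def load_model_ckpt(model, model_ckpt, is_train=True, is_pretrain=False, conditional=False):
    # Hypothetical sketch: filter checkpoint weights so that keys missing from
    # the current model (e.g. 'label_emb.weight' when resuming an unconditional
    # run) are skipped instead of raising KeyError.
    model_dict = model.state_dict()
    model_weights_dict = model_ckpt

    # Assumed new flag: drop conditional-only weights when the model is unconditional.
    if not conditional:
        model_weights_dict = {k: v for k, v in model_weights_dict.items()
                              if not k.startswith("label_emb")}

    # Keep only keys that exist in the model and whose shapes match,
    # mirroring the shape check from checkpoint.py line 115 in the traceback.
    model_weights_dict = {k: v for k, v in model_weights_dict.items()
                          if k in model_dict and np.shape(model_dict[k]) == np.shape(v)}

    model_dict.update(model_weights_dict)
    model.load_state_dict(model_dict)
```

Alternatively, the `k in model_dict` check alone would already avoid the KeyError, even without a conditional parameter.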