msracver / Relation-Networks-for-Object-Detection
Relation Networks for Object Detection
License: MIT License
My batch size (first_n) is 64. This brings a severe data imbalance problem.
Is this normal, or did I make a mistake?
I set NUM_CLASSES: 189 in the YAML, used https://raw.githubusercontent.com/withyou1771/Detectron_FocalLoss/master/tools/xml_to_json.py to convert my VOC-style data to COCO-style JSON, and then ran:
python experiments/relation_rcnn/rcnn_end2end_train_test.py --cfg experiments/relation_rcnn/cfgs/epoch8.yaml
Here is the error info:
('lr', 0.0005, 'lr_epoch_diff', [5.33], 'lr_iters', [79950])
Epoch[0] Batch [100] Speed: 3.82 samples/sec Train-RPNAcc=0.964979, RPNLogLoss=0.105434, RPNL1Loss=0.111539, RCNNAcc=0.837523, RCNNLogLoss=1.683923, RCNNL1Loss=0.280678, NMSLoss_pos=0.052915, NMSLoss_neg=0.014347, NMSAcc_pos=0.000000, NMSAcc_neg=1.000000,
Epoch[0] Batch [200] Speed: 3.83 samples/sec Train-RPNAcc=0.971723, RPNLogLoss=0.088618, RPNL1Loss=0.084358, RCNNAcc=0.865341, RCNNLogLoss=1.241336, RCNNL1Loss=0.223327, NMSLoss_pos=0.056996, NMSLoss_neg=0.010787, NMSAcc_pos=0.000000, NMSAcc_neg=1.000000,
Error in CustomOp.forward: Traceback (most recent call last):
File "/home/hri/anaconda3/envs/relations/lib/python2.7/site-packages/mxnet/operator.py", line 987, in forward_entry
aux=tensors[4])
File "experiments/relation_rcnn/../../relation_rcnn/operator_py/box_annotator_ohem.py", line 36, in forward
per_roi_loss_cls = per_roi_loss_cls[np.arange(per_roi_loss_cls.shape[0], dtype='int'), labels.astype('int')]
IndexError: index 189 is out of bounds for axis 1 with size 189
terminate called after throwing an instance of 'dmlc::Error'
what(): [21:01:01] src/operator/custom/custom.cc:347: Check failed: reinterpret_cast<CustomOpFBFunc>( params.info->callbacks[kCustomOpForward])( ptrs.size(), const_cast<void**>(ptrs.data()), const_cast<int*>(tags.data()), reinterpret_cast<const int*>(req.data()), static_cast<int>(ctx.is_train), params.info->contexts[kCustomOpForward])
Stack trace returned 7 entries:
[bt] (0) /home/hri/anaconda3/envs/relations/lib/python2.7/site-packages/mxnet/libmxnet.so(+0x30756a) [0x7f50471e156a]
[bt] (1) /home/hri/anaconda3/envs/relations/lib/python2.7/site-packages/mxnet/libmxnet.so(+0x307b91) [0x7f50471e1b91]
[bt] (2) /home/hri/anaconda3/envs/relations/lib/python2.7/site-packages/mxnet/libmxnet.so(+0x4853f7) [0x7f504735f3f7]
[bt] (3) /home/hri/anaconda3/envs/relations/lib/python2.7/site-packages/mxnet/libmxnet.so(+0x46b128) [0x7f5047345128]
[bt] (4) /usr/lib/x86_64-linux-gnu/libstdc++.so.6(+0xb8c80) [0x7f50b2816c80]
[bt] (5) /lib/x86_64-linux-gnu/libpthread.so.0(+0x76ba) [0x7f50b903e6ba]
[bt] (6) /lib/x86_64-linux-gnu/libc.so.6(clone+0x6d) [0x7f50b866441d]
Any help? Thanks
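Not the author, but the IndexError in box_annotator_ohem.py looks like the classic off-by-one: in this family of configs NUM_CLASSES counts background plus foreground, so 188 foreground classes need NUM_CLASSES: 189, while 189 foreground classes need 190. A minimal numpy reproduction of the failing indexing (shapes and labels are illustrative, not taken from the repo):

```python
import numpy as np

num_fg_classes = 189                                   # foreground classes in the converted data
per_roi_loss_cls = np.random.rand(4, num_fg_classes)   # (num_rois, NUM_CLASSES) as configured
labels = np.array([0, 3, 188, 189])                    # 189 appears when background occupies id 0

try:
    per_roi_loss_cls[np.arange(4), labels]             # same indexing as box_annotator_ohem.py
except IndexError as e:
    msg = str(e)                                       # "index 189 is out of bounds for axis 1 with size 189"

# With NUM_CLASSES = foreground + background, the same indexing succeeds:
per_roi_loss_cls = np.random.rand(4, num_fg_classes + 1)
picked = per_roi_loss_cls[np.arange(4), labels]        # one loss value per RoI
```

If the converted dataset really has 188 object categories plus background, NUM_CLASSES: 189 is right and the bad label id would come from the conversion script instead.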
Hello, I converted the scripts to Python 3. My MXNet is mxnet-cu101 version 1.6.0 (for CUDA 10.1). But when I ran train_end2end.py in ./relation_rcnn, I ran into the following error after the config file was read:
'''
'symbol': 'resnet_v1_101_rcnn_dcn_attention_1024_pairwise_position_multi_head_16_learn_nms'}
loading annotations into memory...
Done (t=0.81s)
creating index...
index created!
num_images 7017
wrote gt roidb to ./cache/COCO_train_800x800_gt_roidb.pkl
filtered 0 roidb entries: 7017 -> 7017
[('data', (1, 3, 800, 800))]
[('data', (1, 3, 800, 800))]
[('label', (1, 30000)), ('bbox_target', (1, 48, 50, 50)), ('bbox_weight', (1, 48, 50, 50))]
providing maximum shape [('data', (1, 3, 800, 800)), ('gt_boxes', (1, 100, 5))] [('label', (1, 30000)), ('bbox_target', (1, 48, 50, 50)), ('bbox_weight', (1, 48, 50, 50))]
*********************Input Dictionary *********************
{'data': (1, 3, 800, 800), 'im_info': (1, 3), 'gt_boxes': (1, 13, 5), 'label': (1, 30000), 'bbox_target': (1, 48, 50, 50), 'bbox_weight': (1, 48, 50, 50)}
infer_shape error. Arguments:
data: (1, 3, 800, 800)
im_info: (1, 3)
gt_boxes: (1, 13, 5)
label: (1, 30000)
bbox_target: (1, 48, 50, 50)
bbox_weight: (1, 48, 50, 50)
Traceback (most recent call last):
File "rcnn_end2end_train_test.py", line 23, in
train_end2end.main()
File "../../relation_rcnn/train_end2end.py", line 188, in main
config.TRAIN.begin_epoch, config.TRAIN.end_epoch, config.TRAIN.lr, config.TRAIN.lr_step)
File "../../relation_rcnn/train_end2end.py", line 101, in train_net
sym_instance.infer_shape(data_shape_dict)
File "../../relation_rcnn/../lib/utils/symbol.py", line 39, in infer_shape
arg_shape, out_shape, aux_shape = self.sym.infer_shape(**data_shape_dict)
File "/mnt/keshav/relnet_python3/lib/python3.6/site-packages/mxnet/symbol/symbol.py", line 1103, in infer_shape
res = self._infer_shape_impl(False, *args, **kwargs)
File "/mnt/keshav/relnet_python3/lib/python3.6/site-packages/mxnet/symbol/symbol.py", line 1267, in _infer_shape_impl
ctypes.byref(complete)))
File "/mnt/keshav/relnet_python3/lib/python3.6/site-packages/mxnet/base.py", line 255, in check_call
raise MXNetError(py_str(_LIB.MXGetLastError()))
mxnet.base.MXNetError: Error in operator _plus2: [06:13:23] src/operator/contrib/./../elemwise_op_common.h:135: Check failed: assign(&dattr, vec.at(i)): Incompatible attr in node _plus2 at 1-th input: expected [313,16,300], got [19,16,18]
Stack trace:
[bt] (0) /mnt/keshav/relnet_python3/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x6b8b5b) [0x7f1934352b5b]
[bt] (1) /mnt/keshav/relnet_python3/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x878f39) [0x7f1934512f39]
[bt] (2) /mnt/keshav/relnet_python3/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x8797db) [0x7f19345137db]
[bt] (3) /mnt/keshav/relnet_python3/lib/python3.6/site-packages/mxnet/libmxnet.so(+0xb48036) [0x7f19347e2036]
[bt] (4) /mnt/keshav/relnet_python3/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x382fe3c) [0x7f19374c9e3c]
[bt] (5) /mnt/keshav/relnet_python3/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x38336a8) [0x7f19374cd6a8]
[bt] (6) /mnt/keshav/relnet_python3/lib/python3.6/site-packages/mxnet/libmxnet.so(+0x377bc31) [0x7f1937415c31]
[bt] (7) /mnt/keshav/relnet_python3/lib/python3.6/site-packages/mxnet/libmxnet.so(MXSymbolInferShapeEx+0xc1) [0x7f19374162c1]
[bt] (8) /usr/lib/x86_64-linux-gnu/libffi.so.6(ffi_call_unix64+0x4c) [0x7f1973c75dae]
'''
Can someone please explain what the issue might be?
I'd like to train a Faster R-CNN + Relation Module (as in the README, on COCO) on my own dataset. How do I go about it?
There are several mAP figures in the paper, for example mAP@0.5, mAP@0.75, and mAP@S. Does anyone know what these mean? Is it mean average precision? If so, isn't it very low for a detection result? The mAP for Faster R-CNN is more than 70%, but the best result in this paper is only around 50%, so I think it must mean something different. Many thanks.
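Not the author, but these are the standard COCO metrics: mAP@0.5 and mAP@0.75 are mean average precision at IoU thresholds 0.5 and 0.75, mAP@S is mAP on small objects, and the headline COCO number averages AP over IoU 0.50:0.95. That stricter averaging is why ~50 here is not comparable to the VOC-style mAP@0.5 numbers above 70% often quoted for Faster R-CNN. A small sketch of how the IoU threshold changes what counts as a true positive (boxes are made up):

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

gt = (0, 0, 10, 10)
pred = (2, 0, 12, 10)          # prediction shifted by 2 pixels
overlap = iou(gt, pred)        # 80 / 120 = 0.666...
match_at_50 = overlap >= 0.5   # counts as a true positive for mAP@0.5
match_at_75 = overlap >= 0.75  # but as a miss for mAP@0.75
```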
I have trained on my own dataset after changing the config file, and everything looks good. Here is a snapshot of the training log.
Here is a snapshot of the test log.
I have two questions to ask.
Thanks in advance.
Title. Many thanks for the help!
Here is the location of this function call:
line 33 of
looks like: mx.sym.full((1,), wave_length)
Best,
Chu.
When I'm training on COCO as the README describes, I hit the problem shown in the bold part of the log below, after which NMSLoss_pos and NMSLoss_neg become NaN. Has anyone met the same problem and can offer some help?
('lr', 0.0005, 'lr_epoch_diff', [5.33], 'lr_iters', [625027])
Epoch[0] Batch [100] Speed: 5.08 samples/sec Train-RPNAcc=0.847250, RPNLogLoss=0.376764, RPNL1Loss=0.187504, RCNNAcc=0.801361, RCNNLogLoss=1.674762, RCNNL1Loss=0.311297, NMSLoss_pos=0.035744, NMSLoss_neg=0.016391, NMSAcc_pos=0.000000, NMSAcc_neg=1.000000,
Epoch[0] Batch [200] Speed: 5.10 samples/sec Train-RPNAcc=0.865089, RPNLogLoss=0.328289, RPNL1Loss=0.176516, RCNNAcc=0.811237, RCNNLogLoss=1.380794, RCNNL1Loss=0.316205, NMSLoss_pos=0.048681, NMSLoss_neg=0.013534, NMSAcc_pos=0.000000, NMSAcc_neg=1.000000,
Epoch[0] Batch [300] Speed: 5.11 samples/sec Train-RPNAcc=0.874916, RPNLogLoss=0.302038, RPNL1Loss=0.159570, RCNNAcc=0.802546, RCNNLogLoss=1.319950, RCNNL1Loss=0.352934, NMSLoss_pos=0.057433, NMSLoss_neg=0.013499, NMSAcc_pos=0.000000, NMSAcc_neg=1.000000,
experiments/relation_rcnn/../../relation_rcnn/../lib/bbox/bbox_transform.py:128: RuntimeWarning: overflow encountered in exp
pred_w = np.exp(dw) * widths[:, np.newaxis]
experiments/relation_rcnn/../../relation_rcnn/../lib/bbox/bbox_transform.py:129: RuntimeWarning: overflow encountered in exp
pred_h = np.exp(dh) * heights[:, np.newaxis]
experiments/relation_rcnn/../../relation_rcnn/../lib/bbox/bbox_transform.py:133: RuntimeWarning: invalid value encountered in subtract
pred_boxes[:, 0::4] = pred_ctr_x - 0.5 * (pred_w - 1.0)
experiments/relation_rcnn/../../relation_rcnn/../lib/bbox/bbox_transform.py:135: RuntimeWarning: invalid value encountered in subtract
pred_boxes[:, 1::4] = pred_ctr_y - 0.5 * (pred_h - 1.0)
experiments/relation_rcnn/../../relation_rcnn/../lib/bbox/bbox_transform.py:137: RuntimeWarning: invalid value encountered in add
pred_boxes[:, 2::4] = pred_ctr_x + 0.5 * (pred_w - 1.0)
experiments/relation_rcnn/../../relation_rcnn/../lib/bbox/bbox_transform.py:139: RuntimeWarning: invalid value encountered in add
pred_boxes[:, 3::4] = pred_ctr_y + 0.5 * (pred_h - 1.0)
experiments/relation_rcnn/../../relation_rcnn/operator_py/proposal.py:180: RuntimeWarning: invalid value encountered in greater_equal
keep = np.where((ws >= min_size) & (hs >= min_size))[0]
Epoch[0] Batch [400] Speed: 5.02 samples/sec Train-RPNAcc=0.871289, RPNLogLoss=nan, RPNL1Loss=nan, RCNNAcc=0.810123, RCNNLogLoss=1.576645, RCNNL1Loss=0.334166, NMSLoss_pos=0.054120, NMSLoss_neg=nan, NMSAcc_pos=0.000000, NMSAcc_neg=0.999650,
Epoch[0] Batch [500] Speed: 4.91 samples/sec Train-RPNAcc=0.859804, RPNLogLoss=nan, RPNL1Loss=nan, RCNNAcc=0.836702, RCNNLogLoss=1.888214, RCNNL1Loss=0.267614, NMSLoss_pos=nan, NMSLoss_neg=nan, NMSAcc_pos=0.000000, NMSAcc_neg=0.999720,
Epoch[0] Batch [600] Speed: 4.99 samples/sec Train-RPNAcc=0.850682, RPNLogLoss=nan, RPNL1Loss=nan, RCNNAcc=0.853031, RCNNLogLoss=1.725999, RCNNL1Loss=0.223882, NMSLoss_pos=nan, NMSLoss_neg=nan, NMSAcc_pos=0.000000, NMSAcc_neg=0.999767,
Epoch[0] Batch [700] Speed: 4.98 samples/sec Train-RPNAcc=0.844466, RPNLogLoss=nan, RPNL1Loss=nan, RCNNAcc=0.865544, RCNNLogLoss=1.547918, RCNNL1Loss=0.192278, NMSLoss_pos=nan, NMSLoss_neg=nan, NMSAcc_pos=0.000000, NMSAcc_neg=0.999800,
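For what it's worth, the RuntimeWarnings above show np.exp overflowing in bbox_transform.py, and the resulting inf/NaN then propagates into the losses. Many Faster R-CNN implementations guard against this by clamping dw/dh before the exp; the clip value log(1000/16) below is Detectron's convention, an assumption on my part, not something this repo does:

```python
import numpy as np

BBOX_XFORM_CLIP = np.log(1000.0 / 16.0)   # Detectron's default clamp, assumed here

def safe_pred_wh(dw, dh, widths, heights):
    """Clamp dw/dh so np.exp cannot overflow to inf and poison the later adds."""
    dw = np.minimum(dw, BBOX_XFORM_CLIP)
    dh = np.minimum(dh, BBOX_XFORM_CLIP)
    pred_w = np.exp(dw) * widths[:, np.newaxis]
    pred_h = np.exp(dh) * heights[:, np.newaxis]
    return pred_w, pred_h

# A raw delta of 1000 would overflow np.exp to inf without the clamp:
w, h = safe_pred_wh(np.array([[1000.0]]), np.array([[0.5]]),
                    np.array([16.0]), np.array([16.0]))
```

This only masks the symptom; NaN in RPNLogLoss usually also points at a learning rate or data issue worth checking separately.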
Hi, I want to visualize bounding boxes on a new image rather than on dataset images with annotations.
How should I modify the test.py file?
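Not a maintainer, but for a quick visualization you can bypass the dataset/annotation pipeline entirely: run the trained model on one image and paint its (x1, y1, x2, y2) outputs onto the pixel array yourself. A dependency-free numpy sketch of the drawing step (in a real script you would likely use cv2.rectangle; det is a made-up detection):

```python
import numpy as np

def draw_box(img, box, color=(0, 255, 0)):
    """Paint a 1-pixel rectangle outline in-place on an HxWx3 uint8 image."""
    x1, y1, x2, y2 = [int(round(v)) for v in box]
    img[y1, x1:x2 + 1] = color   # top edge
    img[y2, x1:x2 + 1] = color   # bottom edge
    img[y1:y2 + 1, x1] = color   # left edge
    img[y1:y2 + 1, x2] = color   # right edge
    return img

img = np.zeros((100, 100, 3), dtype=np.uint8)   # stand-in for a loaded image
det = [10.0, 20.0, 40.0, 60.0]                  # hypothetical (x1, y1, x2, y2) from the model
draw_box(img, det)
```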
compilation terminated.
Makefile:393: recipe for target 'build/src/operator/contrib/multibox_detection.o' failed
make: *** [build/src/operator/contrib/multibox_detection.o] Error 1
In file included from /home/fsr/Relation-Networks-for-Object-Detection-master/incubator-mxnet/mshadow/mshadow/tensor.h:16:0,
from include/mxnet/./base.h:32,
from include/mxnet/operator.h:38,
from src/operator/contrib/./deformable_psroi_pooling-inl.h:32,
from src/operator/contrib/deformable_psroi_pooling.cc:27:
/home/fsr/Relation-Networks-for-Object-Detection-master/incubator-mxnet/mshadow/mshadow/./base.h:147:23: fatal error: cblas.h: No such file or directory
compilation terminated.
Makefile:393: recipe for target 'build/src/operator/contrib/deformable_psroi_pooling.o' failed
make: *** [build/src/operator/contrib/deformable_psroi_pooling.o] Error 1
In file included from /home/fsr/Relation-Networks-for-Object-Detection-master/incubator-mxnet/mshadow/mshadow/tensor.h:16:0,
from include/mxnet/./base.h:32,
from include/mxnet/operator.h:38,
from src/operator/contrib/./psroi_pooling-inl.h:14,
from src/operator/contrib/psroi_pooling.cc:28:
/home/fsr/Relation-Networks-for-Object-Detection-master/incubator-mxnet/mshadow/mshadow/./base.h:147:23: fatal error: cblas.h: No such file or directory
compilation terminated.
Makefile:393: recipe for target 'build/src/operator/contrib/psroi_pooling.o' failed
make: *** [build/src/operator/contrib/psroi_pooling.o] Error 1
Is it essential to exclude the ground-truth boxes when training with the relation module? And why?
After training for several epochs, it raised the following error:
Epoch[7] Batch [2280] Speed: 3.99 samples/sec Train-RPNAcc=0.995009, RPNLogLoss=0.016979, RPNL1Loss=0.034412, RCNNAcc=0.797916, RCNNLogLoss=0.433694, RCNNL1Loss=0.423676,
Epoch[7] Batch [2300] Speed: 3.96 samples/sec Train-RPNAcc=0.995029, RPNLogLoss=0.016926, RPNL1Loss=0.034388, RCNNAcc=0.797866, RCNNLogLoss=0.433422, RCNNL1Loss=0.423580,
Epoch[7] Batch [2320] Speed: 3.69 samples/sec Train-RPNAcc=0.995011, RPNLogLoss=0.017027, RPNL1Loss=0.034292, RCNNAcc=0.798346, RCNNLogLoss=0.432565, RCNNL1Loss=0.422573,
[17:51:53] /home/fallingdust/workspace/mxnet/dmlc-core/include/dmlc/./logging.h:308: [17:51:53] src/engine/naive_engine.cc:168: Check failed: this->req_completed_ NaiveEngine only support synchronize Push so far
Stack trace returned 10 entries:
[bt] (0) /usr/local/lib/python2.7/dist-packages/mxnet-1.0.0-py2.7.egg/mxnet/libmxnet.so(_ZN5mxnet6engine11NaiveEngine9PushAsyncESt8functionIFvNS_10RunContextENS0_18CallbackOnCompleteEEENS_7ContextERKSt6vectorIPNS0_3VarESaISA_EESE_NS_10FnPropertyEiPKc+0x3b3) [0x7f921cb635a3]
[bt] (1) /usr/local/lib/python2.7/dist-packages/mxnet-1.0.0-py2.7.egg/mxnet/libmxnet.so(_ZN5mxnet6engine11NaiveEngine4PushEPNS0_3OprENS_7ContextEib+0x8f) [0x7f921cb644af]
[bt] (2) /usr/local/lib/python2.7/dist-packages/mxnet-1.0.0-py2.7.egg/mxnet/libmxnet.so(_ZN5mxnet4exec13GraphExecutor6RunOpsEbmm+0x724) [0x7f921cc08e84]
[bt] (3) /usr/local/lib/python2.7/dist-packages/mxnet-1.0.0-py2.7.egg/mxnet/libmxnet.so(MXExecutorForward+0x11) [0x7f921cb9ab81]
[bt] (4) /usr/lib/x86_64-linux-gnu/libffi.so.6(ffi_call_unix64+0x4c) [0x7f9230c02e40]
[bt] (5) /usr/lib/x86_64-linux-gnu/libffi.so.6(ffi_call+0x2eb) [0x7f9230c028ab]
[bt] (6) /usr/lib/python2.7/lib-dynload/_ctypes.x86_64-linux-gnu.so(_ctypes_callproc+0x48f) [0x7f9230e123df]
[bt] (7) /usr/lib/python2.7/lib-dynload/_ctypes.x86_64-linux-gnu.so(+0x11d82) [0x7f9230e16d82]
[bt] (8) python(PyObject_Call+0x43) [0x4b0c93]
[bt] (9) python(PyEval_EvalFrameEx+0x602f) [0x4c9f9f]
Traceback (most recent call last):
File "experiments/relation_rcnn/rcnn_end2end_train_test.py", line 21, in <module>
train_end2end.main()
File "experiments/relation_rcnn/../../relation_rcnn/train_end2end.py", line 193, in main
config.TRAIN.begin_epoch, config.TRAIN.end_epoch, config.TRAIN.lr, config.TRAIN.lr_step)
File "experiments/relation_rcnn/../../relation_rcnn/train_end2end.py", line 186, in train_net
arg_params=arg_params, aux_params=aux_params, begin_epoch=begin_epoch, num_epoch=end_epoch)
File "experiments/relation_rcnn/../../relation_rcnn/core/module.py", line 999, in fit
self.forward_backward(data_batch)
File "/usr/local/lib/python2.7/dist-packages/mxnet-1.0.0-py2.7.egg/mxnet/module/base_module.py", line 191, in forward_backward
self.forward(data_batch, is_train=True)
File "experiments/relation_rcnn/../../relation_rcnn/core/module.py", line 1074, in forward
self._curr_module.forward(data_batch, is_train=is_train)
File "experiments/relation_rcnn/../../relation_rcnn/core/module.py", line 554, in forward
self._exec_group.forward(data_batch, is_train)
File "experiments/relation_rcnn/../../relation_rcnn/core/DataParallelExecutorGroup.py", line 360, in forward
exec_.forward(is_train=is_train)
File "/usr/local/lib/python2.7/dist-packages/mxnet-1.0.0-py2.7.egg/mxnet/executor.py", line 150, in forward
ctypes.c_int(int(is_train))))
File "/usr/local/lib/python2.7/dist-packages/mxnet-1.0.0-py2.7.egg/mxnet/base.py", line 146, in check_call
raise MXNetError(py_str(_LIB.MXGetLastError()))
mxnet.base.MXNetError: [17:51:53] src/engine/naive_engine.cc:168: Check failed: this->req_completed_ NaiveEngine only support synchronize Push so far
Stack trace returned 10 entries:
[bt] (0) /usr/local/lib/python2.7/dist-packages/mxnet-1.0.0-py2.7.egg/mxnet/libmxnet.so(_ZN5mxnet6engine11NaiveEngine9PushAsyncESt8functionIFvNS_10RunContextENS0_18CallbackOnCompleteEEENS_7ContextERKSt6vectorIPNS0_3VarESaISA_EESE_NS_10FnPropertyEiPKc+0x3b3) [0x7f921cb635a3]
[bt] (1) /usr/local/lib/python2.7/dist-packages/mxnet-1.0.0-py2.7.egg/mxnet/libmxnet.so(_ZN5mxnet6engine11NaiveEngine4PushEPNS0_3OprENS_7ContextEib+0x8f) [0x7f921cb644af]
[bt] (2) /usr/local/lib/python2.7/dist-packages/mxnet-1.0.0-py2.7.egg/mxnet/libmxnet.so(_ZN5mxnet4exec13GraphExecutor6RunOpsEbmm+0x724) [0x7f921cc08e84]
[bt] (3) /usr/local/lib/python2.7/dist-packages/mxnet-1.0.0-py2.7.egg/mxnet/libmxnet.so(MXExecutorForward+0x11) [0x7f921cb9ab81]
[bt] (4) /usr/lib/x86_64-linux-gnu/libffi.so.6(ffi_call_unix64+0x4c) [0x7f9230c02e40]
[bt] (5) /usr/lib/x86_64-linux-gnu/libffi.so.6(ffi_call+0x2eb) [0x7f9230c028ab]
[bt] (6) /usr/lib/python2.7/lib-dynload/_ctypes.x86_64-linux-gnu.so(_ctypes_callproc+0x48f) [0x7f9230e123df]
[bt] (7) /usr/lib/python2.7/lib-dynload/_ctypes.x86_64-linux-gnu.so(+0x11d82) [0x7f9230e16d82]
[bt] (8) python(PyObject_Call+0x43) [0x4b0c93]
[bt] (9) python(PyEval_EvalFrameEx+0x602f) [0x4c9f9f]
[17:51:53] src/engine/naive_engine.cc:55: Engine shutdown
I trained the network on just one GPU using CUDA_VISIBLE_DEVICES=0, although I have two GPUs.
Please give me some advice or help to fix this. Thank you.
Thank you for the wonderful code! The std and mean of the boxes are stored in the params of the last regression layer. When cfg.TRAIN.RESUME is True, should they be restored?
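Not the author, but for context: a common Faster R-CNN trick (which I believe this family of codebases uses; treat that as an assumption) is to fold the bbox stds/means into the final regression layer at snapshot time, so inference needs no extra de-normalization. If that is what happens here, then on RESUME the fold would indeed have to be undone before training continues. A numpy sketch of the round trip (all names and shapes illustrative):

```python
import numpy as np

stds = np.tile(np.array([0.1, 0.1, 0.2, 0.2]), 2)   # per-target stds, 2 classes in this toy
means = np.zeros_like(stds)

weight = np.random.rand(8, 1024)   # bbox_pred weight, (4 * num_classes, feat_dim)
bias = np.random.rand(8)

# Fold for the snapshot: predictions then come out already de-normalized.
w_saved = weight * stds[:, np.newaxis]
b_saved = bias * stds + means

# Un-fold when resuming training from that snapshot.
w_resumed = w_saved / stds[:, np.newaxis]
b_resumed = (b_saved - means) / stds
```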
Can this project run on Windows?
Hi! Does anyone know how 'nongt_dim' works?
Hi,
I used resnet_v1_101_coco_trainvalminus_rcnn_dcn_end2end_relation_learn_nms_8epoch.yaml to train a model. When I use the trained model for inference on test images, I find multiple boxes on the same object. Is the NMS layer not trained thoroughly? Do I need to add NMS to remove the duplicate boxes?
Hello,
Does anyone know how to add a validation pass at the end of each epoch?
Thanks!!
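Not a maintainer, but with stock MXNet the usual route is Module.fit's eval_data / epoch_end_callback arguments; whether this repo's customized core/module.py forwards them identically is an assumption you would need to verify. A dependency-free sketch of the callback shape fit() invokes at each epoch end:

```python
def make_validation_callback(evaluate_fn):
    """Wrap evaluate_fn() -> float score into an epoch-end callback."""
    history = []

    def _callback(epoch, *unused):   # loosely matches mx epoch_end_callback(epoch, sym, arg, aux)
        score = evaluate_fn()        # run your own validation here
        history.append((epoch, score))
        print('epoch %d validation score %.4f' % (epoch, score))

    _callback.history = history
    return _callback

cb = make_validation_callback(lambda: 0.5)   # stand-in evaluator
for epoch in range(2):
    cb(epoch, None, None, None)              # what fit() would do after each epoch
```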
Hi, thanks for your work! I have a question about greedy-NMS usage: do you completely replace NMS with your duplicate removal network, or use it in addition to greedy-NMS? From Table 1 it seems that greedy-NMS is still part of the pipeline...
Hello, I am Chaojie from Renmin University of China, Beijing.
Thanks for your excellent work in object detection.
But I have a problem when I try it with my own data.
Training and testing work fine for the rcnn_attention and rcnn_dcn symbols, but for the rcnn_fpn symbol I have to provide previously generated proposals in a directory like "./proposal/resnet_v1_101_fpn/rpn_data/{}_{}_rpn.pkl". Could you please give me tips on how to generate this file for another dataset with COCO-style annotations? I am dying to know. Thank you very much!
Does it mean the first N proposals or the first N detections for one image?
(Referring to the configs in resnet_v1_101_coco_trainvalminus_rcnn_end2end_relation_learn_nms_8epoch.yaml.)
Hi, I have prepared all of the data, the pretrained model, and the environment settings, then I ran the command below:
python experiments\relation_rcnn\rcnn_end2end_train_test.py --cfg experiments/relation_rcnn/cfgs/resnet_v1_101_coco_trainvalminus_rcnn_end2end_relation_learn_nms_8epoch.yaml
However, I got the error message below. After I uninstalled and reinstalled numpy, the error moved to skimage.
When I run sh ./init.sh, an error occurs:
x86_64-linux-gnu-gcc: error: maskApi.c: No such file or directory
I have installed Cython. It seems to happen when running
python setup_linux.py build_ext --inplace
in lib/dataset/pycocotools/.
I reproduced the code with a VGG16 backbone and found that the mAP is the same as Faster R-CNN's. Does the relation module work with VGG16? Thank you!
I don't know what "nongt_dim" means in extract_position_embedding and attention_module_embedding. If it means the number of RoIs that are not ground truth, how do we determine which RoIs are not ground truth during training? Maybe I misunderstand it. Could anyone help me?
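Not the author, so treat this as my reading of the code rather than a definitive answer: during training the ground-truth boxes are appended after the sampled proposals, so the first nongt_dim rows are by construction the non-gt RoIs, and the attention keys are sliced to that range; at test time no gt boxes exist and nongt_dim is simply the number of proposals. A numpy sketch of that layout and slice:

```python
import numpy as np

num_proposals, num_gt, dim = 6, 2, 4
# Sampled RoI features first, appended ground-truth rows last (assumed layout).
rois_feat = np.random.rand(num_proposals + num_gt, dim)

nongt_dim = num_proposals
keys = rois_feat[:nongt_dim]   # what slice_axis(..., end=nongt_dim) keeps for attention keys
```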
Hi,
I just found that the OneDrive link for the pre-trained models (ResNet-101, Faster R-CNN and FPN) is not working; it fails with ERR_ADDRESS_UNREACHABLE.
Thanks.
/home/dlc/anaconda3/envs/torch3_py27/bin/python2.7 /home/dlc/sll/Relation-Networks-for-Object-Detection-master/experiments/relation_rcnn/rcnn_end2end_train_test.py --cfg experiments/relation_rcnn/cfgs/resnet_v1_101_coco_trainvalminus_rcnn_fpn_relation_learn_nms_8epoch.yaml
('Called with argument:', Namespace(cfg='experiments/relation_rcnn/cfgs/resnet_v1_101_coco_trainvalminus_rcnn_fpn_relation_learn_nms_8epoch.yaml', frequent=100))
Traceback (most recent call last):
File "/home/dlc/sll/Relation-Networks-for-Object-Detection-master/experiments/relation_rcnn/rcnn_end2end_train_test.py", line 21, in
train_end2end.main()
File "/home/dlc/sll/Relation-Networks-for-Object-Detection-master/experiments/relation_rcnn/../../relation_rcnn/train_end2end.py", line 184, in main
config.TRAIN.model_prefix,config.TRAIN.begin_epoch, config.TRAIN.end_epoch, config.TRAIN.lr, config.TRAIN.lr_step)
File "/home/dlc/sll/Relation-Networks-for-Object-Detection-master/experiments/relation_rcnn/../../relation_rcnn/train_end2end.py", line 66, in train_net
sym = sym_instance.get_symbol(config, is_train=True)
File "/home/dlc/sll/Relation-Networks-for-Object-Detection-master/experiments/relation_rcnn/../../relation_rcnn/../lib/utils/symbol.py", line 25, in get_symbol
raise NotImplementedError()
NotImplementedError
I'm begging for a solution, please help!
Hi! Regarding rois -> sliced_rois:
sliced_rois = mx.sym.slice_axis(rois, axis=1, begin=1, end=None)
Why does "sliced_rois" start from index 1?
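Not the author, but in this codebase, as in most Faster R-CNN ports, each RoI row is (batch_index, x1, y1, x2, y2): column 0 tells the pooling layer which image in the batch the box belongs to. Slicing axis 1 from index 1 just drops that batch-index column and keeps the coordinates. A numpy equivalent of the slice_axis call:

```python
import numpy as np

rois = np.array([[0, 10, 20, 50, 80],    # (batch_index, x1, y1, x2, y2)
                 [0, 15, 25, 55, 85]], dtype=np.float32)

# Same effect as mx.sym.slice_axis(rois, axis=1, begin=1, end=None):
sliced_rois = rois[:, 1:]
```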
I don't understand the code below, in resnet_v1_101_rcnn_attention_1024_pairwise_position_multi_head_16_learn_nms.py. You add one of the Relation Module's inputs and the Relation Module's output together as the input to the later layer, but you did not explain why you do this in the paper.
nms_attention_1, nms_softmax_1 = self.attention_module_nms_multi_head(
nms_embedding_feat, nms_position_matrix,
num_rois=first_n, index=1, group=16,
dim=(1024, 1024, 128), fc_dim=(64, 16), feat_dim=128
)
nms_all_feat_1 = nms_embedding_feat + nms_attention_1
I think the network figure should include this green path.
By the way, nms_softmax_1 is redundant; I personally recommend deleting it in the released code.
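For what it's worth, adding a module's input to its output is a standard residual (skip) connection, the same pattern used around attention in the Transformer: the attention branch only has to learn a correction on top of the original feature, which eases optimization. A toy numpy sketch of the pattern (toy_attention stands in for attention_module_nms_multi_head):

```python
import numpy as np

def toy_attention(x):
    """Stand-in for an attention branch: any transform with the same output shape."""
    w = np.full((x.shape[-1], x.shape[-1]), 0.01)
    return x @ w

nms_embedding_feat = np.random.rand(5, 128)
nms_attention_1 = toy_attention(nms_embedding_feat)
nms_all_feat_1 = nms_embedding_feat + nms_attention_1   # residual add, as in the code above
```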
Could you please show us how to arrange the COCO dataset's directory layout (including train/val/test)?
I found differences between your cfg files and the official COCO dataset, e.g. valminusminival2014 and minival2014.
The annotation files I pulled from COCO's official site are annotations_trainval2014.zip and image_info_test2015.zip. The contents of the zip files look like this for trainval2014 and test2015 respectively:
Hi,
there is an error when calling output_shapes (https://github.com/msracver/Relation-Networks-for-Object-Detection/blob/master/relation_rcnn/core/module.py#L216)
File "experiments/relation_rcnn/rcnn_end2end_train_test.py", line 21, in <module>
train_end2end.main()
File "experiments/relation_rcnn/../../relation_rcnn/train_end2end.py", line 184, in main
config.TRAIN.begin_epoch, config.TRAIN.end_epoch, config.TRAIN.lr, config.TRAIN.lr_step)
File "experiments/relation_rcnn/../../relation_rcnn/train_end2end.py", line 177, in train_net
arg_params=arg_params, aux_params=aux_params, begin_epoch=begin_epoch, num_epoch=end_epoch)
File "experiments/relation_rcnn/../../relation_rcnn/core/module.py", line 1001, in fit
print ('output shape {}'.format(self.output_shapes))
File "experiments/relation_rcnn/../../relation_rcnn/core/module.py", line 801, in output_shapes
return self._curr_module.output_shapes
File "experiments/relation_rcnn/../../relation_rcnn/core/module.py", line 223, in output_shapes
return self._exec_group.get_output_shapes()
AttributeError: 'DataParallelExecutorGroup' object has no attribute 'get_output_shapes'
Any advice?
Hi! This is great work!
But I wonder why the FPN baseline (2FC + soft-NMS(0.6), ResNet-101) at 36.8 mAP is lower than the Detectron FPN baseline (R-101-FPN) at 38.5 mAP. Both use ResNet-101 and pre-computed proposals. Is there anything different in the implementation?
When there are overlapping bboxes on the same object, do the overlapping bboxes influence the computed mAP or not?
Will you provide support for VOC-format datasets?
I can't download the pretrained model ResNet-v1-101 from https://1drv.ms/u/s!Am-5JzdW2XHzhqpCxvNTMZDlcDTpSA. Can you check this link or provide another one?
2018-06-26 10:15:50,538 {'bbox_pred_bias': (8L,),
'bbox_pred_weight': (8L, 1024L),
'bbox_target': (1L, 48L, 38L, 38L),
'bbox_weight': (1L, 48L, 38L, 38L),
'bn2a_branch1_beta': (256L,),
'bn2a_branch1_gamma': (256L,),
'bn2a_branch2a_beta': (64L,),
'bn2a_branch2a_gamma': (64L,),
'bn2a_branch2b_beta': (64L,),
'bn2a_branch2b_gamma': (64L,),
'bn2a_branch2c_beta': (256L,),
'bn2a_branch2c_gamma': (256L,),
'bn2b_branch2a_beta': (64L,),
'bn2b_branch2a_gamma': (64L,),
'bn2b_branch2b_beta': (64L,),
'bn2b_branch2b_gamma': (64L,),
'bn2b_branch2c_beta': (256L,),
'bn2b_branch2c_gamma': (256L,),
'bn2c_branch2a_beta': (64L,),
'bn2c_branch2a_gamma': (64L,),
'bn2c_branch2b_beta': (64L,),
'bn2c_branch2b_gamma': (64L,),
'bn2c_branch2c_beta': (256L,),
'bn2c_branch2c_gamma': (256L,),
'bn3a_branch1_beta': (512L,),
'bn3a_branch1_gamma': (512L,),
'bn3a_branch2a_beta': (128L,),
'bn3a_branch2a_gamma': (128L,),
'bn3a_branch2b_beta': (128L,),
'bn3a_branch2b_gamma': (128L,),
'bn3a_branch2c_beta': (512L,),
'bn3a_branch2c_gamma': (512L,),
'bn3b1_branch2a_beta': (128L,),
'bn3b1_branch2a_gamma': (128L,),
'bn3b1_branch2b_beta': (128L,),
'bn3b1_branch2b_gamma': (128L,),
'bn3b1_branch2c_beta': (512L,),
'bn3b1_branch2c_gamma': (512L,),
'bn3b2_branch2a_beta': (128L,),
'bn3b2_branch2a_gamma': (128L,),
'bn3b2_branch2b_beta': (128L,),
'bn3b2_branch2b_gamma': (128L,),
'bn3b2_branch2c_beta': (512L,),
'bn3b2_branch2c_gamma': (512L,),
'bn3b3_branch2a_beta': (128L,),
'bn3b3_branch2a_gamma': (128L,),
'bn3b3_branch2b_beta': (128L,),
'bn3b3_branch2b_gamma': (128L,),
'bn3b3_branch2c_beta': (512L,),
'bn3b3_branch2c_gamma': (512L,),
'bn4a_branch1_beta': (1024L,),
'bn4a_branch1_gamma': (1024L,),
'bn4a_branch2a_beta': (256L,),
'bn4a_branch2a_gamma': (256L,),
'bn4a_branch2b_beta': (256L,),
'bn4a_branch2b_gamma': (256L,),
'bn4a_branch2c_beta': (1024L,),
'bn4a_branch2c_gamma': (1024L,),
'bn4b10_branch2a_beta': (256L,),
'bn4b10_branch2a_gamma': (256L,),
'bn4b10_branch2b_beta': (256L,),
'bn4b10_branch2b_gamma': (256L,),
'bn4b10_branch2c_beta': (1024L,),
'bn4b10_branch2c_gamma': (1024L,),
'bn4b11_branch2a_beta': (256L,),
'bn4b11_branch2a_gamma': (256L,),
'bn4b11_branch2b_beta': (256L,),
'bn4b11_branch2b_gamma': (256L,),
'bn4b11_branch2c_beta': (1024L,),
'bn4b11_branch2c_gamma': (1024L,),
'bn4b12_branch2a_beta': (256L,),
'bn4b12_branch2a_gamma': (256L,),
'bn4b12_branch2b_beta': (256L,),
'bn4b12_branch2b_gamma': (256L,),
'bn4b12_branch2c_beta': (1024L,),
'bn4b12_branch2c_gamma': (1024L,),
'bn4b13_branch2a_beta': (256L,),
'bn4b13_branch2a_gamma': (256L,),
'bn4b13_branch2b_beta': (256L,),
'bn4b13_branch2b_gamma': (256L,),
'bn4b13_branch2c_beta': (1024L,),
'bn4b13_branch2c_gamma': (1024L,),
'bn4b14_branch2a_beta': (256L,),
'bn4b14_branch2a_gamma': (256L,),
'bn4b14_branch2b_beta': (256L,),
'bn4b14_branch2b_gamma': (256L,),
'bn4b14_branch2c_beta': (1024L,),
'bn4b14_branch2c_gamma': (1024L,),
'bn4b15_branch2a_beta': (256L,),
'bn4b15_branch2a_gamma': (256L,),
'bn4b15_branch2b_beta': (256L,),
'bn4b15_branch2b_gamma': (256L,),
'bn4b15_branch2c_beta': (1024L,),
'bn4b15_branch2c_gamma': (1024L,),
'bn4b16_branch2a_beta': (256L,),
'bn4b16_branch2a_gamma': (256L,),
'bn4b16_branch2b_beta': (256L,),
'bn4b16_branch2b_gamma': (256L,),
'bn4b16_branch2c_beta': (1024L,),
'bn4b16_branch2c_gamma': (1024L,),
'bn4b17_branch2a_beta': (256L,),
'bn4b17_branch2a_gamma': (256L,),
'bn4b17_branch2b_beta': (256L,),
'bn4b17_branch2b_gamma': (256L,),
'bn4b17_branch2c_beta': (1024L,),
'bn4b17_branch2c_gamma': (1024L,),
'bn4b18_branch2a_beta': (256L,),
'bn4b18_branch2a_gamma': (256L,),
'bn4b18_branch2b_beta': (256L,),
'bn4b18_branch2b_gamma': (256L,),
'bn4b18_branch2c_beta': (1024L,),
'bn4b18_branch2c_gamma': (1024L,),
'bn4b19_branch2a_beta': (256L,),
'bn4b19_branch2a_gamma': (256L,),
'bn4b19_branch2b_beta': (256L,),
'bn4b19_branch2b_gamma': (256L,),
'bn4b19_branch2c_beta': (1024L,),
'bn4b19_branch2c_gamma': (1024L,),
'bn4b1_branch2a_beta': (256L,),
'bn4b1_branch2a_gamma': (256L,),
'bn4b1_branch2b_beta': (256L,),
'bn4b1_branch2b_gamma': (256L,),
'bn4b1_branch2c_beta': (1024L,),
'bn4b1_branch2c_gamma': (1024L,),
'bn4b20_branch2a_beta': (256L,),
'bn4b20_branch2a_gamma': (256L,),
'bn4b20_branch2b_beta': (256L,),
'bn4b20_branch2b_gamma': (256L,),
'bn4b20_branch2c_beta': (1024L,),
'bn4b20_branch2c_gamma': (1024L,),
'bn4b21_branch2a_beta': (256L,),
'bn4b21_branch2a_gamma': (256L,),
'bn4b21_branch2b_beta': (256L,),
'bn4b21_branch2b_gamma': (256L,),
'bn4b21_branch2c_beta': (1024L,),
'bn4b21_branch2c_gamma': (1024L,),
'bn4b22_branch2a_beta': (256L,),
'bn4b22_branch2a_gamma': (256L,),
'bn4b22_branch2b_beta': (256L,),
'bn4b22_branch2b_gamma': (256L,),
'bn4b22_branch2c_beta': (1024L,),
'bn4b22_branch2c_gamma': (1024L,),
'bn4b2_branch2a_beta': (256L,),
'bn4b2_branch2a_gamma': (256L,),
'bn4b2_branch2b_beta': (256L,),
'bn4b2_branch2b_gamma': (256L,),
'bn4b2_branch2c_beta': (1024L,),
'bn4b2_branch2c_gamma': (1024L,),
'bn4b3_branch2a_beta': (256L,),
'bn4b3_branch2a_gamma': (256L,),
'bn4b3_branch2b_beta': (256L,),
'bn4b3_branch2b_gamma': (256L,),
'bn4b3_branch2c_beta': (1024L,),
'bn4b3_branch2c_gamma': (1024L,),
'bn4b4_branch2a_beta': (256L,),
'bn4b4_branch2a_gamma': (256L,),
'bn4b4_branch2b_beta': (256L,),
'bn4b4_branch2b_gamma': (256L,),
'bn4b4_branch2c_beta': (1024L,),
'bn4b4_branch2c_gamma': (1024L,),
'bn4b5_branch2a_beta': (256L,),
'bn4b5_branch2a_gamma': (256L,),
'bn4b5_branch2b_beta': (256L,),
'bn4b5_branch2b_gamma': (256L,),
'bn4b5_branch2c_beta': (1024L,),
'bn4b5_branch2c_gamma': (1024L,),
'bn4b6_branch2a_beta': (256L,),
'bn4b6_branch2a_gamma': (256L,),
'bn4b6_branch2b_beta': (256L,),
'bn4b6_branch2b_gamma': (256L,),
'bn4b6_branch2c_beta': (1024L,),
'bn4b6_branch2c_gamma': (1024L,),
'bn4b7_branch2a_beta': (256L,),
'bn4b7_branch2a_gamma': (256L,),
'bn4b7_branch2b_beta': (256L,),
'bn4b7_branch2b_gamma': (256L,),
'bn4b7_branch2c_beta': (1024L,),
'bn4b7_branch2c_gamma': (1024L,),
'bn4b8_branch2a_beta': (256L,),
'bn4b8_branch2a_gamma': (256L,),
'bn4b8_branch2b_beta': (256L,),
'bn4b8_branch2b_gamma': (256L,),
'bn4b8_branch2c_beta': (1024L,),
'bn4b8_branch2c_gamma': (1024L,),
'bn4b9_branch2a_beta': (256L,),
'bn4b9_branch2a_gamma': (256L,),
'bn4b9_branch2b_beta': (256L,),
'bn4b9_branch2b_gamma': (256L,),
'bn4b9_branch2c_beta': (1024L,),
'bn4b9_branch2c_gamma': (1024L,),
'bn5a_branch1_beta': (2048L,),
'bn5a_branch1_gamma': (2048L,),
'bn5a_branch2a_beta': (512L,),
'bn5a_branch2a_gamma': (512L,),
'bn5a_branch2b_beta': (512L,),
'bn5a_branch2b_gamma': (512L,),
'bn5a_branch2c_beta': (2048L,),
'bn5a_branch2c_gamma': (2048L,),
'bn5b_branch2a_beta': (512L,),
'bn5b_branch2a_gamma': (512L,),
'bn5b_branch2b_beta': (512L,),
'bn5b_branch2b_gamma': (512L,),
'bn5b_branch2c_beta': (2048L,),
'bn5b_branch2c_gamma': (2048L,),
'bn5c_branch2a_beta': (512L,),
'bn5c_branch2a_gamma': (512L,),
'bn5c_branch2b_beta': (512L,),
'bn5c_branch2b_gamma': (512L,),
'bn5c_branch2c_beta': (2048L,),
'bn5c_branch2c_gamma': (2048L,),
'bn_conv1_beta': (64L,),
'bn_conv1_gamma': (64L,),
'cls_score_bias': (11L,),
'cls_score_weight': (11L, 1024L),
'conv1_weight': (64L, 3L, 7L, 7L),
'conv_new_1_bias': (256L,),
'conv_new_1_weight': (256L, 2048L, 1L, 1L),
'data': (1L, 3L, 600L, 600L),
'fc_new_1_bias': (1024L,),
'fc_new_1_weight': (1024L, 12544L),
'fc_new_2_bias': (1024L,),
'fc_new_2_weight': (1024L, 1024L),
'gt_boxes': (1L, 19L, 5L),
'im_info': (1L, 3L),
'key_1_bias': (1024L,),
'key_1_weight': (1024L, 1024L),
'key_2_bias': (1024L,),
'key_2_weight': (1024L, 1024L),
'label': (1L, 17328L),
'linear_out_1_bias': (1024L,),
'linear_out_1_weight': (1024L, 1024L, 1L, 1L),
'linear_out_2_bias': (1024L,),
'linear_out_2_weight': (1024L, 1024L, 1L, 1L),
'nms_key_1_bias': (1024L,),
'nms_key_1_weight': (1024L, 128L),
'nms_linear_out_1_bias': (128L,),
'nms_linear_out_1_weight': (128L, 128L, 1L, 1L),
'nms_logit_bias': (5L,),
'nms_logit_weight': (5L, 128L),
'nms_pair_pos_fc1_1_bias': (16L,),
'nms_pair_pos_fc1_1_weight': (16L, 64L),
'nms_query_1_bias': (1024L,),
'nms_query_1_weight': (1024L, 128L),
'nms_rank_bias': (128L,),
'nms_rank_weight': (128L, 1024L),
'pair_pos_fc1_1_bias': (16L,),
'pair_pos_fc1_1_weight': (16L, 64L),
'pair_pos_fc1_2_bias': (16L,),
'pair_pos_fc1_2_weight': (16L, 64L),
'query_1_bias': (1024L,),
'query_1_weight': (1024L, 1024L),
'query_2_bias': (1024L,),
'query_2_weight': (1024L, 1024L),
'res2a_branch1_weight': (256L, 64L, 1L, 1L),
'res2a_branch2a_weight': (64L, 64L, 1L, 1L),
'res2a_branch2b_weight': (64L, 64L, 3L, 3L),
'res2a_branch2c_weight': (256L, 64L, 1L, 1L),
'res2b_branch2a_weight': (64L, 256L, 1L, 1L),
'res2b_branch2b_weight': (64L, 64L, 3L, 3L),
'res2b_branch2c_weight': (256L, 64L, 1L, 1L),
'res2c_branch2a_weight': (64L, 256L, 1L, 1L),
'res2c_branch2b_weight': (64L, 64L, 3L, 3L),
'res2c_branch2c_weight': (256L, 64L, 1L, 1L),
'res3a_branch1_weight': (512L, 256L, 1L, 1L),
'res3a_branch2a_weight': (128L, 256L, 1L, 1L),
'res3a_branch2b_weight': (128L, 128L, 3L, 3L),
'res3a_branch2c_weight': (512L, 128L, 1L, 1L),
'res3b1_branch2a_weight': (128L, 512L, 1L, 1L),
'res3b1_branch2b_weight': (128L, 128L, 3L, 3L),
'res3b1_branch2c_weight': (512L, 128L, 1L, 1L),
'res3b2_branch2a_weight': (128L, 512L, 1L, 1L),
'res3b2_branch2b_weight': (128L, 128L, 3L, 3L),
'res3b2_branch2c_weight': (512L, 128L, 1L, 1L),
'res3b3_branch2a_weight': (128L, 512L, 1L, 1L),
'res3b3_branch2b_weight': (128L, 128L, 3L, 3L),
'res3b3_branch2c_weight': (512L, 128L, 1L, 1L),
'res4a_branch1_weight': (1024L, 512L, 1L, 1L),
'res4a_branch2a_weight': (256L, 512L, 1L, 1L),
'res4a_branch2b_weight': (256L, 256L, 3L, 3L),
'res4a_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b10_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b10_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b10_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b11_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b11_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b11_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b12_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b12_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b12_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b13_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b13_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b13_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b14_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b14_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b14_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b15_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b15_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b15_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b16_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b16_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b16_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b17_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b17_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b17_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b18_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b18_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b18_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b19_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b19_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b19_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b1_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b1_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b1_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b20_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b20_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b20_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b21_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b21_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b21_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b22_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b22_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b22_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b2_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b2_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b2_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b3_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b3_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b3_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b4_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b4_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b4_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b5_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b5_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b5_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b6_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b6_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b6_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b7_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b7_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b7_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b8_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b8_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b8_branch2c_weight': (1024L, 256L, 1L, 1L),
'res4b9_branch2a_weight': (256L, 1024L, 1L, 1L),
'res4b9_branch2b_weight': (256L, 256L, 3L, 3L),
'res4b9_branch2c_weight': (1024L, 256L, 1L, 1L),
'res5a_branch1_weight': (2048L, 1024L, 1L, 1L),
'res5a_branch2a_weight': (512L, 1024L, 1L, 1L),
'res5a_branch2b_weight': (512L, 512L, 3L, 3L),
'res5a_branch2c_weight': (2048L, 512L, 1L, 1L),
'res5b_branch2a_weight': (512L, 2048L, 1L, 1L),
'res5b_branch2b_weight': (512L, 512L, 3L, 3L),
'res5b_branch2c_weight': (2048L, 512L, 1L, 1L),
'res5c_branch2a_weight': (512L, 2048L, 1L, 1L),
'res5c_branch2b_weight': (512L, 512L, 3L, 3L),
'res5c_branch2c_weight': (2048L, 512L, 1L, 1L),
'roi_feat_embedding_bias': (128L,),
'roi_feat_embedding_weight': (128L, 1024L),
'rpn_bbox_pred_bias': (48L,),
'rpn_bbox_pred_weight': (48L, 512L, 1L, 1L),
'rpn_cls_score_bias': (24L,),
'rpn_cls_score_weight': (24L, 512L, 1L, 1L),
'rpn_conv_3x3_bias': (512L,),
'rpn_conv_3x3_weight': (512L, 1024L, 3L, 3L)}
2018-06-26 10:15:51,964 conv1_weight is fixed.
2018-06-26 10:15:51,965 bn_conv1_gamma is fixed.
2018-06-26 10:15:51,966 bn_conv1_beta is fixed.
2018-06-26 10:15:51,966 res2a_branch1_weight is fixed.
2018-06-26 10:15:51,966 bn2a_branch1_gamma is fixed.
2018-06-26 10:15:51,966 bn2a_branch1_beta is fixed.
2018-06-26 10:15:51,967 res2a_branch2a_weight is fixed.
2018-06-26 10:15:51,967 bn2a_branch2a_gamma is fixed.
2018-06-26 10:15:51,967 bn2a_branch2a_beta is fixed.
2018-06-26 10:15:51,967 res2a_branch2b_weight is fixed.
2018-06-26 10:15:51,967 bn2a_branch2b_gamma is fixed.
2018-06-26 10:15:51,967 bn2a_branch2b_beta is fixed.
2018-06-26 10:15:51,967 res2a_branch2c_weight is fixed.
2018-06-26 10:15:51,967 bn2a_branch2c_gamma is fixed.
2018-06-26 10:15:51,967 bn2a_branch2c_beta is fixed.
2018-06-26 10:15:51,968 res2b_branch2a_weight is fixed.
2018-06-26 10:15:51,968 bn2b_branch2a_gamma is fixed.
2018-06-26 10:15:51,968 bn2b_branch2a_beta is fixed.
2018-06-26 10:15:51,968 res2b_branch2b_weight is fixed.
2018-06-26 10:15:51,968 bn2b_branch2b_gamma is fixed.
2018-06-26 10:15:51,968 bn2b_branch2b_beta is fixed.
2018-06-26 10:15:51,968 res2b_branch2c_weight is fixed.
2018-06-26 10:15:51,968 bn2b_branch2c_gamma is fixed.
2018-06-26 10:15:51,969 bn2b_branch2c_beta is fixed.
2018-06-26 10:15:51,969 res2c_branch2a_weight is fixed.
2018-06-26 10:15:51,969 bn2c_branch2a_gamma is fixed.
2018-06-26 10:15:51,969 bn2c_branch2a_beta is fixed.
2018-06-26 10:15:51,969 res2c_branch2b_weight is fixed.
2018-06-26 10:15:51,969 bn2c_branch2b_gamma is fixed.
2018-06-26 10:15:51,969 bn2c_branch2b_beta is fixed.
2018-06-26 10:15:51,969 res2c_branch2c_weight is fixed.
2018-06-26 10:15:51,969 bn2c_branch2c_gamma is fixed.
2018-06-26 10:15:51,970 bn2c_branch2c_beta is fixed.
2018-06-26 10:15:51,970 bn3a_branch1_gamma is fixed.
2018-06-26 10:15:51,970 bn3a_branch1_beta is fixed.
2018-06-26 10:15:51,970 bn3a_branch2a_gamma is fixed.
2018-06-26 10:15:51,970 bn3a_branch2a_beta is fixed.
2018-06-26 10:15:51,970 bn3a_branch2b_gamma is fixed.
2018-06-26 10:15:51,970 bn3a_branch2b_beta is fixed.
2018-06-26 10:15:51,970 bn3a_branch2c_gamma is fixed.
2018-06-26 10:15:51,970 bn3a_branch2c_beta is fixed.
2018-06-26 10:15:51,970 bn3b1_branch2a_gamma is fixed.
2018-06-26 10:15:51,970 bn3b1_branch2a_beta is fixed.
2018-06-26 10:15:51,970 bn3b1_branch2b_gamma is fixed.
2018-06-26 10:15:51,970 bn3b1_branch2b_beta is fixed.
2018-06-26 10:15:51,970 bn3b1_branch2c_gamma is fixed.
2018-06-26 10:15:51,970 bn3b1_branch2c_beta is fixed.
2018-06-26 10:15:51,970 bn3b2_branch2a_gamma is fixed.
2018-06-26 10:15:51,970 bn3b2_branch2a_beta is fixed.
2018-06-26 10:15:51,970 bn3b2_branch2b_gamma is fixed.
2018-06-26 10:15:51,971 bn3b2_branch2b_beta is fixed.
2018-06-26 10:15:51,971 bn3b2_branch2c_gamma is fixed.
2018-06-26 10:15:51,971 bn3b2_branch2c_beta is fixed.
2018-06-26 10:15:51,971 bn3b3_branch2a_gamma is fixed.
2018-06-26 10:15:51,971 bn3b3_branch2a_beta is fixed.
2018-06-26 10:15:51,971 bn3b3_branch2b_gamma is fixed.
2018-06-26 10:15:51,971 bn3b3_branch2b_beta is fixed.
2018-06-26 10:15:51,971 bn3b3_branch2c_gamma is fixed.
2018-06-26 10:15:51,971 bn3b3_branch2c_beta is fixed.
2018-06-26 10:15:51,971 bn4a_branch1_gamma is fixed.
2018-06-26 10:15:51,971 bn4a_branch1_beta is fixed.
2018-06-26 10:15:51,971 bn4a_branch2a_gamma is fixed.
2018-06-26 10:15:51,971 bn4a_branch2a_beta is fixed.
2018-06-26 10:15:51,971 bn4a_branch2b_gamma is fixed.
2018-06-26 10:15:51,972 bn4a_branch2b_beta is fixed.
2018-06-26 10:15:51,972 bn4a_branch2c_gamma is fixed.
2018-06-26 10:15:51,972 bn4a_branch2c_beta is fixed.
2018-06-26 10:15:51,972 bn4b1_branch2a_gamma is fixed.
2018-06-26 10:15:51,972 bn4b1_branch2a_beta is fixed.
2018-06-26 10:15:51,972 bn4b1_branch2b_gamma is fixed.
2018-06-26 10:15:51,972 bn4b1_branch2b_beta is fixed.
2018-06-26 10:15:51,972 bn4b1_branch2c_gamma is fixed.
2018-06-26 10:15:51,972 bn4b1_branch2c_beta is fixed.
2018-06-26 10:15:51,972 bn4b2_branch2a_gamma is fixed.
2018-06-26 10:15:51,972 bn4b2_branch2a_beta is fixed.
2018-06-26 10:15:51,972 bn4b2_branch2b_gamma is fixed.
2018-06-26 10:15:51,972 bn4b2_branch2b_beta is fixed.
2018-06-26 10:15:51,972 bn4b2_branch2c_gamma is fixed.
2018-06-26 10:15:51,973 bn4b2_branch2c_beta is fixed.
2018-06-26 10:15:51,973 bn4b3_branch2a_gamma is fixed.
2018-06-26 10:15:51,973 bn4b3_branch2a_beta is fixed.
2018-06-26 10:15:51,973 bn4b3_branch2b_gamma is fixed.
2018-06-26 10:15:51,973 bn4b3_branch2b_beta is fixed.
2018-06-26 10:15:51,973 bn4b3_branch2c_gamma is fixed.
2018-06-26 10:15:51,973 bn4b3_branch2c_beta is fixed.
2018-06-26 10:15:51,973 bn4b4_branch2a_gamma is fixed.
2018-06-26 10:15:51,973 bn4b4_branch2a_beta is fixed.
2018-06-26 10:15:51,973 bn4b4_branch2b_gamma is fixed.
2018-06-26 10:15:51,973 bn4b4_branch2b_beta is fixed.
2018-06-26 10:15:51,973 bn4b4_branch2c_gamma is fixed.
2018-06-26 10:15:51,975 bn4b4_branch2c_beta is fixed.
2018-06-26 10:15:51,976 bn4b5_branch2a_gamma is fixed.
2018-06-26 10:15:51,976 bn4b5_branch2a_beta is fixed.
2018-06-26 10:15:51,976 bn4b5_branch2b_gamma is fixed.
2018-06-26 10:15:51,976 bn4b5_branch2b_beta is fixed.
2018-06-26 10:15:51,976 bn4b5_branch2c_gamma is fixed.
2018-06-26 10:15:51,978 bn4b5_branch2c_beta is fixed.
2018-06-26 10:15:51,979 bn4b6_branch2a_gamma is fixed.
2018-06-26 10:15:51,979 bn4b6_branch2a_beta is fixed.
2018-06-26 10:15:51,979 bn4b6_branch2b_gamma is fixed.
2018-06-26 10:15:51,979 bn4b6_branch2b_beta is fixed.
2018-06-26 10:15:51,979 bn4b6_branch2c_gamma is fixed.
2018-06-26 10:15:51,979 bn4b6_branch2c_beta is fixed.
2018-06-26 10:15:51,979 bn4b7_branch2a_gamma is fixed.
2018-06-26 10:15:51,980 bn4b7_branch2a_beta is fixed.
2018-06-26 10:15:51,980 bn4b7_branch2b_gamma is fixed.
2018-06-26 10:15:51,980 bn4b7_branch2b_beta is fixed.
2018-06-26 10:15:51,980 bn4b7_branch2c_gamma is fixed.
2018-06-26 10:15:51,980 bn4b7_branch2c_beta is fixed.
2018-06-26 10:15:51,980 bn4b8_branch2a_gamma is fixed.
2018-06-26 10:15:51,981 bn4b8_branch2a_beta is fixed.
2018-06-26 10:15:51,981 bn4b8_branch2b_gamma is fixed.
2018-06-26 10:15:51,981 bn4b8_branch2b_beta is fixed.
2018-06-26 10:15:51,981 bn4b8_branch2c_gamma is fixed.
2018-06-26 10:15:51,981 bn4b8_branch2c_beta is fixed.
2018-06-26 10:15:51,981 bn4b9_branch2a_gamma is fixed.
2018-06-26 10:15:51,981 bn4b9_branch2a_beta is fixed.
2018-06-26 10:15:51,982 bn4b9_branch2b_gamma is fixed.
2018-06-26 10:15:51,982 bn4b9_branch2b_beta is fixed.
2018-06-26 10:15:51,982 bn4b9_branch2c_gamma is fixed.
2018-06-26 10:15:51,982 bn4b9_branch2c_beta is fixed.
2018-06-26 10:15:51,982 bn4b10_branch2a_gamma is fixed.
2018-06-26 10:15:51,982 bn4b10_branch2a_beta is fixed.
2018-06-26 10:15:51,982 bn4b10_branch2b_gamma is fixed.
2018-06-26 10:15:51,982 bn4b10_branch2b_beta is fixed.
2018-06-26 10:15:51,983 bn4b10_branch2c_gamma is fixed.
2018-06-26 10:15:51,983 bn4b10_branch2c_beta is fixed.
2018-06-26 10:15:51,983 bn4b11_branch2a_gamma is fixed.
2018-06-26 10:15:51,983 bn4b11_branch2a_beta is fixed.
2018-06-26 10:15:51,983 bn4b11_branch2b_gamma is fixed.
2018-06-26 10:15:51,983 bn4b11_branch2b_beta is fixed.
2018-06-26 10:15:51,983 bn4b11_branch2c_gamma is fixed.
2018-06-26 10:15:51,983 bn4b11_branch2c_beta is fixed.
2018-06-26 10:15:51,983 bn4b12_branch2a_gamma is fixed.
2018-06-26 10:15:51,984 bn4b12_branch2a_beta is fixed.
2018-06-26 10:15:51,984 bn4b12_branch2b_gamma is fixed.
2018-06-26 10:15:51,984 bn4b12_branch2b_beta is fixed.
2018-06-26 10:15:51,984 bn4b12_branch2c_gamma is fixed.
2018-06-26 10:15:51,984 bn4b12_branch2c_beta is fixed.
2018-06-26 10:15:51,986 bn4b13_branch2a_gamma is fixed.
2018-06-26 10:15:51,987 bn4b13_branch2a_beta is fixed.
2018-06-26 10:15:51,987 bn4b13_branch2b_gamma is fixed.
2018-06-26 10:15:51,987 bn4b13_branch2b_beta is fixed.
2018-06-26 10:15:51,987 bn4b13_branch2c_gamma is fixed.
2018-06-26 10:15:51,987 bn4b13_branch2c_beta is fixed.
2018-06-26 10:15:51,987 bn4b14_branch2a_gamma is fixed.
2018-06-26 10:15:51,988 bn4b14_branch2a_beta is fixed.
2018-06-26 10:15:51,988 bn4b14_branch2b_gamma is fixed.
2018-06-26 10:15:51,988 bn4b14_branch2b_beta is fixed.
2018-06-26 10:15:51,988 bn4b14_branch2c_gamma is fixed.
2018-06-26 10:15:51,988 bn4b14_branch2c_beta is fixed.
2018-06-26 10:15:51,988 bn4b15_branch2a_gamma is fixed.
2018-06-26 10:15:51,988 bn4b15_branch2a_beta is fixed.
2018-06-26 10:15:51,988 bn4b15_branch2b_gamma is fixed.
2018-06-26 10:15:51,989 bn4b15_branch2b_beta is fixed.
2018-06-26 10:15:51,989 bn4b15_branch2c_gamma is fixed.
2018-06-26 10:15:51,989 bn4b15_branch2c_beta is fixed.
2018-06-26 10:15:51,989 bn4b16_branch2a_gamma is fixed.
2018-06-26 10:15:51,989 bn4b16_branch2a_beta is fixed.
2018-06-26 10:15:51,997 bn4b16_branch2b_gamma is fixed.
2018-06-26 10:15:51,998 bn4b16_branch2b_beta is fixed.
2018-06-26 10:15:51,998 bn4b16_branch2c_gamma is fixed.
2018-06-26 10:15:51,998 bn4b16_branch2c_beta is fixed.
2018-06-26 10:15:51,998 bn4b17_branch2a_gamma is fixed.
2018-06-26 10:15:51,998 bn4b17_branch2a_beta is fixed.
2018-06-26 10:15:51,998 bn4b17_branch2b_gamma is fixed.
2018-06-26 10:15:51,998 bn4b17_branch2b_beta is fixed.
2018-06-26 10:15:51,999 bn4b17_branch2c_gamma is fixed.
2018-06-26 10:15:51,999 bn4b17_branch2c_beta is fixed.
2018-06-26 10:15:51,999 bn4b18_branch2a_gamma is fixed.
2018-06-26 10:15:51,999 bn4b18_branch2a_beta is fixed.
2018-06-26 10:15:51,999 bn4b18_branch2b_gamma is fixed.
2018-06-26 10:15:51,999 bn4b18_branch2b_beta is fixed.
2018-06-26 10:15:51,999 bn4b18_branch2c_gamma is fixed.
2018-06-26 10:15:51,999 bn4b18_branch2c_beta is fixed.
2018-06-26 10:15:51,999 bn4b19_branch2a_gamma is fixed.
2018-06-26 10:15:51,999 bn4b19_branch2a_beta is fixed.
2018-06-26 10:15:51,999 bn4b19_branch2b_gamma is fixed.
2018-06-26 10:15:51,999 bn4b19_branch2b_beta is fixed.
2018-06-26 10:15:51,999 bn4b19_branch2c_gamma is fixed.
2018-06-26 10:15:51,999 bn4b19_branch2c_beta is fixed.
2018-06-26 10:15:51,999 bn4b20_branch2a_gamma is fixed.
2018-06-26 10:15:51,999 bn4b20_branch2a_beta is fixed.
2018-06-26 10:15:51,999 bn4b20_branch2b_gamma is fixed.
2018-06-26 10:15:51,999 bn4b20_branch2b_beta is fixed.
2018-06-26 10:15:51,999 bn4b20_branch2c_gamma is fixed.
2018-06-26 10:15:51,999 bn4b20_branch2c_beta is fixed.
2018-06-26 10:15:52,000 bn4b21_branch2a_gamma is fixed.
2018-06-26 10:15:52,000 bn4b21_branch2a_beta is fixed.
2018-06-26 10:15:52,000 bn4b21_branch2b_gamma is fixed.
2018-06-26 10:15:52,000 bn4b21_branch2b_beta is fixed.
2018-06-26 10:15:52,000 bn4b21_branch2c_gamma is fixed.
2018-06-26 10:15:52,000 bn4b21_branch2c_beta is fixed.
2018-06-26 10:15:52,000 bn4b22_branch2a_gamma is fixed.
2018-06-26 10:15:52,000 bn4b22_branch2a_beta is fixed.
2018-06-26 10:15:52,000 bn4b22_branch2b_gamma is fixed.
2018-06-26 10:15:52,000 bn4b22_branch2b_beta is fixed.
2018-06-26 10:15:52,000 bn4b22_branch2c_gamma is fixed.
2018-06-26 10:15:52,000 bn4b22_branch2c_beta is fixed.
2018-06-26 10:15:52,000 bn5a_branch1_gamma is fixed.
2018-06-26 10:15:52,000 bn5a_branch1_beta is fixed.
2018-06-26 10:15:52,000 bn5a_branch2a_gamma is fixed.
2018-06-26 10:15:52,000 bn5a_branch2a_beta is fixed.
2018-06-26 10:15:52,000 bn5a_branch2b_gamma is fixed.
2018-06-26 10:15:52,001 bn5a_branch2b_beta is fixed.
2018-06-26 10:15:52,001 bn5a_branch2c_gamma is fixed.
2018-06-26 10:15:52,001 bn5a_branch2c_beta is fixed.
2018-06-26 10:15:52,001 bn5b_branch2a_gamma is fixed.
2018-06-26 10:15:52,001 bn5b_branch2a_beta is fixed.
2018-06-26 10:15:52,001 bn5b_branch2b_gamma is fixed.
2018-06-26 10:15:52,001 bn5b_branch2b_beta is fixed.
2018-06-26 10:15:52,001 bn5b_branch2c_gamma is fixed.
2018-06-26 10:15:52,001 bn5b_branch2c_beta is fixed.
2018-06-26 10:15:52,002 bn5c_branch2a_gamma is fixed.
2018-06-26 10:15:52,002 bn5c_branch2a_beta is fixed.
2018-06-26 10:15:52,002 bn5c_branch2b_gamma is fixed.
2018-06-26 10:15:52,002 bn5c_branch2b_beta is fixed.
2018-06-26 10:15:52,002 bn5c_branch2c_gamma is fixed.
2018-06-26 10:15:52,002 bn5c_branch2c_beta is fixed.
2018-06-26 10:15:52,002 data is not fixed.
2018-06-26 10:15:52,003 res3a_branch1_weight is not fixed.
2018-06-26 10:15:52,003 res3a_branch2a_weight is not fixed.
2018-06-26 10:15:52,003 res3a_branch2b_weight is not fixed.
2018-06-26 10:15:52,003 res3a_branch2c_weight is not fixed.
2018-06-26 10:15:52,003 res3b1_branch2a_weight is not fixed.
2018-06-26 10:15:52,003 res3b1_branch2b_weight is not fixed.
2018-06-26 10:15:52,003 res3b1_branch2c_weight is not fixed.
2018-06-26 10:15:52,003 res3b2_branch2a_weight is not fixed.
2018-06-26 10:15:52,004 res3b2_branch2b_weight is not fixed.
2018-06-26 10:15:52,004 res3b2_branch2c_weight is not fixed.
2018-06-26 10:15:52,004 res3b3_branch2a_weight is not fixed.
2018-06-26 10:15:52,004 res3b3_branch2b_weight is not fixed.
2018-06-26 10:15:52,004 res3b3_branch2c_weight is not fixed.
2018-06-26 10:15:52,004 res4a_branch1_weight is not fixed.
2018-06-26 10:15:52,004 res4a_branch2a_weight is not fixed.
2018-06-26 10:15:52,004 res4a_branch2b_weight is not fixed.
2018-06-26 10:15:52,005 res4a_branch2c_weight is not fixed.
2018-06-26 10:15:52,005 res4b1_branch2a_weight is not fixed.
2018-06-26 10:15:52,005 res4b1_branch2b_weight is not fixed.
2018-06-26 10:15:52,005 res4b1_branch2c_weight is not fixed.
2018-06-26 10:15:52,005 res4b2_branch2a_weight is not fixed.
2018-06-26 10:15:52,005 res4b2_branch2b_weight is not fixed.
2018-06-26 10:15:52,005 res4b2_branch2c_weight is not fixed.
2018-06-26 10:15:52,005 res4b3_branch2a_weight is not fixed.
2018-06-26 10:15:52,005 res4b3_branch2b_weight is not fixed.
2018-06-26 10:15:52,006 res4b3_branch2c_weight is not fixed.
2018-06-26 10:15:52,006 res4b4_branch2a_weight is not fixed.
2018-06-26 10:15:52,006 res4b4_branch2b_weight is not fixed.
2018-06-26 10:15:52,006 res4b4_branch2c_weight is not fixed.
2018-06-26 10:15:52,006 res4b5_branch2a_weight is not fixed.
2018-06-26 10:15:52,006 res4b5_branch2b_weight is not fixed.
2018-06-26 10:15:52,006 res4b5_branch2c_weight is not fixed.
2018-06-26 10:15:52,006 res4b6_branch2a_weight is not fixed.
2018-06-26 10:15:52,006 res4b6_branch2b_weight is not fixed.
2018-06-26 10:15:52,007 res4b6_branch2c_weight is not fixed.
2018-06-26 10:15:52,007 res4b7_branch2a_weight is not fixed.
2018-06-26 10:15:52,007 res4b7_branch2b_weight is not fixed.
2018-06-26 10:15:52,007 res4b7_branch2c_weight is not fixed.
2018-06-26 10:15:52,007 res4b8_branch2a_weight is not fixed.
2018-06-26 10:15:52,007 res4b8_branch2b_weight is not fixed.
2018-06-26 10:15:52,007 res4b8_branch2c_weight is not fixed.
2018-06-26 10:15:52,007 res4b9_branch2a_weight is not fixed.
2018-06-26 10:15:52,007 res4b9_branch2b_weight is not fixed.
2018-06-26 10:15:52,007 res4b9_branch2c_weight is not fixed.
2018-06-26 10:15:52,007 res4b10_branch2a_weight is not fixed.
2018-06-26 10:15:52,007 res4b10_branch2b_weight is not fixed.
2018-06-26 10:15:52,007 res4b10_branch2c_weight is not fixed.
2018-06-26 10:15:52,007 res4b11_branch2a_weight is not fixed.
2018-06-26 10:15:52,007 res4b11_branch2b_weight is not fixed.
2018-06-26 10:15:52,007 res4b11_branch2c_weight is not fixed.
2018-06-26 10:15:52,008 res4b12_branch2a_weight is not fixed.
2018-06-26 10:15:52,008 res4b12_branch2b_weight is not fixed.
2018-06-26 10:15:52,008 res4b12_branch2c_weight is not fixed.
2018-06-26 10:15:52,008 res4b13_branch2a_weight is not fixed.
2018-06-26 10:15:52,008 res4b13_branch2b_weight is not fixed.
2018-06-26 10:15:52,008 res4b13_branch2c_weight is not fixed.
2018-06-26 10:15:52,008 res4b14_branch2a_weight is not fixed.
2018-06-26 10:15:52,008 res4b14_branch2b_weight is not fixed.
2018-06-26 10:15:52,012 res4b14_branch2c_weight is not fixed.
2018-06-26 10:15:52,012 res4b15_branch2a_weight is not fixed.
2018-06-26 10:15:52,012 res4b15_branch2b_weight is not fixed.
2018-06-26 10:15:52,012 res4b15_branch2c_weight is not fixed.
2018-06-26 10:15:52,012 res4b16_branch2a_weight is not fixed.
2018-06-26 10:15:52,012 res4b16_branch2b_weight is not fixed.
2018-06-26 10:15:52,012 res4b16_branch2c_weight is not fixed.
2018-06-26 10:15:52,013 res4b17_branch2a_weight is not fixed.
2018-06-26 10:15:52,013 res4b17_branch2b_weight is not fixed.
2018-06-26 10:15:52,013 res4b17_branch2c_weight is not fixed.
2018-06-26 10:15:52,013 res4b18_branch2a_weight is not fixed.
2018-06-26 10:15:52,013 res4b18_branch2b_weight is not fixed.
2018-06-26 10:15:52,013 res4b18_branch2c_weight is not fixed.
2018-06-26 10:15:52,013 res4b19_branch2a_weight is not fixed.
2018-06-26 10:15:52,013 res4b19_branch2b_weight is not fixed.
2018-06-26 10:15:52,013 res4b19_branch2c_weight is not fixed.
2018-06-26 10:15:52,013 res4b20_branch2a_weight is not fixed.
2018-06-26 10:15:52,013 res4b20_branch2b_weight is not fixed.
2018-06-26 10:15:52,013 res4b20_branch2c_weight is not fixed.
2018-06-26 10:15:52,013 res4b21_branch2a_weight is not fixed.
2018-06-26 10:15:52,013 res4b21_branch2b_weight is not fixed.
2018-06-26 10:15:52,013 res4b21_branch2c_weight is not fixed.
2018-06-26 10:15:52,013 res4b22_branch2a_weight is not fixed.
2018-06-26 10:15:52,014 res4b22_branch2b_weight is not fixed.
2018-06-26 10:15:52,014 res4b22_branch2c_weight is not fixed.
2018-06-26 10:15:52,014 rpn_conv_3x3_weight is not fixed.
2018-06-26 10:15:52,014 rpn_conv_3x3_bias is not fixed.
2018-06-26 10:15:52,014 rpn_cls_score_weight is not fixed.
2018-06-26 10:15:52,014 rpn_cls_score_bias is not fixed.
2018-06-26 10:15:52,014 label is not fixed.
2018-06-26 10:15:52,014 bbox_weight is not fixed.
2018-06-26 10:15:52,014 rpn_bbox_pred_weight is not fixed.
2018-06-26 10:15:52,014 rpn_bbox_pred_bias is not fixed.
2018-06-26 10:15:52,014 bbox_target is not fixed.
2018-06-26 10:15:52,014 res5a_branch1_weight is not fixed.
2018-06-26 10:15:52,014 res5a_branch2a_weight is not fixed.
2018-06-26 10:15:52,014 res5a_branch2b_weight is not fixed.
2018-06-26 10:15:52,014 res5a_branch2c_weight is not fixed.
2018-06-26 10:15:52,014 res5b_branch2a_weight is not fixed.
2018-06-26 10:15:52,014 res5b_branch2b_weight is not fixed.
2018-06-26 10:15:52,014 res5b_branch2c_weight is not fixed.
2018-06-26 10:15:52,014 res5c_branch2a_weight is not fixed.
2018-06-26 10:15:52,015 res5c_branch2b_weight is not fixed.
2018-06-26 10:15:52,015 res5c_branch2c_weight is not fixed.
2018-06-26 10:15:52,015 conv_new_1_weight is not fixed.
2018-06-26 10:15:52,015 conv_new_1_bias is not fixed.
2018-06-26 10:15:52,015 im_info is not fixed.
2018-06-26 10:15:52,015 gt_boxes is not fixed.
2018-06-26 10:15:52,015 fc_new_1_weight is not fixed.
2018-06-26 10:15:52,015 fc_new_1_bias is not fixed.
2018-06-26 10:15:52,015 pair_pos_fc1_1_weight is not fixed.
2018-06-26 10:15:52,015 pair_pos_fc1_1_bias is not fixed.
2018-06-26 10:15:52,015 query_1_weight is not fixed.
2018-06-26 10:15:52,015 query_1_bias is not fixed.
2018-06-26 10:15:52,015 key_1_weight is not fixed.
2018-06-26 10:15:52,015 key_1_bias is not fixed.
2018-06-26 10:15:52,015 linear_out_1_weight is not fixed.
2018-06-26 10:15:52,015 linear_out_1_bias is not fixed.
2018-06-26 10:15:52,015 fc_new_2_weight is not fixed.
2018-06-26 10:15:52,015 fc_new_2_bias is not fixed.
2018-06-26 10:15:52,016 pair_pos_fc1_2_weight is not fixed.
2018-06-26 10:15:52,016 pair_pos_fc1_2_bias is not fixed.
2018-06-26 10:15:52,016 query_2_weight is not fixed.
2018-06-26 10:15:52,016 query_2_bias is not fixed.
2018-06-26 10:15:52,016 key_2_weight is not fixed.
2018-06-26 10:15:52,016 key_2_bias is not fixed.
2018-06-26 10:15:52,016 linear_out_2_weight is not fixed.
2018-06-26 10:15:52,016 linear_out_2_bias is not fixed.
2018-06-26 10:15:52,016 cls_score_weight is not fixed.
2018-06-26 10:15:52,016 cls_score_bias is not fixed.
2018-06-26 10:15:52,016 bbox_pred_weight is not fixed.
2018-06-26 10:15:52,016 bbox_pred_bias is not fixed.
2018-06-26 10:15:52,016 roi_feat_embedding_weight is not fixed.
2018-06-26 10:15:52,016 roi_feat_embedding_bias is not fixed.
2018-06-26 10:15:52,020 nms_rank_weight is not fixed.
2018-06-26 10:15:52,020 nms_rank_bias is not fixed.
2018-06-26 10:15:52,020 nms_pair_pos_fc1_1_weight is not fixed.
2018-06-26 10:15:52,020 nms_pair_pos_fc1_1_bias is not fixed.
2018-06-26 10:15:52,020 nms_query_1_weight is not fixed.
2018-06-26 10:15:52,020 nms_query_1_bias is not fixed.
2018-06-26 10:15:52,020 nms_key_1_weight is not fixed.
2018-06-26 10:15:52,021 nms_key_1_bias is not fixed.
2018-06-26 10:15:52,021 nms_linear_out_1_weight is not fixed.
2018-06-26 10:15:52,021 nms_linear_out_1_bias is not fixed.
2018-06-26 10:15:52,021 nms_logit_weight is not fixed.
2018-06-26 10:15:52,021 nms_logit_bias is not fixed.
When I train the model on my own dataset by running python experiments/relation_rcnn/rcnn_end2end_train_test.py --cfg experiments/relation_rcnn/cfgs/resnet_v1_101_coco_trainvalminus_rcnn_end2end_relation_learn_nms_8epoch.yaml --ignore_cache, I hit the problems shown above. Can you tell me the reason? Thanks in advance.
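The traceback in this issue dies in box_annotator_ohem.py on the fancy-indexing line per_roi_loss_cls[np.arange(n), labels], with index 189 out of bounds for a class axis of size 189. A minimal NumPy sketch of that indexing shows why any label equal to NUM_CLASSES raises exactly this IndexError (the reading that NUM_CLASSES must count the background class is my assumption, not something the repo docs state):

```python
import numpy as np

def ohem_per_roi_loss(cls_prob, labels):
    """Pick each RoI's log-loss for its assigned class, as the OHEM
    operator does via per_roi_loss_cls[np.arange(n), labels].

    cls_prob: (num_rois, num_classes) softmax outputs.
    labels:   (num_rois,) class ids, 0 = background.

    NOTE (assumption): label ids must satisfy label < num_classes, so
    NUM_CLASSES in the YAML has to be foreground classes + 1 (background).
    """
    n, num_classes = cls_prob.shape
    labels = labels.astype(int)
    if labels.max() >= num_classes:
        # This is the situation behind "index 189 is out of bounds for
        # axis 1 with size 189" in the traceback above.
        raise IndexError(
            "label %d out of bounds for %d classes; NUM_CLASSES must "
            "count the background class" % (labels.max(), num_classes))
    return -np.log(cls_prob[np.arange(n), labels] + 1e-14)
```

If the converted dataset actually contains 189 foreground categories, NUM_CLASSES would then need to be 190, since label ids run from 0 (background) through the number of foreground classes.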
Has anyone run the code? Where is the relation module code?
How did you draw the figures [Object Pairs with High Relation Weights] and [Class Co-Occurrence Information is Learnt] in your CVPR 2018 slides?
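For what it's worth, the quantities visualized in those slides are the softmax attention weights between object pairs. A toy NumPy sketch of that computation (random projections stand in for the learned query/key weights, and the geometry term is omitted, so this only illustrates the data flow, not the trained model):

```python
import numpy as np

def relation_weights(appearance, dim_k=64, seed=0):
    """Compute pairwise attention weights over object proposals,
    in the spirit of the relation module's scaled dot-product attention.

    appearance: (n, d) per-object appearance features.
    Returns w of shape (n, n); w[i, j] is how much object j contributes
    to object i. Rows sum to 1 (softmax over the other objects).
    """
    rng = np.random.RandomState(seed)
    n, d = appearance.shape
    wq = rng.randn(d, dim_k) / np.sqrt(d)   # stand-in for query weights
    wk = rng.randn(d, dim_k) / np.sqrt(d)   # stand-in for key weights
    q = np.dot(appearance, wq)
    k = np.dot(appearance, wk)
    logits = np.dot(q, k.T) / np.sqrt(dim_k)
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    w = np.exp(logits)
    w /= w.sum(axis=1, keepdims=True)
    return w
```

Plotting w as a heatmap (or drawing the top-weighted pairs on the image) would reproduce the style of those figures.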
When trying to run this network on the VOC dataset, I used the VOC dataset I/O functions from Deformable ConvNets (https://github.com/msracver/Deformable-ConvNets),
but I always hit a problem in sym_instance.infer_shape() at ./relation_rcnn/train_end2end.py line 99. I traced it to "my_mxnet_root/symbol/symbol.py", line 1119, in _infer_shape_impl (ctypes.byref(complete)), where check_call reports an error in operator slice_axis1: Check failed: (*end <= axis_size) && (*end >= 0): invalid begin, end, get begin=0, end=300.
If anyone knows how to solve this, or has run this algorithm on the VOC dataset, please help.
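MXNet's slice_axis requires 0 <= begin <= end <= axis_size, so end=300 failing means the sliced axis has fewer than 300 entries at shape-inference time. A NumPy sketch of that check (the guess that a TOP_N/first_n-style config value exceeds the number of proposals available on VOC is an assumption on my part):

```python
import numpy as np

def checked_slice_axis(arr, axis, begin, end):
    """Mimic MXNet slice_axis's shape check before slicing.

    Raises ValueError with the same condition that infer_shape reports
    above: (end <= axis_size) && (end >= 0). On VOC this typically
    fires when a configured top-N (e.g. 300 proposals for the learn-NMS
    head) is larger than what the upstream layer actually produces;
    which config key is responsible is an assumption, not confirmed.
    """
    axis_size = arr.shape[axis]
    if not (0 <= begin <= end <= axis_size):
        raise ValueError(
            "invalid begin, end: begin=%d, end=%d, axis_size=%d"
            % (begin, end, axis_size))
    idx = [slice(None)] * arr.ndim
    idx[axis] = slice(begin, end)
    return arr[tuple(idx)]
```

Checking the RPN post-NMS proposal count and the first_n setting in the YAML against each other would be the first thing to try.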