RT-DETR PyTorch - ShapeInferenceError: Cannot export with --simplify for earlier versions of DeepStream (6.0) [deepstream-yolo, OPEN, 16 comments]

pullmyleg commented on June 11, 2024
RT-DETR PyTorch - ShapeInferenceError: Cannot export with --simplify for earlier versions of deepstream 6.0


Comments (16)

IronmanVsThanos commented on June 11, 2024

The author of RT-DETR told me that we need to upgrade TensorRT to >= 8.5.1 to support some of the operators in RT-DETR.
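For what it's worth, a quick way to confirm the TensorRT version a given environment ships (my own check, not something from the thread) is via the Python bindings:

```
# Assumption: the Python TensorRT bindings are installed on the device.
# GridSample needs TensorRT >= 8.5.1 according to the comment above.
import tensorrt as trt

print(trt.__version__)
```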


pullmyleg commented on June 11, 2024

Thanks for sharing @IronmanVsThanos!

I think we will need to use a later version of DeepStream. x86 uses a later version of CUDA (11.4), which is why it is working, and Jetson with CUDA 10.2 won't support this.
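A minimal sketch (my own, assuming PyTorch is installed on both machines) to confirm which CUDA build each device actually sees:

```
# On an x86 box with CUDA 11.4 and on a Jetson with JetPack 4.x / CUDA 10.2
# this prints different versions, matching the behaviour described above.
import torch

print("CUDA build:", torch.version.cuda)
print("CUDA available:", torch.cuda.is_available())
```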


IronmanVsThanos commented on June 11, 2024

Hi @IronmanVsThanos, post-export the model still runs successfully on DeepStream x86, but will not run on a Jetson device.

Will post more details later. But the exported model still works despite the error.

Thank you for your reply. When I use the command "python3 export_rtdetr_pytorch.py -w output/rtdetr_r18vd_6x_coco/checkpoint0013.pth -c configs/rtdetr/rtdetr_r18vd_6x_coco.yml --dynamic --simplify", I can create the ONNX model successfully, but it still does not work on DeepStream 6.0.1, and the following error occurs:

WARNING: [TRT]: onnx2trt_utils.cpp:366: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
WARNING: [TRT]: onnx2trt_utils.cpp:392: One or more weights outside the range of INT32 was clamped
ERROR: [TRT]: ModelImporter.cpp:773: While parsing node number 444 [GridSample -> "/0/decoder/decoder/layers.0/cross_attn/GridSample_output_0"]:
ERROR: [TRT]: ModelImporter.cpp:774: --- Begin node ---
ERROR: [TRT]: ModelImporter.cpp:775: input: "/0/decoder/decoder/layers.0/cross_attn/Reshape_5_output_0"
input: "/0/decoder/decoder/layers.0/cross_attn/Reshape_6_output_0"
output: "/0/decoder/decoder/layers.0/cross_attn/GridSample_output_0"
name: "/0/decoder/decoder/layers.0/cross_attn/GridSample"
op_type: "GridSample"
attribute {
name: "align_corners"
i: 0
type: INT
}
attribute {
name: "mode"
s: "bilinear"
type: STRING
}
attribute {
name: "padding_mode"
s: "zeros"
type: STRING
}
ERROR: [TRT]: ModelImporter.cpp:776: --- End node ---
ERROR: [TRT]: ModelImporter.cpp:779: ERROR: builtin_op_importers.cpp:4870 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
ERROR: Failed to parse onnx file
ERROR: failed to build network since parsing model errors.
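One way to confirm what TensorRT is choking on (a hedged sketch using the onnx Python package; the file name is taken from the export log above) is to inspect the exported model for GridSample nodes and its opset. GridSample is an opset-16 operator that TensorRT only parses natively from 8.5.x onward, which is consistent with the "Plugin not found" assertion on DeepStream 6.0.1.

```
import onnx

# Load the exported model and report its opset and GridSample usage.
model = onnx.load("checkpoint0013.onnx")
print("opset imports:", [(op.domain or "ai.onnx", op.version) for op in model.opset_import])
print("GridSample nodes:",
      sum(1 for node in model.graph.node if node.op_type == "GridSample"))
```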


IronmanVsThanos commented on June 11, 2024

Hi, thanks for all the work you have done on the repo @marcoslucianops. I noticed an issue when exporting the RT-DETR PyTorch model with the --simplify flag for an earlier version of DeepStream (6.0.1).

Specifically, the error [ShapeInferenceError] Inferred shape and existing shape differ in rank: (3) vs (0) indicates a mismatch in the expected dimensions of the tensors involved in the multiplication ("Mul") operation.

It looks like the same problem as here: onnx/onnx#3565, which also successfully generates the .onnx file.

Questions:

  1. Can the model support the simplify flag for older versions of DeepStream (e.g. 6.0.1)?
  2. Can you suggest a fix @marcoslucianops?

Error:

$ python3 export_rtdetr_pytorch.py -w output/rtdetr_r18vd_6x_coco/checkpoint0013.pth -c configs/rtdetr/rtdetr_r18vd_6x_coco.yml --simplify -s 640

Starting: output/rtdetr_r18vd_6x_coco/checkpoint0013.pth
Opening RT-DETR PyTorch model

Load PResNet18 state_dict

Exporting the model to ONNX
============= Diagnostic Run torch.onnx.export version 2.0.1+cu117 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

Simplifying the ONNX model
Traceback (most recent call last):
  File "export_rtdetr_pytorch.py", line 110, in <module>
    sys.exit(main(args))
  File "export_rtdetr_pytorch.py", line 83, in main
    model_onnx, _ = onnxsim.simplify(model_onnx)
  File "/home/inviol/.virtualenvs/RTDETR_pytorch/lib/python3.8/site-packages/onnxsim/onnx_simplifier.py", line 199, in simplify
    model_opt_bytes = C.simplify(
onnx.onnx_cpp2py_export.shape_inference.InferenceError: [ShapeInferenceError] (op_type:Mul, node name: /1/Mul): [ShapeInferenceError] Inferred shape and existing shape differ in rank: (3) vs (0) 

Exporting with the --dynamic flag (for later versions) works with no problems:

$ python3 export_rtdetr_pytorch.py -w output/rtdetr_r18vd_6x_coco/checkpoint0013.pth -c configs/rtdetr/rtdetr_r18vd_6x_coco.yml --dynamic

Starting: output/rtdetr_r18vd_6x_coco/checkpoint0013.pth
Opening RT-DETR PyTorch model

Load PResNet18 state_dict

Exporting the model to ONNX
============= Diagnostic Run torch.onnx.export version 2.0.1+cu117 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

Done: checkpoint0013.onnx

Hello, I have the same question as you. Have you solved this problem?


pullmyleg commented on June 11, 2024

Hi @IronmanVsThanos, post-export the model still converts and runs on DeepStream x86 successfully, but will not run on a Jetson unit.

Will post more detail on this later. But the exported model still works despite the error.


pullmyleg commented on June 11, 2024

Your export command looks incorrect. Combining --dynamic (for later versions of DeepStream) with --simplify (for older versions) doesn't look right to me. Try:

python3 export_rtdetr_pytorch.py -w output/rtdetr_r18vd_6x_coco/checkpoint0013.pth -c configs/rtdetr/rtdetr_r18vd_6x_coco.yml --simplify -s 640


IronmanVsThanos commented on June 11, 2024

Thank you for your reply! As you suggested, I used the command "python3 export_rtdetr_pytorch.py -w output/rtdetr_r18vd_6x_coco/checkpoint0013.pth -c configs/rtdetr/rtdetr_r18vd_6x_coco.yml --simplify -s 640" to export ONNX on the server that trains RT-DETR, but I get the following error:

Opening RT-DETR PyTorch model

Load PResNet18 state_dict

Exporting the model to ONNX
============= Diagnostic Run torch.onnx.export version 2.0.1+cu117 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================

Simplifying the ONNX model
Traceback (most recent call last):
  File "export_rtdetr_pytorch.py", line 110, in <module>
    sys.exit(main(args))
  File "export_rtdetr_pytorch.py", line 83, in main
    model_onnx, _ = onnxsim.simplify(model_onnx)
  File "/home/inviol/.virtualenvs/RTDETR_pytorch/lib/python3.8/site-packages/onnxsim/onnx_simplifier.py", line 199, in simplify
    model_opt_bytes = C.simplify(
onnx.onnx_cpp2py_export.shape_inference.InferenceError: [ShapeInferenceError] (op_type:Mul, node name: /1/Mul): [ShapeInferenceError] Inferred shape and existing shape differ in rank: (3) vs (0)
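Since the thread notes that the un-simplified export still converts and runs on x86, one pragmatic workaround is to treat the simplifier as best-effort. This is my own sketch of such a fallback, not the repo's actual export code; paths and names are illustrative:

```
import onnx
import onnxsim

# Hypothetical fallback around the onnxsim.simplify() call from the traceback:
# keep the un-simplified graph when shape inference fails.
model_onnx = onnx.load("checkpoint0013.onnx")
try:
    simplified, check = onnxsim.simplify(model_onnx)
    if check:
        model_onnx = simplified
    else:
        print("onnx-simplifier could not validate the result; keeping the original graph")
except Exception as exc:  # e.g. the ShapeInferenceError shown above
    print(f"Simplification skipped: {exc}")
onnx.save(model_onnx, "checkpoint0013.sim.onnx")
```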


pullmyleg commented on June 11, 2024

Yes, I get the same error, but the model still converts and runs in DeepStream (x86 only). It does not work on a Jetson unit.


IronmanVsThanos commented on June 11, 2024

Do you have the same environment for x86 and deepstream?


IronmanVsThanos commented on June 11, 2024

When you use the ONNX model on the Jetson unit, is the error like the following?
WARNING: [TRT]: onnx2trt_utils.cpp:366: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
WARNING: [TRT]: onnx2trt_utils.cpp:392: One or more weights outside the range of INT32 was clamped
ERROR: [TRT]: ModelImporter.cpp:773: While parsing node number 444 [GridSample -> "/0/decoder/decoder/layers.0/cross_attn/GridSample_output_0"]:
ERROR: [TRT]: ModelImporter.cpp:774: --- Begin node ---
ERROR: [TRT]: ModelImporter.cpp:775: input: "/0/decoder/decoder/layers.0/cross_attn/Reshape_5_output_0"
input: "/0/decoder/decoder/layers.0/cross_attn/Reshape_6_output_0"
output: "/0/decoder/decoder/layers.0/cross_attn/GridSample_output_0"
name: "/0/decoder/decoder/layers.0/cross_attn/GridSample"
op_type: "GridSample"
attribute {
name: "align_corners"
i: 0
type: INT
}
attribute {
name: "mode"
s: "bilinear"
type: STRING
}
attribute {
name: "padding_mode"
s: "zeros"
type: STRING
}
ERROR: [TRT]: ModelImporter.cpp:776: --- End node ---
ERROR: [TRT]: ModelImporter.cpp:779: ERROR: builtin_op_importers.cpp:4870 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
ERROR: Failed to parse onnx file
ERROR: failed to build network since parsing model errors.

By the way, have you tried the Ultralytics RT-DETR?


pullmyleg commented on June 11, 2024

Do you have the same environment for x86 and deepstream?

  1. Yes - in the x86 environment (DeepStream 6.0.1) it works without any issues.
  2. Have not tried Ultralytics, but they only support L & XL models. Have you tried Ultralytics?
  3. I am going to try the Paddle version. Have you tried this?
  4. Yes - same error on Jetson:
ERROR: [TRT]: ModelImporter.cpp:774: --- Begin node ---
ERROR: [TRT]: ModelImporter.cpp:775: input: "/0/decoder/decoder/layers.0/cross_attn/Reshape_5_output_0"
input: "/0/decoder/decoder/layers.0/cross_attn/Reshape_6_output_0"
output: "/0/decoder/decoder/layers.0/cross_attn/GridSample_output_0"
name: "/0/decoder/decoder/layers.0/cross_attn/GridSample"
op_type: "GridSample"
attribute {
  name: "align_corners"
  i: 0
  type: INT
}
attribute {
  name: "mode"
  s: "bilinear"
  type: STRING
}
attribute {
  name: "padding_mode"
  s: "zeros"
  type: STRING
}


pullmyleg commented on June 11, 2024

Smaller options are available: https://github.com/orgs/ultralytics/discussions/2545


IronmanVsThanos commented on June 11, 2024

1. "Have not tried Ultralytics, but they only support L & XL models. Have you tried Ultralytics?" I tried the L model with ONNX opset version 12, but I get the following error (see the meshgrid note after this list):

(yolov8) hx@FitServer-R4200-V5:/mnt/sda1/Deep_learning/code/yolov8/ultralytics$ python3 export_rtdetr_ultralytics.py -w rtdetr-l.pt --simplify

Starting: rtdetr-l.pt
Opening RT-DETR Ultralytics model

Ultralytics YOLOv8.0.155 Python-3.8.18 torch-1.9.1+cu111 CPU (Intel Xeon Gold 6148 2.40GHz)
rt-detr-l summary: 494 layers, 32148140 parameters, 0 gradients

Creating labels.txt file

Exporting the model to ONNX
Traceback (most recent call last):
File "export_rtdetr_ultralytics.py", line 124, in
sys.exit(main(args))
File "export_rtdetr_ultralytics.py", line 92, in main
torch.onnx.export(model, onnx_input_im, onnx_output_file, verbose=False, opset_version=args.opset,
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/onnx/init.py", line 275, in export
return utils.export(model, args, f, export_params, verbose, training,
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/onnx/utils.py", line 88, in export
_export(model, args, f, export_params, verbose, training, input_names, output_names,
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/onnx/utils.py", line 689, in _export_model_to_graph(model, args, verbose, input_names,
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/onnx/utils.py", line 458, in _model_to_graph
graph, params, torch_out, module = _create_jit_graph(model, args,
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/onnx/utils.py", line 422, in _create_jit_graph
graph, torch_out = _trace_and_get_graph_from_model(model, args)
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/onnx/utils.py", line 373, in _trace_and_get_graph_from_model
torch.jit._get_trace_graph(model, args, strict=False, _force_outplace=False, _return_inputs_states=True)
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/jit/_trace.py", line 1160, in _get_trace_graph
outs = ONNXTracedModule(f, strict, _force_outplace, return_inputs, _return_inputs_states)(*args, **kwargs)
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1051, in _call_impl
return forward_call(*input, **kwargs)
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/jit/_trace.py", line 127, in forward
graph, out = torch._C._create_graph_by_tracing(
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/jit/_trace.py", line 118, in wrapper
outs.append(self.inner(*trace_inputs))
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1051, in _call_impl
return forward_call(*input, **kwargs)
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1039, in _slow_forward
result = self.forward(*input, **kwargs)
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/nn/modules/container.py", line 139, in forward
input = module(input)
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1051, in _call_impl
return forward_call(*input, **kwargs)
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1039, in _slow_forward
result = self.forward(*input, **kwargs)
File "/mnt/sda1/Deep_learning/code/yolov8/ultralytics/ultralytics/nn/tasks.py", line 45, in forward
return self.predict(x, *args, **kwargs)
File "/mnt/sda1/Deep_learning/code/yolov8/ultralytics/ultralytics/nn/tasks.py", line 62, in predict
return self._predict_once(x, profile, visualize)
File "/mnt/sda1/Deep_learning/code/yolov8/ultralytics/ultralytics/nn/tasks.py", line 82, in _predict_once
x = m(x) # run
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1051, in _call_impl
return forward_call(*input, **kwargs)
File "/home/hx/anaconda3/envs/yolov8/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1039, in _slow_forward
result = self.forward(*input, **kwargs)
File "/mnt/sda1/Deep_learning/code/yolov8/ultralytics/ultralytics/nn/modules/transformer.py", line 81, in forward
pos_embed = self.build_2d_sincos_position_embedding(w, h, c)
File "/mnt/sda1/Deep_learning/code/yolov8/ultralytics/ultralytics/nn/modules/transformer.py", line 90, in build_2d_sincos_position_embedding
grid_w, grid_h = torch.meshgrid(grid_w, grid_h, indexing='ij')
TypeError: meshgrid() got an unexpected keyword argument 'indexing'
2. "I am going to try the Paddle version. Have you tried this?" No. Did it work?
3. When I use Ultralytics YOLOv8 and set the ONNX opset version to 12, it works on Jetson.
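On the meshgrid() error in point 1: torch.meshgrid() only accepts the indexing= keyword from PyTorch 1.10 onward, and the log shows torch-1.9.1+cu111, so the Ultralytics code path fails in that environment. Upgrading PyTorch is the straightforward fix; the shim below (my own illustration, not Ultralytics code) just shows the incompatibility:

```
import torch

def meshgrid_ij(*tensors):
    # PyTorch >= 1.10 accepts indexing=; older releases reject the keyword
    # (the TypeError above) but already default to the same 'ij' behaviour.
    try:
        return torch.meshgrid(*tensors, indexing="ij")
    except TypeError:
        return torch.meshgrid(*tensors)
```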


IronmanVsThanos commented on June 11, 2024

Yes, I agree with you. 6.0.1 is too old. By the way, the author of RT-DETR told me that they will release RT-DETRv2, and this version does not rely much on the TensorRT version.


pullmyleg commented on June 11, 2024

That's exciting, thanks. @IronmanVsThanos did he mention an estimated release date?


IronmanVsThanos commented on June 11, 2024

He did not mention a specific time, but said that it would be in the near future.

