RCNN models from mmdetection fail InferenceSession() #31

@masashikudo910

Description

The following models from mmdetection in OpenMMLab were able to perform InferenceSession() in early October. However, they now fail with an error like: Fail: [ONNXRuntimeError] : 1 : FAIL : Node (/Reshape_65) Op (Reshape) [ShapeInferenceError] Dimension could not be inferred: incompatible shapes.

  • faster_rcnn_r50_fpn
  • faster_rcnn_r50_pafpn
  • faster_rcnn_r101_fpn
  • mask_rcnn_r50_fpn
  • mask_rcnn_r101_fpn

The following model, which was added in 24.09, has also started to fail InferenceSession:

  • dynamic_rcnn_r50_fpn

Versions of ONNX related packages

$ pip freeze |grep onnx
onnx==1.16.0
onnxconverter-common==1.13.0
onnxoptimizer==0.3.1
onnxruntime @ https://github.com/quadric-io/onnxruntime/releases/download/v23/onnxruntime-1.20.0-cp310-cp310-linux_x86_64.whl#sha256=b9ecc6f80adf452488311b069c5ea3bb3def274b4ac1002e7a282395be3e7231
onnxsim==0.4.36
tf2onnx==1.8.4

How to Reproduce

  • To export the ONNX file
    Please run the Jupyter notebook (sdk-cli/examples/models/zoo/detectors_zoo/mmdetection.ipynb), changing the model name to faster_rcnn_r50_fpn in the Select Model section as follows:
MODEL_NAME = MMDetModelVariant.faster_rcnn_r50_fpn

The ONNX file is then exported in the Export ONNX section.

  • To reproduce the error
    Please run the following:
import onnx
from onnxruntime import InferenceSession

onnx_file = 'faster_rcnn_r50_fpn.onnx'

session = InferenceSession(onnx.load(onnx_file).SerializeToString())

This raises the following error:

Fail: [ONNXRuntimeError] : 1 : FAIL : Node (/Reshape_65) Op (Reshape) [ShapeInferenceError] Dimension could not be inferred: incompatible shapes

Resources

Below is an example of an ONNX file exported from mmdetection that hits the ShapeInferenceError.

Below is the ONNX head created in October, split out from the model exported at that time. It includes the same /Reshape_65 node that causes the ShapeInferenceError in faster_rcnn_r50_fpn.onnx above, yet it can perform InferenceSession without problems.
