I am using the prebuilt Docker image to convert a Caffe model to ONNX, and I hit this problem both with my own model and with the AlexNet model I used for testing.
```
RuntimeError                              Traceback (most recent call last)
in
      1 # Convert Caffe model to CoreML
----> 2 coreml_model = coremltools.converters.caffe.convert((caffe_model, proto_file))
      3
      4 # Save CoreML model
      5 coreml_model.save(output_coreml_model)

/usr/local/lib/python3.5/dist-packages/coremltools/converters/caffe/_caffe_converter.py in convert(model, image_input_names, is_bgr, red_bias, blue_bias, green_bias, gray_bias, image_scale, class_labels, predicted_feature_name, model_precision)
    189                       blue_bias,
    190                       green_bias, gray_bias, image_scale, class_labels,
--> 191                       predicted_feature_name)
    192     model = MLModel(model_path)
    193

/usr/local/lib/python3.5/dist-packages/coremltools/converters/caffe/_caffe_converter.py in _export(filename, model, image_input_names, is_bgr, red_bias, blue_bias, green_bias, gray_bias, image_scale, class_labels, predicted_feature_name)
    253                             prototxt_path,
    254                             class_labels,
--> 255                             predicted_feature_name)

RuntimeError: Unable to open caffe model provided in the source model path: alex1.caffemodel
```
Some details:
1- I am running Ubuntu 16.04 (VirtualBox) on Windows 10.
2- I tried renaming the Caffe model and prototxt to `model`/`model1` and `model.caffemodel`/`model.prototxt`. I also placed two copies of each network in `Scripts` and `/scripts/converter-scripts`, and it still doesn't work.
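Since the error says the converter cannot open `alex1.caffemodel`, the first thing worth ruling out is a path problem. Here is a minimal sanity-check sketch (`check_model_files` is a hypothetical helper, not part of coremltools) that verifies both files resolve from the directory the script actually runs in before calling `convert`:

```python
import os

def check_model_files(*paths):
    """Return the subset of paths that cannot be opened as regular files
    from the current working directory."""
    return [p for p in paths if not os.path.isfile(p)]

# Print where Python is looking and which files are missing, if any.
print('Working directory:', os.getcwd())
missing = check_model_files('alex.prototxt', 'alex1.caffemodel')
if missing:
    print('The converter will fail to open:', missing)
```

If either file shows up as missing, passing absolute paths (e.g. `os.path.abspath(...)`) to the converter removes any ambiguity about which directory the Docker container or notebook kernel is using.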
This is how I implemented the code:

```python
import coremltools
import onnxmltools

# Update your input name and path for your caffe model
proto_file = 'alex.prototxt'
input_caffe_path = 'alex1.caffemodel'

# Update the output name and path for intermediate coreml model, or leave as is
output_coreml_model = 'model.mlmodel'

# Change this path to the output name and path for the onnx model
output_onnx_model = 'model.onnx'

# Convert Caffe model to CoreML
coreml_model = coremltools.converters.caffe.convert((input_caffe_path, proto_file))

# Save CoreML model
coreml_model.save(output_coreml_model)

# Load a Core ML model
coreml_model = coremltools.utils.load_spec(output_coreml_model)

# Convert the Core ML model into ONNX
onnx_model = onnxmltools.convert_coreml(coreml_model)

# Save as protobuf
onnxmltools.utils.save_model(onnx_model, output_onnx_model)
```
I hope you have seen this issue before and can help me.