This is a very basic example which demonstrates how to load and run inference on a frozen graph created with TensorFlow. The frozen GraphDef format was the standard way of storing a graph in TensorFlow versions prior to 1.15. We support this format for legacy reasons.
We will create a dummy graph for demonstration purposes. To create the model, simply run the following command:
python main.py
By default, ofxTF2::Model uses the SavedModel format. To use the frozen GraphDef format, either pass the type to the constructor or call setModelType() afterwards. Since the default input and output names differ from those of the SavedModel format, make sure to overwrite them by calling setup().
// set model type and i/o names
model.setModelType(cppflow::model::TYPE::FROZEN_GRAPH);
model.setup({{"x:0"}}, {{"Identity:0"}});
Afterwards, you can load the .pb file using the load() function.
// load the model, bail out on error
if(!model.load("model.pb")) {
std::exit(EXIT_FAILURE);
}
Everything else should work the same.
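Putting the steps above together, a minimal sketch of the whole flow might look as follows. Note that the input shape, the vectorToTensor() helper usage, and the runModel() call are illustrative assumptions based on the ofxTF2 API; adjust them to your own graph:

```cpp
#include "ofxTensorFlow2.h"

// sketch: configure, load, and run a frozen graph with ofxTF2
// "x:0" and "Identity:0" are the i/o names of the dummy graph in this
// example; your model's tensor names may differ
void runFrozenGraph() {
    ofxTF2::Model model;
    model.setModelType(cppflow::model::TYPE::FROZEN_GRAPH);
    model.setup({{"x:0"}}, {{"Identity:0"}});

    // load the model, bail out on error
    if(!model.load("model.pb")) {
        std::exit(EXIT_FAILURE);
    }

    // build a dummy input tensor and run a single inference
    // the input size here is an assumption for illustration only
    std::vector<float> input(10, 1.0f);
    cppflow::tensor inputTensor = ofxTF2::vectorToTensor(input);
    cppflow::tensor outputTensor = model.runModel(inputTensor);
}
```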
Please understand that we won't be able to invest much time in supporting this feature in the future.