combining models and tflite export #84

Answered by leondgarse
avber asked this question in Q&A
  • DecodePredictions as a layer is supported. Usage has been updated in Readme#tflite-conversion, and some tests can be found in the colab kecam_test.ipynb, including saving the model as a saved_model and TFLite conversion usage. The key parameter is use_static_output=True, which fixes the layer output shape as [batch, max_output_size, 6], since TFLite does not seem to support a dynamic shape on the second dimension.
  • Parameters like score_threshold / iou_or_sigma / max_output_size can still be customized when calling model.decode_predictions(...) directly, and they can be given new values when converting to TFLite, like the score_threshold usage in Readme#tflite-conversion (see the sketch below). But after converting, I'm not sure how to set the…
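
For reference, a minimal sketch of the conversion flow described above. The EfficientDetD0 model choice, the model.decode_predictions attribute access, and the exact way use_static_output / score_threshold are wired in are assumptions based on this discussion; the authoritative usage is in Readme#tflite-conversion and the kecam_test.ipynb colab.

```python
import tensorflow as tf
from keras_cv_attention_models import efficientdet

# Any kecam detection model should work; EfficientDetD0 is just for illustration (assumption).
model = efficientdet.EfficientDetD0(pretrained="coco")

# use_static_output=True fixes the DecodePredictions output shape to
# [batch, max_output_size, 6], since TFLite does not support a dynamic shape
# on the second dimension. Setting it as an attribute here, and passing
# score_threshold as a call argument, are assumptions; check Readme#tflite-conversion.
model.decode_predictions.use_static_output = True
outputs = model.decode_predictions(model.outputs[0], score_threshold=0.5)

# Wrap backbone + decode layer into one model, then convert as usual.
combined = tf.keras.models.Model(model.inputs[0], outputs)
converter = tf.lite.TFLiteConverter.from_keras_model(combined)
open("efficientdet_d0_decoded.tflite", "wb").write(converter.convert())
```

The converted model can then be loaded with the standard tf.lite.Interpreter, and its detection output keeps the fixed [batch, max_output_size, 6] shape.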

Answer selected by avber