Reading TFlite model metadata #78
On first glance …
[offtopic] +1 for "hipster tooling" ;-) [/offtopic]
Got a prototype metadata reader working in my tree: https://github.com/phlash/backscrub/tree/tflite-metadata. Not pretty, but it avoids pulling in a whole new library and 'hipster' build system 😉
Maybe also a bit off topic with regard to metadata reading: I tried to use that hipster build system for a standalone lib that uses TensorFlow. It turned out to be a huge time sink. The goal was just to make some quick experiments with post-processing of model output in Python.
I got it to build for my use case in the end. I tried a quick adaptation of Bazel for backscrub but never managed to find a good way to integrate the OpenCV dependency. Some take-home messages:
1: tensorflow/tensorflow#44043
[still OT slightly] Thanks for trying the 'hipster way' 😄. I'm not sure CMake is going away now (it may have been then), as it's properly documented and marked as 'experimental since 2.4' here: https://www.tensorflow.org/lite/guide/build_cmake. I for one would rather use CMake for its good documentation, popularity and capability (even if I don't like the mess it spews out!). It took me just a few minutes to get a basic 'backscrub + tensorflow' combined build working, so I could enable XNNPACK and double the CPU-based performance: https://github.com/phlash/backscrub/tree/xnnpack-test [back on topic] Thoughts on my rough metadata extraction hack? It could/should probably use [de]serialisation code generated by …
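For context, a combined build along those lines might look like the sketch below. This is an assumption-laden illustration, not the actual branch contents: `TENSORFLOW_SOURCE_DIR` and the target name `backscrub` are placeholders, while `TFLITE_ENABLE_XNNPACK` is the switch documented in the TFLite CMake guide linked above.

```cmake
# Sketch: build tensorflow-lite as a subproject and link it into backscrub.
# TENSORFLOW_SOURCE_DIR must point at a tensorflow source checkout (assumed).
set(TFLITE_ENABLE_XNNPACK ON)   # enable the XNNPACK delegate for CPU speed-up
add_subdirectory(
  "${TENSORFLOW_SOURCE_DIR}/tensorflow/lite"
  "${CMAKE_CURRENT_BINARY_DIR}/tensorflow-lite"
  EXCLUDE_FROM_ALL)
target_link_libraries(backscrub tensorflow-lite)
```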
Does the …
Google do exactly that for parts of the build already: pre-built headers from …
There seems to be a package …
Ah-ha! Also in buster-backports, so this might be an easy way out. That said, flatc builds easily enough with CMake; unfortunately it's a struggle to do that via the main tensorflow-lite build (the dependency mechanism using …)
OK, it looks neater with the compiled metadata serializer: https://github.com/phlash/backscrub/tree/tflite-metadata This branch is now:
Awaiting somewhere to use the metadata values... coming in #77, then I'll PR this work.
Originally posted by @phlash in #77 (comment)
Not in the models we're currently using, but only because we didn't need it. Models with metadata in are available from the Google model zoo (https://tfhub.dev/). [edit] I lied; that only contains Deeplabv3 with metadata. Looks like the MediaPipe team haven't added any yet, although they do have model cards. Oh well. [/edit]
I'm currently looking at sane ways to read the metadata without adding a new dependency and build pain (it needs Bazel) for the `tf_lite_support` library that's officially required. So far, getting a blob of metadata out works OK (see snippet below), since TfLite already supports random blobs in a file; however, that blob then needs parsing (from `flatbuffers` format, schema here: https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/metadata/metadata_schema.fbs) to pull out the input normalization constants. Currently poking through this: https://google.github.io/flatbuffers/md__internals.html and a hex dump of the raw buffer. This might turn out to be horribly fragile though!
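For reference, the "hex dump" route can be sketched in Python. This is a minimal illustration of peeking at a raw flatbuffer blob, not the parser used in the branch: the layout (a little-endian uint32 root-table offset, optionally followed by a 4-byte file identifier) comes from the flatbuffers internals doc linked above, and "M001" is the file identifier declared in the metadata schema.

```python
import struct

def read_flatbuffer_header(buf: bytes):
    """Peek at the start of a raw flatbuffer blob.

    The buffer begins with a little-endian uint32 offset to the root
    table, optionally followed by a 4-byte ASCII file identifier
    ("M001" for the TFLite metadata schema).
    """
    (root_offset,) = struct.unpack_from("<I", buf, 0)
    file_id = buf[4:8].decode("ascii", errors="replace")
    return root_offset, file_id

# A hand-built 12-byte example (hypothetical, not a real metadata blob):
# root table at offset 8, metadata file identifier "M001".
blob = struct.pack("<I", 8) + b"M001" + b"\x00" * 4
print(read_flatbuffer_header(blob))  # prints (8, 'M001')
```

Anything past the header (vtables, table fields) is where the fragility kicks in, which is exactly why generated [de]serialisation code is the safer route.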
Question: What do we think about using Bazel for builds (standard TensorFlow tooling)?