
TFJS support model.json to ONNX conversion #2097

Open
JohnRSim opened this issue Nov 18, 2024 · 0 comments
Labels: exporters (Issue related to exporters), tflite

Comments

@JohnRSim
Feature request

I'm currently using Node to create an image-classifier model.json with TFJS.

  • I don't think Optimum supports this format for conversion to ONNX?

It would be nice to just point Optimum at model.json.

Motivation

Currently I create the model, convert it to a graph model, and then convert that to ONNX, like this:

tensorflowjs_converter \
  --input_format=tfjs_layers_model \
  --output_format=tfjs_graph_model \
  ./saved-model/layers-model/model.json \
  ./saved-model/graph-model
python3 -m tf2onnx.convert --tfjs ./saved-model/graph-model/model.json --output ./saved-model/model.onnx

I'm not sure how to switch to Optimum. Do I need to convert model.json to .h5 first and then run the export?

  • If I try that, I run into: huggingface_hub.errors.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': './path_to_save/model.h5'. Use repo_type argument if needed
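For context on the error above: it typically appears when a bare file path is passed where the Hugging Face tooling expects a local model *directory* or a Hub repo id. A hedged sketch of what the flow might look like is below; the paths are made up, and whether Optimum's ONNX exporter can actually consume an arbitrary Keras image classifier this way is exactly what this feature request is asking about, so treat this as the shape of the commands, not a verified recipe:

```shell
# Sketch only; paths are hypothetical and the end-to-end flow is unverified.

# 1) TFJS layers model -> Keras HDF5 (tensorflowjs_converter supports
#    --output_format=keras for layers models)
tensorflowjs_converter \
  --input_format=tfjs_layers_model \
  --output_format=keras \
  ./saved-model/layers-model/model.json \
  ./saved-model/keras-model/model.h5

# 2) Optimum's exporter CLI takes a model *directory* or a Hub repo id
#    via --model; passing the .h5 file path itself is what triggers the
#    HFValidationError quoted above.
optimum-cli export onnx --model ./saved-model/keras-model ./saved-model/onnx
```

Note that `optimum-cli export onnx` is built around transformers-style model repos, so even with a directory it may reject a plain Keras classifier; native model.json support would remove this friction entirely.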

Your contribution

N/A

@IlyasMoutawwakil added the exporters (Issue related to exporters) and tflite labels Nov 19, 2024