
Fix optimum-cli export executorch #2172

Closed
1 of 4 tasks
guangy10 opened this issue Jan 28, 2025 · 1 comment
Labels
bug Something isn't working

Comments

@guangy10
Contributor

System Info

Issue:

Running the following command on the latest optimum release:

`optimum-cli export executorch --model hf-internal-testing/tiny-random-LlamaForCausalLM --task text-generation --recipe xnnpack --output_dir meta_llama3_2_1b`

fails with:

usage: optimum-cli export [-h] {onnx,tflite} ...
optimum-cli export: error: argument {onnx,tflite}: invalid choice: 'executorch' (choose from 'onnx', 'tflite')


This is a regression introduced by the repo migration. We should register executorch back in the CLI.
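For context, a minimal sketch (not optimum's actual code) of how an argparse-based CLI like `optimum-cli` ends up rejecting a subcommand that was dropped during a migration, and how registering it again resolves the error. The `build_parser` helper and subcommand names are illustrative assumptions:

```python
import argparse

def build_parser(subcommands):
    # Hypothetical stand-in for the "optimum-cli export" parser setup.
    parser = argparse.ArgumentParser(prog="optimum-cli export")
    sub = parser.add_subparsers(dest="backend", required=True)
    for name in subcommands:
        sub.add_parser(name)
    return parser

# With only "onnx" and "tflite" registered, "executorch" is rejected,
# producing the "invalid choice" error reported above:
parser = build_parser(["onnx", "tflite"])
try:
    parser.parse_args(["executorch"])
except SystemExit:
    print("invalid choice")  # argparse prints the error and exits

# Re-registering the subcommand makes the same invocation parse cleanly:
parser = build_parser(["onnx", "tflite", "executorch"])
args = parser.parse_args(["executorch"])
print(args.backend)  # executorch
```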

Who can help?

@echarlaix

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction (minimal, reproducible, runnable)

Simply running `optimum-cli export executorch --help` reproduces the error.

Expected behavior

The CLI should export the model to ExecuTorch.

@echarlaix
Collaborator

Discussed in huggingface/optimum-executorch#7
