
Unable to run executorch export from CLI #7

Closed
guangy10 opened this issue Jan 28, 2025 · 9 comments

@guangy10
Collaborator

Before moving to this new repo, we could export a model to ExecuTorch via the CLI like this:
optimum-cli export executorch --model "meta-llama/Llama-3.2-1B" --task "text-generation" --recipe "xnnpack" --output_dir="meta_llama3_2_1b"
@echarlaix can you point me to the updated way to run the CLI?

@echarlaix
Collaborator

echarlaix commented Jan 28, 2025

Hi @guangy10, could you share the issue you're facing? I'm able to export a model with the CLI without any issue. Could you confirm that optimum-cli export executorch --model hf-internal-testing/tiny-random-LlamaForCausalLM --task text-generation --recipe xnnpack --output_dir meta_llama3_2_1b doesn't run on your end? To install optimum-executorch you can do pip install .

@guangy10
Collaborator Author

guangy10 commented Jan 28, 2025

@echarlaix You will probably have to set up the repo from scratch to reproduce this issue.

With a fresh Python env (I prefer conda), I did:

conda create -yn optimum-executorch python=3.11
conda activate optimum-executorch
pip install --upgrade-strategy eager optimum[executorch]

After pip install I have:

Package            Version
------------------ -----------
aiohappyeyeballs   2.4.4
aiohttp            3.11.11
aiosignal          1.3.2
attrs              25.1.0
certifi            2024.12.14
charset-normalizer 3.4.1
coloredlogs        15.0.1
datasets           3.2.0
dill               0.3.8
filelock           3.17.0
frozenlist         1.5.0
fsspec             2024.9.0
huggingface-hub    0.28.0
humanfriendly      10.0
idna               3.10
Jinja2             3.1.5
MarkupSafe         3.0.2
mpmath             1.3.0
multidict          6.1.0
multiprocess       0.70.16
networkx           3.4.2
numpy              2.2.2
optimum            1.23.3
packaging          24.2
pandas             2.2.3
pip                25.0
propcache          0.2.1
pyarrow            19.0.0
python-dateutil    2.9.0.post0
pytz               2024.2
PyYAML             6.0.2
regex              2024.11.6
requests           2.32.3
safetensors        0.5.2
setuptools         75.8.0
six                1.17.0
sympy              1.13.1
tokenizers         0.21.0
torch              2.5.1
tqdm               4.67.1
transformers       4.48.1
typing_extensions  4.12.2
tzdata             2025.1
urllib3            2.3.0
wheel              0.44.0
xxhash             3.5.0
yarl               1.18.3

Then I ran the optimum-cli:
optimum-cli export executorch --model "meta-llama/Llama-3.2-1B" --task "text-generation" --recipe "xnnpack" --output_dir="meta_llama3_2_1b"

usage: optimum-cli export [-h] {onnx,tflite} ...
optimum-cli export: error: argument {onnx,tflite}: invalid choice: 'executorch' (choose from 'onnx', 'tflite')

I think the problem is that it installed optimum 1.23.3, which was released a long time ago and doesn't have the executorch option. To fix this path, should we consider making a new optimum release?
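
As a sanity check, these stock commands show which optimum pip resolved and which export backends its CLI registers (matching the choices in the error above):

pip show optimum           # reports Version: 1.23.3 in this env
optimum-cli export --help  # lists the available backends (onnx, tflite here)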

@guangy10
Collaborator Author

@echarlaix As another path, setting up the repo in dev mode, I did:

git clone https://github.com/huggingface/optimum-executorch.git
cd optimum-executorch
pip install -e .

After the installation I got:

Package            Version     Editable project location
------------------ ----------- -----------------------------------
attrs              25.1.0
certifi            2024.12.14
charset-normalizer 3.4.1
execnet            2.1.1
executorch         0.4.0
expecttest         0.3.0
filelock           3.17.0
flatbuffers        25.1.24
fsspec             2024.12.0
huggingface-hub    0.28.0
hypothesis         6.124.7
idna               3.10
iniconfig          2.0.0
Jinja2             3.1.5
MarkupSafe         3.0.2
mpmath             1.3.0
networkx           3.4.2
numpy              1.23.2
optimum            1.24.0.dev0
optimum-executorch 0.0.1       /Users/guangyang/optimum-executorch
packaging          24.2
pandas             2.2.3
parameterized      0.9.0
pillow             11.1.0
pip                25.0
pluggy             1.5.0
pytest             8.3.4
pytest-xdist       3.6.1
python-dateutil    2.9.0.post0
pytz               2024.2
PyYAML             6.0.2
regex              2024.11.6
requests           2.32.3
ruamel.yaml        0.18.10
ruamel.yaml.clib   0.2.12
safetensors        0.5.2
setuptools         75.8.0
six                1.17.0
sortedcontainers   2.4.0
sympy              1.13.1
tabulate           0.9.0
tokenizers         0.21.0
torch              2.5.0
torchaudio         2.5.0
torchvision        0.20.0
tqdm               4.67.1
transformers       4.48.1
typing_extensions  4.12.2
tzdata             2025.1
urllib3            2.3.0
wheel              0.44.0

Note that optimum is already pointing to the latest trunk (1.24.0.dev0).

Then running the optimum-cli ended up with the same error:
optimum-cli export executorch --model "meta-llama/Llama-3.2-1B" --task "text-generation" --recipe "xnnpack" --output_dir="meta_llama3_2_1b"

usage: optimum-cli export [-h] {onnx,tflite} ...
optimum-cli export: error: argument {onnx,tflite}: invalid choice: 'executorch' (choose from 'onnx', 'tflite')

It looks like the executorch option was mistakenly removed from the optimum CLI.
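
A quick way to confirm that Python is resolving the dev-mode packages (plain importlib.metadata, no optimum internals assumed):

python -c "from importlib.metadata import version; print(version('optimum'), version('optimum-executorch'))"

which prints 1.24.0.dev0 0.0.1 here, so both distributions are visible even though the subcommand is not.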

@guangy10
Collaborator Author

Link to the GitHub Issue in Optimum: huggingface/optimum#2172

@guangy10
Collaborator Author

@echarlaix Actually I tried the CLI with optimum-intel and ended up with the same error:
optimum-cli export openvino --help

usage: optimum-cli export [-h] {onnx,tflite} ...
optimum-cli export: error: argument {onnx,tflite}: invalid choice: 'openvino' (choose from 'onnx', 'tflite')

It could be an issue affecting more repos, or it could simply be that I missed something silly.

@IlyasMoutawwakil
Member

You can't install optimum-executorch from PyPI as optimum[executorch], because optimum with the executorch extra was never released.
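
(This is verifiable from the installed metadata; Provides-Extra is standard wheel metadata, so nothing optimum-specific is assumed:)

python -c "from importlib.metadata import metadata; print(metadata('optimum').get_all('Provides-Extra'))"

For the released 1.23.3, the printed list has no executorch entry.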

@guangy10
Collaborator Author

You can't install optimum-executorch from PyPI as optimum[executorch], because optimum with the executorch extra was never released.

Should the dev mode described here work? cc: @echarlaix

@echarlaix
Collaborator

Should the dev mode described #7 (comment) work? cc: @echarlaix

I think installing it in editable mode won't work; I would replace pip install -e . with pip install .
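
i.e. in the same environment, something like (uninstalling first just to be safe):

pip uninstall -y optimum-executorch
pip install .
optimum-cli export executorch --model "meta-llama/Llama-3.2-1B" --task "text-generation" --recipe "xnnpack" --output_dir="meta_llama3_2_1b"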

echarlaix self-assigned this Jan 31, 2025
@guangy10
Collaborator Author

Should the dev mode described #7 (comment) work? cc: @echarlaix

I think installing it in editable mode won't work; I would replace pip install -e . with pip install .

Thanks @echarlaix, it works. Will update the instructions in #10.
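
For anyone landing on this issue, the full sequence that works per this thread:

conda create -yn optimum-executorch python=3.11
conda activate optimum-executorch
git clone https://github.com/huggingface/optimum-executorch.git
cd optimum-executorch
pip install .  # not `pip install -e .`, and not `pip install optimum[executorch]` from PyPI
optimum-cli export executorch --model "meta-llama/Llama-3.2-1B" --task "text-generation" --recipe "xnnpack" --output_dir="meta_llama3_2_1b"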
