Issues: huggingface/optimum
Community contribution - optimum.exporters.onnx support fo... · #555 · opened Dec 7, 2022 by michaelbenayoun · Open · 41 comments
Community contribution - BetterTransformer integration for ... · #488 · opened Nov 18, 2022 by younesbelkada · Open · 25 comments
[Quick poll] Give your opinion on the future of the Hugging F... · #568 · opened Dec 9, 2022 by LysandreJik · Open
Issues list
Labels: bug (Something isn't working) · feature-request (New feature or request) · exporters (Issue related to exporters) · onnx (Related to the ONNX export) · onnxruntime (Related to ONNX Runtime) · tflite

install instructions result is pip version conflicts. · bug · #2125 · opened Dec 12, 2024 by hpcpony · 4 tasks
Support for Converting Sentence-Transformers to TFLite · feature-request, tflite · #2124 · opened Dec 10, 2024 by adityasahugit · 1 of 4 tasks
GPTQ kernel inference not compatible with some models · bug · #2120 · opened Dec 7, 2024 by Qubitium · 2 of 4 tasks
TFJS support model.json to ONNX conversion · exporters, tflite · #2097 · opened Nov 18, 2024 by JohnRSim
Add support for Musicgen Melody in the ONNX export · onnx · #2095 · opened Nov 13, 2024 by rubeniskov
Support onnx conversion for wav2vec2-bert · bug · #2082 · opened Oct 27, 2024 by fawazahmed0 · 2 of 4 tasks
"ValueError: Trying to export a codesage model" while trying to export codesage/codesage-large · bug · #2080 · opened Oct 25, 2024 by TurboEncabulator9000 · 1 of 4 tasks
LLama 3.2 vision - unable to convert · bug · #2079 · opened Oct 24, 2024 by pdufour · 4 tasks
Problem converting tinyllama to onnx model with optimum-cli · bug · #2076 · opened Oct 22, 2024 by hayyaw · 2 of 4 tasks
Problem converting DeBERTaV3 to ONNX using optimum-cli · bug · #2075 · opened Oct 21, 2024 by marcovzla · 2 of 4 tasks
Conversion innaccuracy specific Opus-MT model · bug · #2068 · opened Oct 18, 2024 by FricoRico · 2 of 4 tasks
vision model's input size spedified with cmd line is overrided by pretrained model config · exporters, onnx · #2035 · opened Sep 29, 2024 by waterdropw
[ONNX] Use the dynamo=True option from PyTorch 2.5 · onnx · #2026 · opened Sep 17, 2024 by justinchuby
Adding Support for DETA Model · onnx · #2018 · opened Sep 8, 2024 by TheMattBin
[Feature request] Add kwargs or additional options for torch.onnx.export · onnx · #2009 · opened Sep 3, 2024 by martinkorelic
Optional subfolder if model repository contains one ONNX model behind a subfolder · onnx · #2008 · opened Sep 3, 2024 by tomaarsen
Support for gemma2-2b-it(gemma 2nd version) Model Export in Optimum for OpenVINO · onnx · #2006 · opened Sep 3, 2024 by chakka12345677
support for jinaai/jina-reranker-v2-base-multilingual model · bug, onnxruntime · #2004 · opened Aug 30, 2024 by bash99 · 2 of 4 tasks
Is it possible to infer the model separately through encoder.onnx and decoder.onnx · onnx · #2002 · opened Aug 29, 2024 by pengpengtao
NameError: name '_SENTENCE_TRANSFORMERS_TASKS_TO_MODEL_LOADERS' is not defined · bug, onnx · #1997 · opened Aug 25, 2024 by purejomo · 2 of 4 tasks
AttributeError: FLOAT8E4M3FN · bug, onnxruntime · #1994 · opened Aug 21, 2024 by Huanghong2016 · 3 of 4 tasks
ORTModelForCustomTasks lacks attributes · bug, onnxruntime · #1992 · opened Aug 19, 2024 by TheMattBin · 2 of 4 tasks
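
Many of the issues above concern Optimum's ONNX export path. For orientation only, and not tied to any specific issue, here is a minimal sketch of exporting a Transformers checkpoint to ONNX with optimum.onnxruntime and running it; the checkpoint id and output directory are illustrative placeholders.

    # Minimal sketch, not taken from any issue above: export a Transformers
    # checkpoint to ONNX via optimum.onnxruntime and run it with ONNX Runtime.
    # The model id and output directory are illustrative placeholders.
    from transformers import AutoTokenizer
    from optimum.onnxruntime import ORTModelForSequenceClassification

    model_id = "distilbert-base-uncased-finetuned-sst-2-english"  # example checkpoint

    # export=True converts the PyTorch weights to ONNX on the fly
    model = ORTModelForSequenceClassification.from_pretrained(model_id, export=True)
    tokenizer = AutoTokenizer.from_pretrained(model_id)

    inputs = tokenizer("Optimum exports this model to ONNX.", return_tensors="pt")
    print(model(**inputs).logits)

    # Persist the exported model so it can be reloaded without re-exporting
    model.save_pretrained("onnx_model/")

The same export can also be run from the command line with optimum-cli export onnx, which several of the issues above (e.g. #2076, #2075) reference.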