Failed to create TFLite interpreter from model when passing core-ml argument #97
Guten Tag, Hans here! 🍻 It looks like you are experiencing an issue when using ze library. Please attach your logs so ze problem can be investigated. Additionally, if you want quicker responses or more support, consider sponsoring the project. This shows your support for mrousavy's work! Looking forward to your logs!
This is valid. My original issue was auto-closed by the hans bot, but it should be open; it includes logs and an example repo: #84. That one only occurred on Android, but now it happens on iOS too.
Same issue here when setting 'android-gpu' in useTensorflowModel. Without android-gpu, the model loads.
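For reference, here's roughly what my call looks like (assuming the delegate string is passed as the second argument to `useTensorflowModel`, as I understand the library's README; the asset path and component are placeholders):

```tsx
import { useTensorflowModel } from 'react-native-fast-tflite'

function Detector() {
  // Loads fine with the default CPU delegate:
  // const plugin = useTensorflowModel(require('./assets/model.tflite'))

  // Fails to create the TFLite interpreter when the GPU delegate is requested:
  const plugin = useTensorflowModel(require('./assets/model.tflite'), 'android-gpu')

  // The hook exposes a loading state; the model is only usable once 'loaded':
  const model = plugin.state === 'loaded' ? plugin.model : undefined
  // ... run inference with `model` here
  return null
}
```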
I am on version
I have a brand-new Expo project and I'm trying to load a .tflite model via this library. It works fine (but my app slows down too much) without passing the `core-ml` argument. When I pass `core-ml`, I get this error. This is running on my local machine via the iOS simulator. Any ideas why this happens when I pass this option?
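Here's roughly what I'm doing (a minimal sketch; the model path is a placeholder for my actual asset, and I'm passing the delegate as the second argument to `useTensorflowModel` as shown in the README):

```tsx
import { useTensorflowModel } from 'react-native-fast-tflite'

function Classifier() {
  // Works (but too slowly) without a delegate; passing 'core-ml' makes it
  // fail with "Failed to create TFLite interpreter from model" on the
  // iOS simulator:
  const plugin = useTensorflowModel(require('./assets/model.tflite'), 'core-ml')

  if (plugin.state === 'error') {
    // Assuming the error state carries the underlying Error object:
    console.error(plugin.error)
  }
  return null
}
```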
Here is my package.json for reference.