[tflite] Uncaught RuntimeError: Aborted(). Build with -s ASSERTIONS=1 for more info. #6403
I'm also seeing a similar error with this model in this minimal example: https://jsbin.com/hogagolupi/edit?html,output
Edit: Actually, this seems unrelated, since this model manages to initialize correctly and the error occurs during inference. I guess it's related in the sense that the error message doesn't tell the user what the problem was. Is there any way to surface the actual error message? E.g. to publish a dev build to npm/jsDelivr that includes the real error messages?
I'm seeing this again with this Lyra model here: https://jsbin.com/kequtiwuze/edit?html,output The model seems to load correctly without errors, but then I get this during inference:
@jinjingforever Would it be possible to get a build of tfjs-tflite that includes that `ASSERTIONS=1` flag?
Hi @josephrocca, I uploaded a version built with debugging flags.
Now at least I know where the error is coming from. To fix it, you will need to run the code on a server with "cross-origin isolation" support (COOP/COEP headers). To test it locally, I often use a command that starts a local server sending the COOP headers correctly:
And I tried your code and it works. Notice that it is loading the simd_threaded version of the wasm binary. Hope it helps!
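As a quick sanity check for this situation, a page can test at runtime whether the threaded WASM backend is even viable. This is an illustrative helper, not part of tfjs-tflite; `canUseThreads` is a hypothetical name:

```javascript
// Returns true only when the page is cross-origin isolated and
// SharedArrayBuffer is usable, i.e. when a threaded WASM binary can run.
// Both globals are standard browser APIs; the parameter exists so the
// check can also be exercised outside a browser.
function canUseThreads(globalObj = globalThis) {
  return Boolean(globalObj.crossOriginIsolated) &&
         typeof globalObj.SharedArrayBuffer === "function";
}
```

Running `canUseThreads()` in the browser console of the deployed page quickly tells you whether the COOP/COEP setup actually took effect.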
@jinjingforever Ahh, easy fix - thanks! I've used this little service worker trick to get it working on GitHub Pages (which doesn't allow setting COOP/COEP headers). And thanks for providing that debug build! It'll come in handy for future debugging. But also: this seems like a bug - i.e. shouldn't it automatically load a non-threaded version of the runtime if threads aren't available? Or is the threaded build somehow required by the model?
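The "service worker trick" mentioned above generally works by intercepting every fetch and re-serving the response with the isolation headers added, since GitHub Pages won't send them itself. A minimal sketch of that idea (an illustration, not the exact worker used in the thread; `withCoiHeaders` is a hypothetical helper):

```javascript
// coi-sw.js -- re-serve every response with COOP/COEP headers added.

// Pure helper: copy the headers and add the cross-origin isolation pair.
// (Headers is the WHATWG class, available in browsers and Node 18+.)
function withCoiHeaders(headers) {
  const out = new Headers(headers);
  out.set("Cross-Origin-Opener-Policy", "same-origin");
  out.set("Cross-Origin-Embedder-Policy", "require-corp");
  return out;
}

// Service-worker part: only runs in an actual worker context.
if (typeof self !== "undefined" && "clients" in self) {
  self.addEventListener("fetch", (event) => {
    event.respondWith(
      fetch(event.request).then(
        (resp) =>
          new Response(resp.body, {
            status: resp.status,
            statusText: resp.statusText,
            headers: withCoiHeaders(resp.headers),
          })
      )
    );
  });
}
```

The page also has to register the worker and reload once, so that the first navigation response is served through it.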
That's a very good point. I will need to spend some time looking into the source code and see how this works. For now, as you said, it looks like some info in the model determines whether the threaded version of the op/kernel will be used or not. Will investigate more. Thank you!
I get the error in the title of this issue (with no other explanation) when loading this tflite file with `tflite.loadTFLiteModel`. The tflite file was generated from this SavedModel. The SavedModel can be loaded and inferenced without any errors, and the results are correct. The SavedModel was converted to tflite with TensorFlow version `2.8.0`. You can see the versions of the tfjs files that I used in the code below.

Minimal replication:
You can click this link and then open the browser console: https://jsbin.com/melekuxena/edit?html,output Note that the tflite file is ~120mb, so it might take a minute or so to download depending on your connection speed.
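For reference, the jsbin boils down to something like the following. This is a sketch: the input shape and dtype are placeholder assumptions (they must match the real model's signature), and `tf`/`tflite` stand for the globals exposed by the tfjs and tfjs-tflite script bundles:

```javascript
// Browser-only sketch: tf (tfjs) and tflite (tfjs-tflite) are passed in
// as the globals created by their <script> tags.
async function replicate(modelUrl, tf, tflite) {
  // This call is where the bare "Aborted()" RuntimeError surfaces.
  const model = await tflite.loadTFLiteModel(modelUrl);
  // Placeholder input; adjust shape/dtype to the model's actual signature.
  const input = tf.zeros([1, 224, 224, 3]);
  return model.predict(input);
}
```

Keeping the repro this small makes it easy to swap in the debug build of the WASM binary to get a real error message instead of the bare abort.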
Full error: