
[tflite] Uncaught RuntimeError: Aborted(). Build with -s ASSERTIONS=1 for more info. #6403

Closed
josephrocca opened this issue May 12, 2022 · 6 comments
Assignees
Labels
type:bug Something isn't working

Comments

@josephrocca (Contributor)

I get the error in the title of this issue (with no other explanation) when loading this tflite file with tflite.loadTFLiteModel. The tflite file was generated from this SavedModel. The SavedModel itself can be loaded and run for inference without any errors, and the results are correct.

The SavedModel was converted to tflite with TensorFlow version 2.8.0. You can see the versions of the tfjs files that I used in the code below.

Minimal replication:

<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-core@3.16.0/dist/tf-core.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-backend-cpu@3.16.0/dist/tf-backend-cpu.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-tflite@0.0.1-alpha.8/dist/tf-tflite.min.js"></script>

<script type="module">
  let tfliteModel = await tflite.loadTFLiteModel('https://huggingface.co/rocca/lit-web/resolve/main/debug/lit_savedmodel_no_params.tflite');
</script>

You can click this link and then open the browser console: https://jsbin.com/melekuxena/edit?html,output Note that the tflite file is ~120 MB, so it may take a minute or so to download depending on your connection speed.

Full error:

Uncaught RuntimeError: Aborted(). Build with -s ASSERTIONS=1 for more info.
    at abort (tflite_web_api_cc_simd.js:9:9277)
    at _abort (tflite_web_api_cc_simd.js:9:59963)
    at tflite_web_api_cc_simd.wasm:0x3501
    at tflite_web_api_cc_simd.wasm:0x8a9f3
    at tflite_web_api_cc_simd.wasm:0x210c13
    at tflite_web_api_cc_simd.wasm:0x1075aa
    at tflite_web_api_cc_simd.wasm:0x11331
    at tflite_web_api_cc_simd.wasm:0x8ce44
    at tflite_web_api_cc_simd.wasm:0x8c9c7
    at tflite_web_api_cc_simd.wasm:0x8c696
@josephrocca (Contributor Author) commented May 14, 2022

I'm also seeing a similar error with this model in this minimal example: https://jsbin.com/hogagolupi/edit?html,output

Uncaught (in promise) RuntimeError: Aborted(). Build with -s ASSERTIONS=1 for more info.
    at abort (tflite_web_api_cc_simd.js:9:9277)
    at _abort (tflite_web_api_cc_simd.js:9:59963)
    at tflite_web_api_cc_simd.wasm:0x1e418
    at tflite_web_api_cc_simd.wasm:0x2c4ac6
    at tflite_web_api_cc_simd.wasm:0x39623
    at tflite_web_api_cc_simd.wasm:0x49174
    at tflite_web_api_cc_simd.wasm:0x3084f7
    at tflite_web_api_cc_simd.wasm:0x17b2a
    at TFLiteWebModelRunner$Infer [as Infer] (eval at new_ (tflite_web_api_cc_simd.js:9:37941), <anonymous>:8:10)
    at module$exports$google3$third_party$tensorflow_lite_support$web$tflite_web_api_client.TFLiteWebModelRunner.infer 

Edit: Actually, this seems unrelated, since this model manages to initialize correctly and the error occurs during inference. I guess it's related in the sense that the error message doesn't tell the user what the problem was. Is there any way to surface the actual error messages? E.g. by publishing a dev build to npm/jsdelivr that includes them?

@josephrocca (Contributor Author)

I'm seeing this again with this Lyra model here: https://jsbin.com/kequtiwuze/edit?html,output

The model seems to load correctly without errors, but then I get this during inference:

Uncaught (in promise) RuntimeError: Aborted(). Build with -sASSERTIONS for more info.
    at abort (tflite_web_api_cc_simd.js:9:6515)
    at _abort (tflite_web_api_cc_simd.js:9:60528)
    at VM41 tflite_web_api_cc_simd.wasm:1:113326
    at VM41 tflite_web_api_cc_simd.wasm:1:3022728
    at VM41 tflite_web_api_cc_simd.wasm:1:194824
    at VM41 tflite_web_api_cc_simd.wasm:1:336625
    at VM41 tflite_web_api_cc_simd.wasm:1:3322193
    at VM41 tflite_web_api_cc_simd.wasm:1:97813
    at TFLiteWebModelRunner.TFLiteWebModelRunner$Infer [as Infer] (eval at new_ (tflite_web_api_cc_simd.js:9:33059), <anonymous>:8:10)
    at module$exports$google3$third_party$tensorflow_lite_support$web$tflite_web_api_client.TFLiteWebModelRunner.infer (tf-tflite.js:10011:35)

@jinjingforever Would it be possible to get a build of tfjs-tflite that includes that ASSERTIONS flag? Something like tfjs-tflite@0.0.1-alpha.9/dist/tf-tflite.debug.js?

@jinjingforever (Collaborator)

Hi @josephrocca

I uploaded the version with debugging flags (-s ASSERTIONS=1, -g2) turned on here: https://storage.googleapis.com/tfweb/0.0.1-alpha.9-debug/dist/tf-tflite.js. I used it with your jsbin example and it shows the following error:

Uncaught (in promise) RuntimeError: Aborted(native code called abort())
    at abort (tflite_web_api_cc_simd.js:580:10)
    at _abort (tflite_web_api_cc_simd.js:2865:2)
    at std::__2::__throw_system_error(int, char const*) (VM163 tflite_web_api_cc_simd.wasm:1:3446564)
    at TfLiteStatus tflite::ops::builtin::conv::Eval<(tflite::ops::builtin::conv::KernelType)2>(TfLiteContext*, TfLiteNode*) (VM163 tflite_web_api_cc_simd.wasm:1:1240794)
    ....

Now at least I know the error is coming from the conv op. I then looked into the conv.cc file, and it looks like the error is thrown here. This tells me that tflite is trying to run this op with multithreaded support, but the web app is not loading the multithreaded version of the wasm binary. It is currently loading the SIMD-only version (tflite_web_api_cc_simd.wasm). The reason is "cross-origin isolation". Read more here. Basically, jsbin is not able to send the correct headers to make the app load the simd-threaded version of the wasm binary.
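To make that failure mode concrete, here is a small sketch of the selection behaviour described above. The function and variant names are illustrative only, not the actual tfjs-tflite internals; the key point is that the threaded variant requires the page to be cross-origin isolated (i.e. self.crossOriginIsolated === true in the browser):

```javascript
// Illustrative sketch (NOT the real tfjs-tflite code): which wasm
// variant can be loaded, given isolation state and SIMD support.
function pickWasmVariant(crossOriginIsolated, simdSupported) {
  // Threads need SharedArrayBuffer, which needs cross-origin isolation.
  if (simdSupported && crossOriginIsolated) return 'simd_threaded';
  if (simdSupported) return 'simd';
  return 'default';
}

// On jsbin (no COOP/COEP headers) crossOriginIsolated is false,
// so only the non-threaded SIMD binary is available:
console.log(pickWasmVariant(false, true)); // → 'simd'
```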

To fix that, you will need to run the code on a server with "cross-origin isolation" support. To test it locally, I often use the following command to start a local server that sends the COOP/COEP headers correctly:

npx node-static -a 0.0.0.0 -H '{"Cross-Origin-Opener-Policy": "same-origin", "Cross-Origin-Embedder-Policy": "require-corp"}'

And I tried your code and it works. Notice that it is loading the simd_threaded version of the wasm binary.

Hope it helps!


@josephrocca (Contributor Author) commented Oct 5, 2022

@jinjingforever Ahh, easy fix - thanks! I've used this little service worker trick to get it working on GitHub Pages (which doesn't allow setting COOP/COEP headers). And thanks for providing that debug build! It'll come in handy for future debugging.
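For reference, the core of that kind of service-worker workaround can be sketched as follows. This is illustrative only — the actual script linked above handles more edge cases (registration, first-load reload, opaque responses). The idea is to intercept every fetch and re-serve the response with the two isolation headers added, so a static host like GitHub Pages behaves as if the server had set them:

```javascript
// Sketch of a COOP/COEP-injecting service worker for static hosts.
function withIsolationHeaders(headers) {
  const out = new Headers(headers);
  out.set('Cross-Origin-Opener-Policy', 'same-origin');
  out.set('Cross-Origin-Embedder-Policy', 'require-corp');
  return out;
}

// Inside sw.js (guarded so the helper above can also run elsewhere):
if (typeof self !== 'undefined' && typeof self.addEventListener === 'function') {
  self.addEventListener('fetch', (event) => {
    event.respondWith(
      fetch(event.request).then(
        (resp) =>
          // Re-serve the same body/status with the headers added.
          new Response(resp.body, {
            status: resp.status,
            statusText: resp.statusText,
            headers: withIsolationHeaders(resp.headers),
          })
      )
    );
  });
}
```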

But also: this seems like a bug - i.e. the runtime shouldn't automatically try to run the threaded version of an op if threads aren't available? Or is the .tflite file somehow specifying that threads are a hard requirement for that op?

@jinjingforever (Collaborator)

That's a very good point. I will need to spend some time looking into the source code to see how this works. For now, as you said, it looks like some info in the model determines whether the threaded version of the op/kernel will be used or not. Will investigate more. Thank you!
