
InstanceNormalization op works with cpu backend, but fails with wasm backend. #102

Closed
gnsmrky opened this issue Mar 6, 2019 · 2 comments · Fixed by #104

Comments

@gnsmrky

gnsmrky commented Mar 6, 2019

Hi @hariharans29, I just tried out the newly added InstanceNormalization op from #18. With the cpu backend, it works well.

But with the wasm backend, it throws the following RuntimeError when running in Chrome (72.0.3626.119) on Windows 10:

Uncaught (in promise) RuntimeError: memory access out of bounds
    at wasm-function[49]:376
    at t._instance_normalization_f32 (https://cdn.jsdelivr.net/npm/onnxjs@0.1.4/dist/onnx.min.js:21:64093)
    at e.t.func (https://cdn.jsdelivr.net/npm/onnxjs@0.1.4/dist/onnx.min.js:21:48678)
    at e.t.ccall (https://cdn.jsdelivr.net/npm/onnxjs@0.1.4/dist/onnx.min.js:21:48067)
    at e.run (https://cdn.jsdelivr.net/npm/onnxjs@0.1.4/dist/onnx.min.js:21:81058)
    at t.<anonymous> (https://cdn.jsdelivr.net/npm/onnxjs@0.1.4/dist/onnx.min.js:21:223361)
    at https://cdn.jsdelivr.net/npm/onnxjs@0.1.4/dist/onnx.min.js:21:220510
    at Object.next (https://cdn.jsdelivr.net/npm/onnxjs@0.1.4/dist/onnx.min.js:21:220615)
    at https://cdn.jsdelivr.net/npm/onnxjs@0.1.4/dist/onnx.min.js:21:219528
    at new Promise (<anonymous>)
wasm-function[49]	@	wasm-0000006e-49:205
t._instance_normalization_f32	@	onnx-wasm.js:8
t.func	@	wasm-binding-core.ts:154
t.ccall	@	wasm-binding-core.ts:116

Here is the code:

<script src="https://cdn.jsdelivr.net/npm/onnxjs@0.1.4/dist/onnx.min.js"></script>
<script>
    const sess = new onnx.InferenceSession({backendHint: 'wasm'});

    sess.loadModel("./onnx_models/candy_128x128_wasm.onnx").then(() => {
        const x = new Float32Array(1 * 3 * 128 * 128).fill(1);
        const inputT = new onnx.Tensor(x, 'float32', [1, 3, 128, 128]);

        sess.run([inputT]).then(output => {
            const outputT = output.values().next().value;

            //console.log(`model output tensor: ${outputT.data}.`);
            console.log(`model output tensor size: ${outputT.size}`);
        });
    });
</script>
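
For comparison, the same code runs fine when the backend hint is 'cpu'; a minimal sketch of that workaround (only the session constructor changes, everything else stays as above):

<script>
    // Workaround sketch: the cpu backend handles InstanceNormalization correctly,
    // so only the backendHint differs from the wasm snippet above.
    const sess = new onnx.InferenceSession({backendHint: 'cpu'});
</script>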

I've set up a GitHub page to reproduce this issue. Both onnx-wasm.wasm and onnx-worker.js are in the same directory as insnorm_test.html, as instructed.
https://gnsmrky.github.io/pytorch-fast-neural-style-onnxjs/insnorm_test.html

Please let me know how I can help out.

@hariharans29
Member

Hi,

Thanks for reporting this. This is strange. Let me take a look tomorrow.

Thanks

@fs-eire
Contributor

fs-eire commented Mar 23, 2019

This issue is fixed in v0.1.5.
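
If you load onnx.js from the jsdelivr CDN as in the snippet above, bumping the pinned version in the script tag should pick up the fix (a sketch, assuming the same path layout):

<script src="https://cdn.jsdelivr.net/npm/onnxjs@0.1.5/dist/onnx.min.js"></script>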
