
SetMaxWasmStack fails with async_stack_size error #182

Closed
guregu opened this issue Jul 24, 2023 · 6 comments

guregu commented Jul 24, 2023

As mentioned in bytecodealliance/wasmtime#6762; I figure it's better to discuss the wasmtime-go side of this here.

Currently doing this:

cfg := wasmtime.NewConfig()
cfg.SetMaxWasmStack(8388608)
engine := wasmtime.NewEngineWithConfig(cfg) // panics

Fails with this message:

thread '<unnamed>' panicked at 'called `Result::unwrap()` on an `Err` value: max_wasm_stack size cannot exceed the async_stack_size', crates/c-api/src/engine.rs:33:38
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
fatal runtime error: failed to initiate panic, error 5
SIGABRT: abort
PC=0x198cf4724 m=0 sigcode=0
signal arrived during cgo execution

goroutine 1 [syscall, locked to thread]:
runtime.cgocall(0x104609b88, 0x14000133c38)
	/opt/homebrew/Cellar/go/1.20.5/libexec/src/runtime/cgocall.go:157 +0x54 fp=0x14000133c00 sp=0x14000133bc0 pc=0x10438be14
github.com/bytecodealliance/wasmtime-go/v11._Cfunc_wasm_engine_new_with_config(0x153804340)
	_cgo_gotypes.go:1016 +0x38 fp=0x14000133c30 sp=0x14000133c00 pc=0x1044c0b78
github.com/bytecodealliance/wasmtime-go/v11.NewEngineWithConfig.func1(0x153804340?)
	/Users/guregu/go/pkg/mod/github.com/bytecodealliance/wasmtime-go/v11@v11.0.0/engine.go:33 +0x54 fp=0x14000133c70 sp=0x14000133c30 pc=0x1044c2ee4
github.com/bytecodealliance/wasmtime-go/v11.NewEngineWithConfig(0x140000161f0)
	/Users/guregu/go/pkg/mod/github.com/bytecodealliance/wasmtime-go/v11@v11.0.0/engine.go:33 +0x28 fp=0x14000133cb0 sp=0x14000133c70 pc=0x1044c2df8
github.com/trealla-prolog/go/trealla.init.0()
	/Users/guregu/code/trealla/go/trealla/wasm.go:31 +0x84 fp=0x14000133cf0 sp=0x14000133cb0 pc=0x1045fb704
runtime.doInit(0x105737ca0)
       ...

This led me to try setting the async_stack_size, notice that it was missing from the C API, and open the wasmtime PR linked above.
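
For anyone hitting this before a fix lands, a minimal sketch of a workaround, assuming the released libwasmtime.a has the async feature enabled and uses Wasmtime's default 2 MiB async stack, is to keep the requested wasm stack within that limit:

package main

import (
	"fmt"

	"github.com/bytecodealliance/wasmtime-go/v11"
)

func main() {
	cfg := wasmtime.NewConfig()
	// Assumption: the released library is built with the "async" feature and
	// its async stack defaults to 2 MiB, so request no more than 1 MiB of
	// wasm stack to stay safely under that limit.
	cfg.SetMaxWasmStack(1 << 20)
	engine := wasmtime.NewEngineWithConfig(cfg)
	fmt.Println(engine != nil) // true; engine creation no longer panics
}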

guregu commented Jul 24, 2023

Oddly enough, I can't reproduce it when writing a test case against this repo:

func TestEngineMaxStackSize(t *testing.T) {
	config := NewConfig()
	config.SetMaxWasmStack(8388608) // 8MiB
	engine := NewEngineWithConfig(config)
	require.NotNil(t, engine)
}

This passes with no issue.

But the same usage fails in trealla-prolog/go. I'll see if I can isolate it better.

alexcrichton (Member) commented:

Are you using a custom build of Wasmtime's C API? The check you're hitting is behind #[cfg(feature = "async")], which I think is disabled for the builds this repository uses, but if you've got a custom build that feature may be enabled.

I also posted over on the Wasmtime PR, but I think this is still a bug to fix in Wasmtime.

guregu commented Jul 24, 2023

I am seeing the same issue on GitHub Actions, so it's probably not an environment issue, but I'm still not sure of the root cause.
https://github.com/trealla-prolog/go/actions/runs/5649469168/job/15303868806
https://github.com/trealla-prolog/go/blob/b5c257d2a300289b14de160b07e79cecb182fd49/trealla/wasm.go#L28
I'll investigate more.

guregu commented Jul 24, 2023

Oh, I think you were right after all about the custom build thing; I was just thinking in the wrong direction.
My custom build was OK, but replacing it with the libwasmtime.a from the go mod cache leads to the same panic.
So it seems the released version of libwasmtime.a has async enabled.
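
A minimal standalone reproduction against the released library looks roughly like this (a sketch: the module layout and file name are illustrative, and it assumes the libwasmtime.a from the module cache gets linked rather than a custom build):

// main.go in a fresh module that requires github.com/bytecodealliance/wasmtime-go/v11
package main

import "github.com/bytecodealliance/wasmtime-go/v11"

func main() {
	cfg := wasmtime.NewConfig()
	cfg.SetMaxWasmStack(8388608) // 8 MiB, same value as above
	_ = wasmtime.NewEngineWithConfig(cfg) // panics here against the async-enabled released build
}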

guregu commented Jul 24, 2023

Sent a quick PR with a little test in #183; it fails in CI, so I think we've figured it out.

guregu commented Aug 26, 2023

Confirmed that this is fixed with the latest version. Feel free to close or merge #183, which should be passing now.

guregu closed this as completed Aug 26, 2023