
panic: send on closed channel #6854

Closed
yutkin opened this issue Dec 13, 2023 · 5 comments · Fixed by #6856

Comments

yutkin commented Dec 13, 2023

What version of gRPC are you using?

1.60

What version of Go are you using (go version)?

1.21.3

What operating system (Linux, Windows, …) and version?

cos_containerd on GKE

What did you do?

After upgrading from 1.59 to 1.60, we started to get occasional panics. Nothing else was changed.

panic: send on closed channel

goroutine 4816 [running]:
google.golang.org/grpc.(*Server).serveStreams.func2(0xc08b3ba000)
	/src/vendor/google.golang.org/grpc/server.go:1021 +0x107
google.golang.org/grpc/internal/transport.(*http2Server).operateHeaders(0xc018389040, {0x18823f0, 0xc027ddf4d0}, 0xc0859423f0, 0xc027ddf560)
	/src/vendor/google.golang.org/grpc/internal/transport/http2_server.go:603 +0x25da
google.golang.org/grpc/internal/transport.(*http2Server).HandleStreams(0xc018389040, {0x18823f0, 0xc027ddf4d0}, 0x187cdb0?)
	/src/vendor/google.golang.org/grpc/internal/transport/http2_server.go:648 +0x23a
google.golang.org/grpc.(*Server).serveStreams(0xc00027be00, {0x1882150?, 0x273ab80?}, {0x1889140?, 0xc018389040}, {0x1888448?, 0xc000ab0000?})
	/src/vendor/google.golang.org/grpc/server.go:1012 +0x3e2
google.golang.org/grpc.(*Server).handleRawConn.func1()
	/src/vendor/google.golang.org/grpc/server.go:939 +0x56
created by google.golang.org/grpc.(*Server).handleRawConn in goroutine 4124
	/src/vendor/google.golang.org/grpc/server.go:938 +0x1aa

We use Proxyless Traffic Director in GCP.

What did you expect to see?

What did you see instead?

easwars (Contributor) commented Dec 13, 2023

Thank you for reporting this issue. It looks like #6489 might be the culprit, but I'm not certain at this point. Needs more investigation.

Do you have any way to reproduce this?

easwars (Contributor) commented Dec 13, 2023

Also, do you use the NumStreamWorkers server option?

yutkin (Author) commented Dec 13, 2023

> Do you have any way to reproduce this?

Unfortunately, we don't. We reverted the upgrade as soon as we noticed the issue.

> Also, do you use the NumStreamWorkers server option?

Yes, we do. We set it equal to GOMAXPROCS, which matches the pod's CPU requests/limits.

easwars (Contributor) commented Dec 13, 2023

Are the panics happening around the time when Stop/GracefulStop is being called on your grpc server?

yutkin (Author) commented Dec 14, 2023

> Are the panics happening around the time when Stop/GracefulStop is being called on your grpc server?

Yes, they happen right after GracefulStop is called.
