
[FEATURE] Allow user to set max message size limit on grpc source #1357

Closed · pradeepbbl opened this issue Jul 10, 2024 · 2 comments · Fixed by #1358

Labels: enhancement (New feature or request)

@pradeepbbl (Member)

Requirements

Hi,

As discussed in the community Slack channel, we are currently using the default MaxMsgSize of 4MB in the gRPC sync source. In most cases this is large enough, but in bigger installations it can become an issue, e.g.:

```
2024-07-10T14:35:17.327Z	warn	grpc/grpc_sync.go:123	error with stream listener: error receiving payload from stream: rpc error: code = ResourceExhausted desc = grpc: received message larger than max (5727401 vs. 4194304)	{"component": "sync", "sync": "grpc"}
```

This task is to allow users to set the MaxMsgSize limit as part of the source configuration (see the sketch below for how such a limit typically maps onto gRPC client options).

-Thanks
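
For context, a minimal sketch of how a Go gRPC client raises the 4MB default receive limit via a dial option. This is an illustration of the general mechanism, not flagd's actual wiring; the address and size are placeholders taken from this report:

```go
package main

import (
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
)

func main() {
	// Placeholder value, chosen to exceed the 5727401-byte payload
	// from the error above.
	const maxMsgSize = 5728474

	conn, err := grpc.Dial(
		"localhost:8015", // placeholder sync source address
		grpc.WithTransportCredentials(insecure.NewCredentials()),
		// Without this option, the client rejects any message larger
		// than the 4MB (4194304-byte) default with ResourceExhausted.
		grpc.WithDefaultCallOptions(grpc.MaxCallRecvMsgSize(maxMsgSize)),
	)
	if err != nil {
		log.Fatalf("dial failed: %v", err)
	}
	defer conn.Close()
}
```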

@pradeepbbl pradeepbbl added enhancement New feature or request Needs Triage This issue needs to be investigated by a maintainer labels Jul 10, 2024
@beeme1mr beeme1mr removed the Needs Triage This issue needs to be investigated by a maintainer label Jul 10, 2024
@Kavindu-Dodan (Contributor) commented Jul 10, 2024

I think the root cause here is not the flag payload size but a data corruption issue that was reported in grpc-java. Flagd Java provider v0.8.4 [1] fixes this, as we updated the dependency that caused it [2].

But let me know if you are seeing this with a non-Java provider. Then that's an interesting situation.

Update: assuming the usage here is flagd connecting to a Java-based gRPC stream, the same root cause could have created this issue. 4MB should allow at least 8K–10K flagd flags to be transmitted without altering the max message size (see the rough arithmetic below).

[1] - https://github.com/open-feature/java-sdk-contrib/releases/tag/dev.openfeature.contrib.providers.flagd-v0.8.4
[2] - open-feature/java-sdk-contrib#849 (comment)
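
A rough sanity check of that estimate, assuming an average serialized flag of roughly 400–500 bytes (the per-flag size is an assumption for illustration, not a measured figure):

```
4MB default limit       = 4,194,304 bytes
4,194,304 / ~450 bytes  ≈ 9,300 flags per message
```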

Kavindu-Dodan pushed a commit that referenced this issue Jul 12, 2024
…message size. (#1358)

## This PR
Added a new config option `maxMsgSize`, which allows
users to override the default message size of 4MB.

- allow overriding the default max message size (4MB)

### Related Issues

Fixes #1357 

### How to test
To test this feature you need a flag source with a large number of flags,
totaling more than 4MB; e.g., below we run with a max size just over 5MB:
 
```
./flagd start --port 8013 --sync-port 8017 --sources='[{"uri": "localhost:8015", "provider": "grpc", "tls": false, "providerID": "flagd-sidecar", "selector": "all-flags", "maxMsgSize": 5728474}]'

		 ______   __       ________   _______    ______
		/_____/\ /_/\     /_______/\ /______/\  /_____/\
		\::::_\/_\:\ \    \::: _  \ \\::::__\/__\:::_ \ \
		 \:\/___/\\:\ \    \::(_)  \ \\:\ /____/\\:\ \ \ \
		  \:::._\/ \:\ \____\:: __  \ \\:\\_  _\/ \:\ \ \ \
		   \:\ \    \:\/___/\\:.\ \  \ \\:\_\ \ \  \:\/.:| |
		    \_\/     \_____\/ \__\/\__\/ \_____\/   \____/_/

2024-07-11T11:31:57.024+0200	info	cmd/start.go:107	flagd version: dev (da01e08), built at: 2024-07-11T11:14:44Z	{"component": "start"}
2024-07-11T11:31:57.026+0200	info	flag-sync/sync_service.go:54	starting flag sync service on port 8017	{"component": "FlagSyncService"}
2024-07-11T11:31:57.027+0200	info	grpc/grpc_sync.go:70	setting max receive message size 5728474 bytes default 4MB	{"component": "sync", "sync": "grpc"}

```
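
For readability, the `--sources` value from the command above, pretty-printed:

```json
[
  {
    "uri": "localhost:8015",
    "provider": "grpc",
    "tls": false,
    "providerID": "flagd-sidecar",
    "selector": "all-flags",
    "maxMsgSize": 5728474
  }
]
```

Note that `maxMsgSize: 5728474` is set just above the 5727401-byte payload reported in the original error, so the stream in the log is accepted.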

---------

Signed-off-by: Pradeep Mishra <pradeepbbl@gmail.com>
Signed-off-by: pradeepbbl <pradeepbbl@gmail.com>
Signed-off-by: Pradeep Mishra <pradeepbbl@users.noreply.github.com>
Co-authored-by: Michael Beemer <beeme1mr@users.noreply.github.com>
@Kavindu-Dodan (Contributor)

@pradeepbbl thank you for the contribution. I have merged your solution. This will be included in our next release [1].

[1] - #1356
