kv:bulk only accepts string values #571
Comments
This issue has been automatically marked as stale because it has not had recent activity in the last 180 days. It will be closed if no further activity occurs in the next week. Please feel free to comment if you'd like it to remain open, and thank you for your contributions.
This issue has been automatically closed because it has not had recent activity. You may re-open the issue if it is still relevant.
I don't understand this request. A Workers KV namespace is only able to store string values. The REST API and the dashboard, as far as I can tell, also only allow you to store strings. Of course one can JSON-encode data and store it in the KV namespace, but this is the responsibility of the application. Am I missing something?
KV doesn't have to store strings; it can store arbitrary byte arrays or readable streams: https://developers.cloudflare.com/workers/runtime-apis/kv/#writing-key-value-pairs. The REST API will accept `Content-Type: application/json`.
Ah, thanks for that @koeninger - this wasn't clear from the API: https://api.cloudflare.com/#:~:text=A%20UTF%2D8%20encoded%20string%20to%20be%20stored%2C%20up%20to%2010%20MB%20in%20length. So currently the
But as it turns out Wrangler2 does not check that So... I wonder if this is actually already fixed in Wrangler2?
Hmm, it appears that the REST API for bulk uploads does not support non-string values, when I try uploading the following (or any other non-string value):

```json
[
  {
    "key": "test",
    "value": { "str": "a string", "num": 123 }
  }
]
```
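One workaround consistent with the thread's conclusion is to stringify non-string values on the client before building the bulk payload, so every `value` the endpoint receives is a string. The helper below is a hypothetical sketch, not part of wrangler's API:

```typescript
// Sketch: pre-stringify non-string values so every entry in the bulk
// payload has a string `value`, which is what the bulk REST endpoint expects.
// `toBulkEntry` is a hypothetical helper, not part of wrangler.
interface BulkEntry {
  key: string;
  value: string;
}

function toBulkEntry(key: string, value: unknown): BulkEntry {
  // Strings pass through untouched; everything else is JSON-encoded.
  const stringValue =
    typeof value === "string" ? value : JSON.stringify(value);
  return { key, value: stringValue };
}

const payload: BulkEntry[] = [
  toBulkEntry("test", { str: "a string", num: 123 }),
];

console.log(JSON.stringify(payload));
```

The reading side would then `JSON.parse` the stored string, which matches the earlier comment that JSON encoding is the application's responsibility.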
So it looks like Wrangler2 supports this but the Edge Worker Config REST API does not.
Ah, sorry, I was responding to the comment about KV being string-only and didn't notice this was about the bulk API specifically. Yeah, the bulk API isn't set up to take nested JSON for each value. Not sure what @nr-goby's original ask was, given that.
Since the original poster has not actually commented in almost a year, this is definitely not a high priority. |
Now that some time has passed and we've been able to think about this, I'm of the opinion that we shouldn't do anything special for uploading non-string values. Further, we can and should apply client side validation, ensuring that only string values are being sent before uploading. I'll send a PR that does so. |
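Such client-side validation might look like the following sketch (`validateBulkPut` is a hypothetical name; wrangler's actual implementation may differ), rejecting any entry whose key or value is not a string:

```typescript
// Sketch of client-side validation for a `kv:bulk put` payload.
// Returns a list of human-readable errors; an empty list means the
// payload is valid. Hypothetical helper, not wrangler's real code.
function validateBulkPut(payload: unknown): string[] {
  const errors: string[] = [];
  if (!Array.isArray(payload)) {
    return ["payload must be an array of { key, value } objects"];
  }
  payload.forEach((entry, i) => {
    if (typeof entry !== "object" || entry === null) {
      errors.push(`entry ${i} is not an object`);
      return;
    }
    const { key, value } = entry as Record<string, unknown>;
    if (typeof key !== "string") {
      errors.push(`entry ${i}: "key" must be a string`);
    }
    if (typeof value !== "string") {
      errors.push(`entry ${i}: "value" must be a string`);
    }
  });
  return errors;
}
```

Failing fast in the CLI like this surfaces a clear error message instead of an opaque rejection from the REST API.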
This adds client-side validation of the payload for `kv:bulk put`, importantly ensuring we're uploading only string key/value pairs (as well as validation for the other fields). Fixes #571
Currently the `kv:bulk` tool will only upload values that are strings. This is different from the KV web back end, where you can upload JSON values. The behaviour seems to come from `cloudflare::endpoints::workerskv::write_bulk::KeyValuePair`. Would changing the type of `KeyValuePair.value` from `String` to `json::JsonValue` be sufficient to fix this? I'm not sure if this library is used for more than wrangler or not, so it might be cumbersome to make API changes, but it is a cloudflare library.
Or if there is some workaround that I am unaware of, this request could be quickly closed.