Support for the Node.js AWS SDK? #127
I successfully tested S3Proxy master with the latest AWS SDK and added an example here: … Can you share your …?
Note that I had to set …
Perhaps your issue is related to using S3Proxy 1.4.0, which does not support AWS v4 signatures? You can either upgrade to master or specify …
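For context, a minimal sketch of pointing the Node.js AWS SDK (the `aws-sdk` v2 package) at S3Proxy while forcing v2 signatures and path-style addressing, one client-side way to sidestep the v4 issue on older releases. The endpoint, credentials, and option values below are placeholders, not values taken from this thread.

```js
// Hypothetical client setup -- endpoint and credentials are placeholders.
var AWS = require('aws-sdk');

var s3 = new AWS.S3({
  accessKeyId: 'local-identity',        // must match s3proxy.identity
  secretAccessKey: 'local-credential',  // must match s3proxy.credential
  endpoint: 'http://127.0.0.1:8080',    // wherever S3Proxy is listening
  s3ForcePathStyle: true,               // address buckets as paths, not subdomains
  signatureVersion: 'v2',               // only needed while running a release without v4 support
  sslEnabled: false
});
```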
I built from master. Here's my test_download.js:
I'll try the jclouds.regions thing and let you know if that works.
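The original test_download.js isn't preserved above; the following is only an illustrative sketch of a download test against S3Proxy with the `aws-sdk` v2 package, with the bucket, key, and local path made up for the example.

```js
// Illustrative download test -- not the original test_download.js from this thread.
var fs = require('fs');
var AWS = require('aws-sdk');

var s3 = new AWS.S3({
  accessKeyId: 'local-identity',
  secretAccessKey: 'local-credential',
  endpoint: 'http://127.0.0.1:8080',
  s3ForcePathStyle: true,
  signatureVersion: 'v2',
  sslEnabled: false
});

// Stream an object from S3Proxy to a local file.
s3.getObject({ Bucket: 'test-bucket', Key: 'big-file.bin' })
  .createReadStream()
  .on('error', function (err) { console.error('download failed:', err); })
  .pipe(fs.createWriteStream('/tmp/big-file.bin'));
```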
No luck with that. I'm using the following proxy config (which works great with boto, by the way; I'm able to upload and download huge files without issue): …
The TRACE-level logging from s3proxy all looks like this: …
GCS raises this error since its service does not support multi-part uploads: http://stackoverflow.com/a/27830881/2800111. Can you configure your application to use single-part uploads? Unfortunately this is not something that S3Proxy can work around. You can also try using GCS via its native …
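A sketch of the single-part alternative with the Node.js SDK: `putObject` issues one PUT and never touches the multi-part calls that the GCS backend rejects. Client settings, bucket, and file paths here are placeholders.

```js
// Single-part upload sketch: one PUT, no multi-part machinery.
var fs = require('fs');
var AWS = require('aws-sdk');

var s3 = new AWS.S3({
  accessKeyId: 'local-identity',
  secretAccessKey: 'local-credential',
  endpoint: 'http://127.0.0.1:8080',
  s3ForcePathStyle: true,
  sslEnabled: false
});

s3.putObject({
  Bucket: 'test-bucket',
  Key: 'single-part.bin',
  Body: fs.createReadStream('/tmp/single-part.bin'),
  ContentLength: fs.statSync('/tmp/single-part.bin').size
}, function (err, data) {
  if (err) console.error('upload failed:', err);
  else console.log('uploaded, ETag:', data.ETag);
});
```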
Thanks, we had good luck setting the partSize to a very large value; that's good enough for us for now. I'd like to try your GCS native implementation idea soon, too, but are you saying s3proxy won't do multi-part uploads to GCS at all? For completeness' sake, here's our working upload script: …
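The working upload script itself isn't preserved above; what follows is only a sketch of the large-partSize workaround, assuming the `aws-sdk` v2 `upload()` helper, with placeholder endpoint, bucket, and paths. Raising partSize above the object size keeps the managed upload in a single part, which is what the GCS backend can accept.

```js
// Illustrative upload with a very large partSize -- not the original script from this thread.
var fs = require('fs');
var AWS = require('aws-sdk');

var s3 = new AWS.S3({
  accessKeyId: 'local-identity',
  secretAccessKey: 'local-credential',
  endpoint: 'http://127.0.0.1:8080',
  s3ForcePathStyle: true,
  sslEnabled: false
});

// upload() only switches to multi-part when the body exceeds partSize,
// so a 5 GiB partSize keeps anything smaller than that in one part.
s3.upload(
  { Bucket: 'test-bucket', Key: 'huge-file.bin', Body: fs.createReadStream('/tmp/huge-file.bin') },
  { partSize: 5 * 1024 * 1024 * 1024, queueSize: 1 },
  function (err, data) {
    if (err) console.error('upload failed:', err);
    else console.log('uploaded to', data.Location);
  }
);
```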
I finally got s3proxy working with both boto and spark and it's really quite cool.
I then tried the node.js AWS SDK and it didn't work, regardless of whether I enabled authentication or not.
With auth turned off entirely: …
With auth enabled: …
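The error output for either case isn't preserved above. For reference, a minimal sketch of the unauthenticated case with the `aws-sdk` v2 package; the SDK still expects some credentials to be configured even when S3Proxy's authorization is disabled, so dummy values are supplied. Endpoint and bucket are placeholders.

```js
// Sketch for an S3Proxy endpoint running with authorization disabled.
var AWS = require('aws-sdk');

var s3 = new AWS.S3({
  accessKeyId: 'dummy',            // placeholder; the SDK refuses to run with no credentials at all
  secretAccessKey: 'dummy',
  endpoint: 'http://127.0.0.1:8080',
  s3ForcePathStyle: true,
  sslEnabled: false
});

s3.listObjects({ Bucket: 'test-bucket' }, function (err, data) {
  if (err) console.error('listObjects failed:', err);
  else console.log('keys:', data.Contents.map(function (o) { return o.Key; }));
});
```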