Replies: 2 comments
-
This should work, but I don't provide free support, sorry. Check the support policy.
-
In case anybody else in this community has this issue: the error is related to the AWS defaults for multipart uploads. The AWS default upload part size is 5MB, which is the minimum S3 part size, and the maximum number of parts for an S3 object is 10,000. These default settings can handle uploads of up to 50GB. So I increased the part size to 50MB, retested, and it worked fine.
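For anyone who wants to sanity-check the numbers, here is a minimal Go sketch of that arithmetic. The 10,000-part limit and the part sizes come from the comment above; the decimal MB/GB units and the helper name `maxObjectSizeGB` are my own assumptions for illustration.

```go
package main

import "fmt"

// maxObjectSizeGB returns the largest object a multipart upload can produce
// for a given part size, assuming the S3 limit of 10,000 parts per upload.
// Units are decimal MB/GB, matching the figures quoted above (an assumption).
func maxObjectSizeGB(partSizeMB int64) float64 {
	const maxParts = 10000 // S3 hard limit on parts per multipart upload
	return float64(partSizeMB*maxParts) / 1000.0
}

func main() {
	// Default 5MB parts cap an object at ~50GB, so a 50GB file sits right at
	// the limit; raising the part size to 50MB lifts the ceiling to ~500GB.
	fmt.Printf("5MB parts  -> max ~%.0fGB per object\n", maxObjectSizeGB(5))
	fmt.Printf("50MB parts -> max ~%.0fGB per object\n", maxObjectSizeGB(50))
}
```

If I remember correctly, SFTPGo exposes this as an upload part size setting in the S3 filesystem configuration of the virtual folder, so no SDK-level changes are needed; please double-check the exact option name against the current docs.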
-
Hi
I'm evaluating SFTPGo as a possible SFTP server for the company I work for.
So far it looks like a very good product, and it's obvious a lot of work has gone into getting SFTPGo to this level, so well done.
I have SFTPGo installed on an Ubuntu VM with a 256GB disk.
I have some virtual folders mapped to an S3 bucket for testing.
I uploaded a 1GB file to S3 successfully.
I then tried to upload a 50GB file, but after ~30 minutes I got an error: "Error saving file "/lm/large_file_50GB": write /tmp/pipefile2731556240: file already closed"
Any ideas?