aws s3 sync skips many files with error "File does not exist." #3514
Yes, same issue on Windows 2012 R2 with CLI version 1.16 (PS C:\> aws --version). Tried syncing a file to S3 and got the error below: PS C:\> aws s3 sync test.conf s3://XXXX. The cp command works, whereas the sync command says the file does not exist.
@Typel - Thanks for reaching out. I am investigating this issue and trying to reproduce the same results.
Currently, the best way to capture a debug log to a file is to redirect stderr to a file (#642):
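The stderr redirect can be sketched as follows. This is a hypothetical stand-in (it runs a small command that writes a fixed line to stderr, so the sketch works without the AWS CLI installed); with the real CLI the equivalent shell form is `aws s3 sync <src> <dst> --debug 2> sync-debug.log`.

```python
import subprocess
import sys

# Stand-in command that writes one line to stderr, mimicking where the
# AWS CLI sends its --debug output. With the real CLI you would run:
#   aws s3 sync C:\resources\ s3://backup --debug 2> sync-debug.log
cmd = [sys.executable, "-c", "import sys; sys.stderr.write('debug: example line\\n')"]

# Open a log file and point the child process's stderr at it.
with open("sync-debug.log", "w") as log:
    subprocess.run(cmd, stderr=log, check=True)

with open("sync-debug.log") as log:
    print(log.read().strip())  # -> debug: example line
```

The same pattern works for any noisy CLI: anything the child writes to stderr lands in the file instead of scrolling past the console buffer.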
I found this issue appears to be related to #1082. We have also seen the "File does not exist" error when there are permission restrictions. Please reply and let us know if this is still an issue.
Hello @justnance thanks for taking the time to look into this issue. I just went ahead and updated to the latest aws-cli version and ran the command you specified and then searched the debug log for the same error. In short, it is still occurring quite often. I'm thinking maybe it has to do with long folder/filenames since the affected files seem to be those with long names. Unfortunately, I can't just rename the files since we are using this for backing up an existing system as-is.
Some more examples of the error from the resulting log:
@Typel - Thanks for the update and additional information. I'm looking into this one again and will post another update soon.
I've been getting the same error message with
The same thing is happening to me. The sync was working yesterday, and now today every file is getting skipped. I copied and pasted the local path into Windows Explorer and the file is definitely there and opens. I had left the sync running overnight, and at some point it just stopped working and started skipping all the files.
Note: when I run it from a different computer, it works! I closed all command prompt windows and tried again; it still skips all files on this server. I cannot really reboot the server, since there are a lot of other things running on it. Is there any process I can kill? It almost seems like the AWS CLI has left something in memory?
Hi, I'm getting the same error on one of my servers at the moment. I'll investigate further and come back.
Same problem for me: multiple 'File does not exist' warnings on my Win Server 2012 R2, plus a few 'File has an invalid timestamp. Passing epoch time as timestamp.' warnings. UPDATE: got another warning, this time 'File/Directory is not readable.'
I was able to reboot the server and the problem was resolved. Before the reboot, I also tried Cloudberry Explorer and it worked for a time, then started throwing Out of Memory errors instantly when I tried to copy a file. There were some registry changes suggested; perhaps some of those helped and didn't take effect until the reboot. All I know is, I did a sync after the reboot and it finished eventually.
Solution (answer): Yes, the same problem was occurring for me as well, i.e. the 'File does not exist' warning.
I also have something like this problem. It appears the filter isn't strict enough, since I can't explain why it would even try to sync the files it complains about at all - it shouldn't. Above you can see it is complaining about files that shouldn't match the --include condition. I'd love any pointers on this. My goal is to use sync instead of cp because sync will only copy if there are changes, and usually in my case there will be many files like 0001-9999.
It does appear that this (the original issue here) was caused by a 260-character path limitation imposed by Windows. Microsoft only recently lifted this limitation in modern versions, although it requires a registry hack to unlock. Basically, as aws-cli syncs up to S3, if any of the local files are in folders with full paths longer than 260 characters, Windows acts as if the file just isn't there. Note that this limitation is imposed locally, since the S3 max path is 1024 characters.

For anyone still looking for a workaround: you will need to either upgrade the client machine to an operating system that can handle long paths (or unlock long paths in Windows 10), or rename folders so that all paths being synced are < 260 characters long. If my suspicion is correct and the issue is based on a limitation of the operating system, I think an outright fix is most likely outside the scope of this project.
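To check whether the 260-character limit is the culprit before running a sync, a small scan of the local tree can list every path that would trip it. This is a hedged sketch: the `find_long_paths` helper and the 260 threshold are assumptions based on the classic Windows MAX_PATH limit, not part of the AWS CLI.

```python
import os

MAX_WIN_PATH = 260  # classic Windows MAX_PATH limit (assumed threshold)

def find_long_paths(root, limit=MAX_WIN_PATH):
    """Yield absolute paths under `root` whose full length meets or exceeds `limit`."""
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            full = os.path.abspath(os.path.join(dirpath, name))
            if len(full) >= limit:
                yield full

# Example: report likely offenders before running `aws s3 sync`.
for path in find_long_paths("."):
    print(len(path), path)
```

Anything this prints would be a candidate for the "File does not exist" skip on a pre-long-path Windows machine, and a candidate for renaming or restructuring.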
I have the same issue when executing the command on an Ubuntu machine.
@ppulusu I believe the max filename length is 255 characters for most filesystems in Ubuntu, and the max path length is 4096 characters. S3 itself imposes a 1024-character limit (although I'm not certain whether that is path-inclusive or just for the filename itself). If you run a sync in debug mode you should be able to examine the resulting log and see if the offending files fall outside any of those limitations.
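On a POSIX system those filesystem limits can be queried directly rather than assumed. A minimal sketch (assuming Linux, where `os.pathconf` is available; ext4 typically reports 255 for filename components):

```python
import os

# Query the filesystem limits for the root mount (POSIX-only API).
name_max = os.pathconf("/", "PC_NAME_MAX")  # longest single filename component
path_max = os.pathconf("/", "PC_PATH_MAX")  # longest relative pathname

print("NAME_MAX:", name_max)
print("PATH_MAX:", path_max)
```

Comparing the lengths of the skipped files' names and full paths against these numbers (and against S3's 1024-character key limit) narrows down which limit, if any, is being hit.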
Hi, I get many warnings like these, so no idea why I get this warning.
I have the same problem, and it is a Windows problem: it cannot read paths longer than 255 characters.
I hit this error because I was invoking the sync command incorrectly. I was using: aws s3 sync ./bundle-11500000.tar.gz s3://duhaime/ip-data-lab/bundle-11500000.tar.gz. I moved all my bundles into a directory and used: aws s3 sync bundles s3://duhaime/ip-data-lab/bundles/
The aws s3 sync command skips many files, complaining "File does not exist." even though the file does exist (I checked).

I thought it might be caused by unconventional or long filenames, but even counting the full path they still come in well under the 1024-character limit imposed by S3 (the longest was around 300 characters, though the filename itself was only 80). Furthermore, none of the skipped files have any strange characters in their names; they are basic alphanumeric. Some filenames had spaces, some had a combination of dashes and/or underscores, and all had a 3-letter extension (such as .pdf).

I also compared file security settings and ownership between files that did and did not transfer from within the same folder; they are identical.
Steps to Reproduce:
aws s3 sync C:\resources\ s3://backup --delete --debug
warning: Skipping file C:\resources\director\National Data\Activity and Expense Report\2011\November\ATA-R Report 11-12-2011\ATA A-E Report 11-12-11 Parking Fees.pdf. File does not exist.
More Debug Info
Below is some of the debug messaging produced just before and just after the errors in question:
Please let me know if there is anything else I can/should provide. I would have pasted more of the log here, but Windows has a pretty short command-line buffer. Is there maybe a way to redirect the --debug output to a file?