
Corrupted cache failing the setup-node step #459

Closed
2 of 5 tasks
mrlubos opened this issue Apr 9, 2022 · 5 comments
Labels
bug Something isn't working

Comments


mrlubos commented Apr 9, 2022

Description:
The setup step started failing in my CI pipeline, and re-running all jobs does not fix it. It first failed about 3 days ago and appears to affect all pipelines using this action. The log output I receive is:

Run actions/setup-node@v3.0.0
Resolved .nvmrc as 16.14.2
Found in cache @ /opt/hostedtoolcache/node/16.14.2/x64
/usr/local/bin/yarn --version
1.22.18
/usr/local/bin/yarn cache dir
/home/runner/.cache/yarn/v6
Received 50331648 of 1297180391 (3.9%), 48.0 MBs/sec
Received 243269632 of 1297180391 (18.8%), 115.9 MBs/sec
Received 436207616 of 1297180391 (33.6%), 138.6 MBs/sec
Received 595591168 of 1297180391 (45.9%), 141.9 MBs/sec
Received 759169024 of 1297180391 (58.5%), 144.7 MBs/sec
Received 918552576 of 1297180391 (70.8%), 145.9 MBs/sec
Received 1077936128 of 1297180391 (83.1%), 146.7 MBs/sec
Received 1233125376 of 1297180391 (95.1%), 146.8 MBs/sec
Received 1297180391 of 1297180391 (100.0%), 110.7 MBs/sec
Cache Size: ~1237 MB (1297180391 B)
/usr/bin/tar --use-compress-program zstd -d -xf /home/runner/work/_temp/ec086b0d-4c1b-4c8c-8ffa-8a4f608e92ef/cache.tzst -P -C /home/runner/work/project
/*stdin*\ : Decoding error (36) : Corrupted block detected 
/usr/bin/tar: Unexpected EOF in archive
/usr/bin/tar: Unexpected EOF in archive
/usr/bin/tar: Error is not recoverable: exiting now
Error: Tar failed with error: The process '/usr/bin/tar' failed with exit code 2
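
For context, the failing tar command is restoring the yarn cache that setup-node downloads when its `cache` input is enabled. A hedged workaround sketch (the step layout below is illustrative, not the reporter's actual workflow) is to temporarily disable cache restoration so a corrupted entry cannot fail the job:

```yaml
# Sketch of a temporary workaround: with the `cache` input removed,
# setup-node skips cache restore entirely, so a corrupted cache entry
# cannot break the step. Dependencies are reinstalled from scratch
# on each run until the bad entry expires or is replaced.
steps:
  - uses: actions/checkout@v3
  - uses: actions/setup-node@v3.0.0
    with:
      node-version-file: '.nvmrc'
      # cache: 'yarn'   # disabled while the stored cache entry is corrupted
```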

Action version:
v3.0.0

Platform:

  • Ubuntu
  • macOS
  • Windows

Runner type:

  • Hosted
  • Self-hosted

Tools version:

Repro steps:
A description with steps to reproduce the issue. If you have a public example or repo to share, please provide the link.

Expected behavior:
A description of what you expected to happen.

Actual behavior:
A description of what is actually happening.

@mrlubos mrlubos added bug Something isn't working needs triage labels Apr 9, 2022
@panticmilos
Contributor

Hi @mrlubos,

We will investigate this issue further.


mrlubos commented Apr 11, 2022

Thank you @panticmilos! The issue has disappeared for me now. One of the runs produced the result described in #453, so it didn't try to restore the cache and the job passed.
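
The cache-miss behaviour described above can also be forced deliberately. A hedged sketch, assuming the yarn cache were managed with the separate actions/cache action (the `-v2` key suffix is an arbitrary illustration): changing the key guarantees a miss, so the old entry is never restored:

```yaml
# Illustrative only: bumping the key (here with a `-v2` suffix) forces a
# cache miss, so the old, possibly corrupted entry is never downloaded.
# The path matches the yarn cache dir reported in the log above.
- uses: actions/cache@v3
  with:
    path: /home/runner/.cache/yarn/v6
    key: yarn-${{ runner.os }}-v2-${{ hashFiles('**/yarn.lock') }}
```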

@panticmilos
Contributor

Hello @mrlubos,

No problem. We suspect an issue with the cache is causing this, and there is already an open issue for that matter.

That is why I will close this issue now. If you have any questions or concerns, please feel free to leave a comment on this thread or open another issue.
