include extraction in the retrying for submodule download
The code to download larger submodules previously used retries around
the `curl` invocation to handle network failures, but recent build
failures showed that the process can also fail during extraction, for
example if a response got terminated early.

This commit moves the retry outwards, wrapping the whole
download+extraction function in the retrying code. This means that if
the extraction fails, the tarball will be re-downloaded.
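
For context, the `retry` helper that wraps the function is defined elsewhere in the CI scripts; the sketch below shows the general shape such a wrapper takes (the attempt count and sleep interval are illustrative assumptions, not the CI's actual values):

```bash
#!/usr/bin/env bash
# Minimal sketch of a retry wrapper; not the CI's actual implementation.
retry() {
    local max_attempts=5   # assumed limit
    local attempt=1
    until "$@"; do
        if [ "${attempt}" -ge "${max_attempts}" ]; then
            echo "command failed after ${max_attempts} attempts: $*" >&2
            return 1
        fi
        attempt=$((attempt + 1))
        sleep 3   # assumed pause between attempts
    done
}
```

Note that `retry` must be a shell function rather than an external command for this commit to work: `fetch_github_commit_archive` is itself a shell function, and only code running in the same shell can invoke it. Wrapping the function instead of just `curl` means any non-zero exit, whether from the download or from `tar`, triggers a fresh attempt.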
pietroalbini committed Jan 10, 2023
commit 48291f1 (parent: 0442fba)
Showing 1 changed file with 5 additions and 3 deletions.
src/ci/scripts/checkout-submodules.sh
```diff
@@ -23,15 +23,17 @@ fi
 function fetch_github_commit_archive {
     local module=$1
     local cached="download-${module//\//-}.tar.gz"
-    retry sh -c "rm -f $cached && \
-        curl -f -sSL -o $cached $2"
+    rm -f "${cached}"
+    rm -rf "${module}"
+    curl -f -sSL -o "${cached}" "$2"
     mkdir $module
     touch "$module/.git"
     # On Windows, the default behavior is to emulate symlinks by copying
     # files. However, that ends up being order-dependent while extracting,
     # which can cause a failure if the symlink comes first. This env var
     # causes tar to use real symlinks instead, which are allowed to dangle.
     export MSYS=winsymlinks:nativestrict
+    mkdir -p "${module}"
     tar -C $module --strip-components=1 -xf $cached
     rm $cached
 }
@@ -50,7 +52,7 @@ for i in ${!modules[@]}; do
         git rm $module
         url=${urls[$i]}
         url=${url/\.git/}
-        fetch_github_commit_archive $module "$url/archive/$commit.tar.gz" &
+        retry fetch_github_commit_archive $module "$url/archive/$commit.tar.gz" &
         bg_pids[${i}]=$!
         continue
     else
```
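
Retrying the whole function only works because every attempt starts from a clean slate, which is what the cleanup lines hoisted into the function body provide (the annotations here are illustrative, not from the commit):

```bash
rm -f "${cached}"    # discard a truncated tarball left by a failed attempt
rm -rf "${module}"   # discard a half-extracted module directory
curl -f -sSL -o "${cached}" "$2"   # then download again from scratch
```

The `mkdir -p "${module}"` added before `tar` serves the same goal: unlike plain `mkdir`, it succeeds whether or not the directory already exists, so a repeated attempt cannot trip over leftovers from the previous one.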
