merge upstream changes (#8)
* hypre 2.26.0 (spack#33299)

* Github Discussions can be used for Q&A (spack#33364)

* Support spackbot rebuilding all specs from source (spack#32596)

Support spackbot rebuilding all specs from source when asked (with "rebuild everything")

- Allow overriding --prune-dag cli opt with env var
- Use job variable to optionally prevent rebuild jobs early exit behavior
- ci rebuild: Use new install argument to insist deps are always installed from binary, but
package is only installed from source.
- gitlab: fix bug w/ untouched pruning
- ci rebuild: install from hash rather than json file
- When doing a "rebuild everything" pipeline, make sure that each install job only consumes
binary dependencies from the mirror being populated by the current pipeline.  This avoids
using, e.g. binaries from develop, when rebuilding everything on a PR.
- When running a pipeline to rebuild everything, do not die just because a generated hash
appears on the broken specs list.  Instead only warn in that case.
- bugfix: Replace broken no-args tty.die() with sys.exit(1)

* python: add 3.10.7, 3.9.14, 3.8.14, 3.7.14 (spack#32623)

* Add checksum for py-secretstorage 3.3.3 (spack#33366)

* Add checksum for py-ipykernel 6.15.2 (spack#33360)

* GnuPG: add v2.3.8 and update stack (spack#33368)

* installer.py: traverse_dependencies has local deptype (spack#33367)

Currently `traverse_dependencies` fixes the deptypes to traverse once and
for all at the start of the recursion, but this is incorrect, since deptypes
depend on the node (e.g. if a dependency is installed cache-only, don't
follow its build-type edges, even if the parent is built from source and
needs its build deps.)
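The node-dependent traversal rule can be sketched as follows (illustrative names only, not Spack's actual API):

```python
def edges_to_follow(deptypes, node_cache_only):
    """Illustrative sketch: which dependency edges are worth traversing
    depends on the *node* being visited, not on a deptype set fixed at
    the start of the recursion.  If a node will be installed cache-only
    (from a binary), its build edges need not be followed, even when an
    ancestor builds from source and does need its own build deps."""
    if node_cache_only:
        return [d for d in deptypes if d != "build"]
    return list(deptypes)
```

For example, a cache-only dependency keeps only its link and run edges, while a from-source parent keeps all of them.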

* py-horovod: add v0.26 (spack#33311)

* py-horovod: add v0.26

* py-petastorm: add v0.12.0

* database: don't warn adding missing build deps (spack#33361)

When installing an individual spec `spack install --only=package --cache-only /xyz`
from a buildcache, Spack currently issues tons of warnings about
missing build deps (and their deps) in the database.

This PR disables these warnings, since it's fine to have a spec without
its build deps in the db (they are just "missing").

* Classic Intel compilers do not support gcc-toolchain (spack#33281)

* Classic Intel compilers do not support gcc-toolchain

This fix removes `--gcc-toolchain=` from the `.cfg` files for the classic Intel
compilers. AFAIK this option is only supported for Clang-based compilers.

This led to an issue when installing cmake. Reproducer:
```
spack install cmake@3.24.2%intel@2021.7.0~doc+ncurses+ownlibs~qt
build_type=Release arch=linux-amzn2-skylake_avx512
```

Tagging maintainer @rscohn2

* Add `-gcc-name` for icc

.. and `-gxx-name` for icpc.

AFAIK this is used for modern C++ support, so we can ignore `ifort`.

Co-authored-by: Stephen Sachs <stesachs@amazon.com>

* py-xopen: version bump to 1.6.0 (spack#33231)

* version bump to 1.6.0

* added py-isal, updated URL

* Initial contribution of LibPressio ecosystem (spack#32630)

* Add libpressio and dependencies; some of these packages are
  maintained as forks of the original repositories and in those
  cases the docstring mentions this.
* Add optional dependency in adios2 on libpressio
* cub package: set CUB_DIR environment variable for dependent
  installations
* Clear R_HOME/R_ENVIRON before Spack installation (avoid sources
  outside of Spack from affecting the installation in Spack)
* Rename dlib to dorian3d-dlib and update dependents; add new dlib
  implementation. Pending an official policy on how to handle
  packages with short names, reviewer unilaterally decided that
  the rename was acceptable given that the new Spack dlib package
  is referenced more widely (by orders of magnitude) than the
  original

Co-authored-by: Samuel Li <shaomeng@users.noreply.github.com>

* New packages: libbigwig, methyldackel (spack#33273)

* libbigwig: adding new package libbigwig
* methyldackel: adding new package methyldackel
* libbigwig: tighten up curl variant

* grid: reference `fftw-api` instead of `fftw` (spack#33374)

This makes it possible to compile with, e.g., `cray-fftw`, not just `fftw`.

* Bugfix HIP and aluminum rocm build (spack#33344)

* Fixed two bugs in the HIP package recipe.  The first is that the
HIP_PATH was being set to the actual spec, and not the spec prefix.

The second bug is that HIP is expected to be in /opt/rocm-x.y.z/hip
but its libraries can exist at both /opt/rocm-x.y.z/hip/lib and
/opt/rocm-x.y.z/lib.  This means that the external detection logic may
find it in either and it turns out that some modules only expose one
of those two locations.  Logic is added to ensure that the internal
HIP_PATH and associated ROCM_PATH are correctly set in both cases.
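The two-layout handling can be sketched like this (hypothetical helper, not the actual recipe code):

```python
import os


def rocm_paths_from_hip_prefix(hip_prefix):
    """Given a detected HIP prefix, return (hip_path, rocm_path).

    HIP may be found at <rocm>/hip (libraries in <rocm>/hip/lib) or
    directly under <rocm> (libraries in <rocm>/lib), depending on which
    layout the environment module exposes.  Both cases must yield a
    consistent HIP_PATH and ROCM_PATH.
    """
    if os.path.basename(hip_prefix) == "hip":
        # Detected the nested layout: ROCM_PATH is one level up.
        return hip_prefix, os.path.dirname(hip_prefix)
    # Detected the flat layout: HIP_PATH is the hip subdirectory.
    return os.path.join(hip_prefix, "hip"), hip_prefix
```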

* Added support for Aluminum to use the libfabric plugin with either
RCCL or NCCL.

* lz4: add 1.9.4 (spack#33334)

* e4s ci stack: add trilinos +rocm (spack#31601)

* py-fiona: add v1.8.22 (spack#33372)

* intel-oneapi-compilers: fix Python 2.7 compliance (spack#33383)

* build(deps): bump docker/setup-buildx-action from 2.1.0 to 2.2.0 (spack#33384)

Bumps [docker/setup-buildx-action](https://github.com/docker/setup-buildx-action) from 2.1.0 to 2.2.0.
- [Release notes](https://github.com/docker/setup-buildx-action/releases)
- [Commits](docker/setup-buildx-action@95cb08c...c74574e)

---
updated-dependencies:
- dependency-name: docker/setup-buildx-action
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* mothur: add v1.48.0 and variants (spack#33326)

* vsearch: add v2.22.1 (spack#33327)

* papi: fix for Intel OneAPI compiler (spack#33225)

Without this patch one hits this error trying to compile papi with Intel OneAPI:

icx: error: Note that use of '-g' without any optimization-level option will turn off most compiler optimizations similar to use of '-O0' [-Werror,-Wdebug-disables-optimization]

Signed-off-by: Howard Pritchard <howardp@lanl.gov>


* go,gcc: Support external go compilers for Go bootstrap (spack#27769)

For ARM64, fall back to gccgo. (go-bootstrap@1.4 can't support ARM64)

* Reusable --use-buildcache with better validation (spack#33388)

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Docs: Spack info option updates (spack#33376)

* intel-oneapi-compilers: do not pass -Wno-unused-command-line-argument to icc + refactor (spack#33389)

* spack checksum: warn if version is deprecated (spack#32438)

Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>

* Update the binary index before attempting direct fetches (spack#32137)

"spack install" will not update the binary index if given a concrete
spec, which causes it to fall back to direct fetches when a simple
index update would have helped. For S3 buckets in particular, this
significantly and needlessly slows down the install process.

This commit alters the logic so that the binary index is updated
whenever a by-hash lookup fails. The lookup is attempted again with
the updated index before falling back to direct fetches. To avoid
updating too frequently (potentially once for each spec being
installed), BinaryCacheIndex.update now includes a "cooldown"
option, and when this option is enabled it will not update more
than once in a cooldown window (set in config.yaml).

Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
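The cooldown idea can be sketched in isolation (illustrative class and method names, not Spack's real `BinaryCacheIndex` API):

```python
import time


class CooldownFetcher:
    """Sketch of a TTL "cooldown" on index fetches: skip re-fetching a
    mirror's index if it was fetched less than `ttl` seconds ago, so a
    single Spack invocation doesn't update once per spec installed."""

    def __init__(self, ttl=600):
        self.ttl = ttl  # seconds; cf. config:binary_index_ttl
        self._last_fetch_times = {}  # mirror url -> last fetch time

    def should_fetch(self, url, now=None):
        now = time.time() if now is None else now
        last = self._last_fetch_times.get(url)
        if last is not None and now - last < self.ttl:
            return False  # still inside the cooldown window
        self._last_fetch_times[url] = now
        return True
```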

* Relocate "run" type dependencies too (spack#33191)

When downloading from binary cache not only replace RPATHs to dependencies, but
also text references to dependencies.

Example:
`autoconf@2.69` contains a text reference to the executable of its dependency
`perl`:

```
$ grep perl-5 /shared/spack/opt/spack/linux-amzn2-x86_64_v3/gcc-7.3.1/autoconf-2.69-q3lo/bin/autoreconf
eval 'case $# in 0) exec /shared/spack/opt/spack/linux-amzn2-x86_64_v3/gcc-7.3.1/perl-5.34.1-yphg/bin/perl -S "$0";; *) exec /shared/spack/opt/spack/linux-amzn2-x86_64_v3/gcc-7.3.1/perl-5.34.1-yphg/bin/perl -S "$0" "$@";; esac'
```

These references need to be replaced, or any package using `autoreconf` will fail
because it cannot find the installed `perl`.

Co-authored-by: Stephen Sachs <stesachs@amazon.com>
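The core of the text relocation is a prefix-for-prefix substitution, roughly like this (a minimal sketch of the idea; Spack's real relocation additionally handles binaries, padding, and sbang):

```python
def relocate_text(data, prefix_map):
    """Replace every old install prefix in a text file's contents with
    its new location, e.g. rewriting the hardcoded path to a run-type
    dependency like perl inside autoreconf."""
    for old_prefix, new_prefix in prefix_map.items():
        data = data.replace(old_prefix, new_prefix)
    return data
```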

* Add a command to bootstrap Spack right now (spack#33407)

* axom: python only reliably available when +python, +devtools (spack#33414)

* pilercr: new package (spack#33251)

* new package
* fixed style
* actually building now

* raja +rocm: use hipcc as CMAKE_CXX_COMPILER (spack#33375)

Signed-off-by: dependabot[bot] <support@github.com>
Signed-off-by: Howard Pritchard <howardp@lanl.gov>
Co-authored-by: Rui Peng Li <li50@llnl.gov>
Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
Co-authored-by: Carlos Bederián <carlos.bederian@unc.edu.ar>
Co-authored-by: iarspider <iarspider@gmail.com>
Co-authored-by: Massimiliano Culpo <massimiliano.culpo@gmail.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Stephen Sachs <stephenmsachs@gmail.com>
Co-authored-by: Stephen Sachs <stesachs@amazon.com>
Co-authored-by: Robert Underwood <robertu94@users.noreply.github.com>
Co-authored-by: Samuel Li <shaomeng@users.noreply.github.com>
Co-authored-by: snehring <7978778+snehring@users.noreply.github.com>
Co-authored-by: Mosè Giordano <giordano@users.noreply.github.com>
Co-authored-by: Brian Van Essen <vanessen1@llnl.gov>
Co-authored-by: Michael Kuhn <michael.kuhn@ovgu.de>
Co-authored-by: eugeneswalker <38933153+eugeneswalker@users.noreply.github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Howard Pritchard <howardp@lanl.gov>
Co-authored-by: Bernhard Kaindl <43588962+bernhardkaindl@users.noreply.github.com>
Co-authored-by: Tamara Dahlgren <35777542+tldahlgren@users.noreply.github.com>
Co-authored-by: Robert Cohn <robert.s.cohn@intel.com>
Co-authored-by: Jonathon Anderson <17242663+blue42u@users.noreply.github.com>
1 parent f34150f commit 69d3dfb
Showing 92 changed files with 2,105 additions and 212 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/build-containers.yml
@@ -89,7 +89,7 @@ jobs:
uses: docker/setup-qemu-action@8b122486cedac8393e77aa9734c3528886e4a1a8 # @v1

- name: Set up Docker Buildx
uses: docker/setup-buildx-action@95cb08cb2672c73d4ffd2f422e6d11953d2a9c70 # @v1
uses: docker/setup-buildx-action@c74574e6c82eeedc46366be1b0d287eff9085eb6 # @v1

- name: Log in to GitHub Container Registry
uses: docker/login-action@f4ef78c080cd8ba55a85445d5b36e214a81df20a # @v1
1 change: 1 addition & 0 deletions README.md
@@ -62,6 +62,7 @@ Resources:

* **Slack workspace**: [spackpm.slack.com](https://spackpm.slack.com).
To get an invitation, visit [slack.spack.io](https://slack.spack.io).
* [**Github Discussions**](https://github.com/spack/spack/discussions): not just for discussions, also Q&A.
* **Mailing list**: [groups.google.com/d/forum/spack](https://groups.google.com/d/forum/spack)
* **Twitter**: [@spackpm](https://twitter.com/spackpm). Be sure to
`@mention` us!
4 changes: 4 additions & 0 deletions etc/spack/defaults/config.yaml
@@ -201,3 +201,7 @@ config:
# building and installing packages. This gives information about Spack's
# current progress as well as the current and total number of packages.
terminal_title: false

# Number of seconds a buildcache's index.json is cached locally before probing
# for updates, within a single Spack invocation. Defaults to 10 minutes.
binary_index_ttl: 600
3 changes: 2 additions & 1 deletion etc/spack/defaults/packages.yaml
@@ -27,7 +27,8 @@ packages:
fuse: [libfuse]
gl: [glx, osmesa]
glu: [mesa-glu, openglu]
golang: [gcc]
golang: [go, gcc]
go-external-or-gccgo-bootstrap: [go-bootstrap, gcc]
iconv: [libiconv]
ipp: [intel-ipp]
java: [openjdk, jdk, ibm-java]
2 changes: 1 addition & 1 deletion lib/spack/docs/basic_usage.rst
@@ -85,7 +85,7 @@ All packages whose names or descriptions contain documentation:
To get more information on a particular package from `spack list`, use
`spack info`. Just supply the name of a package:

.. command-output:: spack info mpich
.. command-output:: spack info --all mpich

Most of the information is self-explanatory. The *safe versions* are
versions that Spack knows the checksum for, and it will use the
2 changes: 1 addition & 1 deletion lib/spack/docs/build_systems/inteloneapipackage.rst
@@ -32,7 +32,7 @@ oneAPI packages or use::

For more information on a specific package, do::

spack info <package-name>
spack info --all <package-name>

Intel no longer releases new versions of Parallel Studio, which can be
used in Spack via the :ref:`intelpackage`. All of its components can
2 changes: 2 additions & 0 deletions lib/spack/docs/conf.py
@@ -206,6 +206,8 @@ def setup(sphinx):
# Spack classes that are private and we don't want to expose
("py:class", "spack.provider_index._IndexBase"),
("py:class", "spack.repo._PrependFileLoader"),
# Spack classes that intersphinx is unable to resolve
("py:class", "spack.version.VersionBase"),
]

# The reST default role (used for this markup: `text`) to use for all documents.
33 changes: 25 additions & 8 deletions lib/spack/docs/packaging_guide.rst
@@ -3369,27 +3369,44 @@ The name and order in which the phases will be executed can be obtained either reading the
docs at :py:mod:`~.spack.build_systems`, or using the ``spack info`` command:

.. code-block:: console
:emphasize-lines: 13,14
:emphasize-lines: 26-27
$ spack info m4
AutotoolsPackage: m4
Homepage: https://www.gnu.org/software/m4/m4.html
$ spack info --phases m4
AutotoolsPackage: m4
Description:
GNU M4 is an implementation of the traditional Unix macro processor.
Homepage: https://www.gnu.org/software/m4/m4.html
Preferred version:
1.4.19 https://ftpmirror.gnu.org/m4/m4-1.4.19.tar.gz
Safe versions:
1.4.17 ftp://ftp.gnu.org/gnu/m4/m4-1.4.17.tar.gz
1.4.19 https://ftpmirror.gnu.org/m4/m4-1.4.19.tar.gz
1.4.18 https://ftpmirror.gnu.org/m4/m4-1.4.18.tar.gz
1.4.17 https://ftpmirror.gnu.org/m4/m4-1.4.17.tar.gz
Deprecated versions:
None
Variants:
Name Default Description
Name [Default] When Allowed values Description
============== ==== ============== ===============================
sigsegv on Build the libsigsegv dependency
sigsegv [on] -- on, off Build the libsigsegv dependency
Installation Phases:
autoreconf configure build install
Build Dependencies:
diffutils gnuconfig libsigsegv
Link Dependencies:
libsigsegv
...
Run Dependencies:
None
Typically, phases have default implementations that fit most of the common cases:
57 changes: 40 additions & 17 deletions lib/spack/spack/binary_distribution.py
@@ -12,6 +12,7 @@
import sys
import tarfile
import tempfile
import time
import traceback
import warnings
from contextlib import closing
@@ -105,6 +106,9 @@ def __init__(self, cache_root):
# cache (_mirrors_for_spec)
self._specs_already_associated = set()

# mapping from mirror urls to the time.time() of the last index fetch.
self._last_fetch_times = {}

# _mirrors_for_spec is a dictionary mapping DAG hashes to lists of
# entries indicating mirrors where that concrete spec can be found.
# Each entry is a dictionary consisting of:
@@ -137,6 +141,7 @@ def clear(self):
self._index_file_cache = None
self._local_index_cache = None
self._specs_already_associated = set()
self._last_fetch_times = {}
self._mirrors_for_spec = {}

def _write_local_index_cache(self):
@@ -242,7 +247,6 @@ def find_built_spec(self, spec, mirrors_to_check=None):
}
]
"""
self.regenerate_spec_cache()
return self.find_by_hash(spec.dag_hash(), mirrors_to_check=mirrors_to_check)

def find_by_hash(self, find_hash, mirrors_to_check=None):
@@ -253,6 +257,9 @@ def find_by_hash(self, find_hash, mirrors_to_check=None):
mirrors_to_check: Optional mapping containing mirrors to check. If
None, just assumes all configured mirrors.
"""
if find_hash not in self._mirrors_for_spec:
# Not found in the cached index, pull the latest from the server.
self.update(with_cooldown=True)
if find_hash not in self._mirrors_for_spec:
return None
results = self._mirrors_for_spec[find_hash]
@@ -283,7 +290,7 @@ def update_spec(self, spec, found_list):
"spec": new_entry["spec"],
}

def update(self):
def update(self, with_cooldown=False):
"""Make sure local cache of buildcache index files is up to date.
If the same mirrors are configured as the last time this was called
and none of the remote buildcache indices have changed, calling this
@@ -325,24 +332,37 @@ def update(self):

fetch_errors = []
all_methods_failed = True
ttl = spack.config.get("config:binary_index_ttl", 600)
now = time.time()

for cached_mirror_url in self._local_index_cache:
cache_entry = self._local_index_cache[cached_mirror_url]
cached_index_hash = cache_entry["index_hash"]
cached_index_path = cache_entry["index_path"]
if cached_mirror_url in configured_mirror_urls:
# May need to fetch the index and update the local caches
try:
needs_regen = self._fetch_and_cache_index(
cached_mirror_url, expect_hash=cached_index_hash
)
# Only do a fetch if the last fetch was longer than TTL ago
if (
with_cooldown
and ttl > 0
and cached_mirror_url in self._last_fetch_times
and now - self._last_fetch_times[cached_mirror_url] < ttl
):
# The fetch worked last time, so don't error
all_methods_failed = False
except FetchCacheError as fetch_error:
needs_regen = False
fetch_errors.extend(fetch_error.errors)
# The need to regenerate implies a need to clear as well.
spec_cache_clear_needed |= needs_regen
spec_cache_regenerate_needed |= needs_regen
else:
# May need to fetch the index and update the local caches
try:
needs_regen = self._fetch_and_cache_index(
cached_mirror_url, expect_hash=cached_index_hash
)
self._last_fetch_times[cached_mirror_url] = now
all_methods_failed = False
except FetchCacheError as fetch_error:
needs_regen = False
fetch_errors.extend(fetch_error.errors)
# The need to regenerate implies a need to clear as well.
spec_cache_clear_needed |= needs_regen
spec_cache_regenerate_needed |= needs_regen
else:
# No longer have this mirror, cached index should be removed
items_to_remove.append(
@@ -351,6 +371,8 @@
"cache_key": os.path.join(self._index_cache_root, cached_index_path),
}
)
if cached_mirror_url in self._last_fetch_times:
del self._last_fetch_times[cached_mirror_url]
spec_cache_clear_needed = True
spec_cache_regenerate_needed = True

@@ -369,6 +391,7 @@
# Need to fetch the index and update the local caches
try:
needs_regen = self._fetch_and_cache_index(mirror_url)
self._last_fetch_times[mirror_url] = now
all_methods_failed = False
except FetchCacheError as fetch_error:
fetch_errors.extend(fetch_error.errors)
@@ -698,7 +721,7 @@ def write_buildinfo_file(spec, workdir, rel=False):
prefix_to_hash = dict()
prefix_to_hash[str(spec.package.prefix)] = spec.dag_hash()
deps = spack.build_environment.get_rpath_deps(spec.package)
for d in deps:
for d in deps + spec.dependencies(deptype="run"):
prefix_to_hash[str(d.prefix)] = d.dag_hash()

# Create buildinfo data and write it to disk
@@ -1451,7 +1474,7 @@ def relocate_package(spec, allow_root):
hash_to_prefix = dict()
hash_to_prefix[spec.format("{hash}")] = str(spec.package.prefix)
new_deps = spack.build_environment.get_rpath_deps(spec.package)
for d in new_deps:
for d in new_deps + spec.dependencies(deptype="run"):
hash_to_prefix[d.format("{hash}")] = str(d.prefix)
# Spurious replacements (e.g. sbang) will cause issues with binaries
# For example, the new sbang can be longer than the old one.
@@ -1878,8 +1901,8 @@ def get_mirrors_for_spec(spec=None, mirrors_to_check=None, index_only=False):

results = binary_index.find_built_spec(spec, mirrors_to_check=mirrors_to_check)

# Maybe we just didn't have the latest information from the mirror, so
# try to fetch directly, unless we are only considering the indices.
# The index may be out-of-date. If we aren't only considering indices, try
# to fetch directly since we know where the file should be.
if not results and not index_only:
results = try_direct_fetch(spec, mirrors=mirrors_to_check)
# We found a spec by the direct fetch approach, we might as well
2 changes: 2 additions & 0 deletions lib/spack/spack/build_environment.py
@@ -201,6 +201,8 @@ def clean_environment():

env.unset("CMAKE_PREFIX_PATH")
env.unset("PYTHONPATH")
env.unset("R_HOME")
env.unset("R_ENVIRON")

# Affects GNU make, can e.g. indirectly inhibit enabling parallel build
# env.unset('MAKEFLAGS')
34 changes: 30 additions & 4 deletions lib/spack/spack/ci.py
@@ -12,6 +12,7 @@
import shutil
import stat
import subprocess
import sys
import tempfile
import time
import zipfile
@@ -626,11 +627,11 @@ def generate_gitlab_ci_yaml(
cdash_handler = CDashHandler(yaml_root.get("cdash")) if "cdash" in yaml_root else None
build_group = cdash_handler.build_group if cdash_handler else None

prune_untouched_packages = os.environ.get("SPACK_PRUNE_UNTOUCHED", None)
if prune_untouched_packages:
prune_untouched_packages = False
spack_prune_untouched = os.environ.get("SPACK_PRUNE_UNTOUCHED", None)
if spack_prune_untouched is not None and spack_prune_untouched.lower() == "true":
# Requested to prune untouched packages, but assume we won't do that
# unless we're actually in a git repo.
prune_untouched_packages = False
rev1, rev2 = get_change_revisions()
tty.debug("Got following revisions: rev1={0}, rev2={1}".format(rev1, rev2))
if rev1 and rev2:
@@ -646,6 +647,14 @@
for s in affected_specs:
tty.debug(" {0}".format(s.name))

# Allow overriding --prune-dag cli opt with environment variable
prune_dag_override = os.environ.get("SPACK_PRUNE_UP_TO_DATE", None)
if prune_dag_override is not None:
prune_dag = True if prune_dag_override.lower() == "true" else False

# If we are not doing any kind of pruning, we are rebuilding everything
rebuild_everything = not prune_dag and not prune_untouched_packages

# Downstream jobs will "need" (depend on, for both scheduling and
# artifacts, which include spack.lock file) this pipeline generation
# job by both name and pipeline id. If those environment variables
@@ -1298,6 +1307,8 @@ def generate_gitlab_ci_yaml(
"SPACK_LOCAL_MIRROR_DIR": rel_local_mirror_dir,
"SPACK_PIPELINE_TYPE": str(spack_pipeline_type),
"SPACK_CI_STACK_NAME": os.environ.get("SPACK_CI_STACK_NAME", "None"),
"SPACK_REBUILD_CHECK_UP_TO_DATE": str(prune_dag),
"SPACK_REBUILD_EVERYTHING": str(rebuild_everything),
}

if remote_mirror_override:
@@ -1357,7 +1368,9 @@ def generate_gitlab_ci_yaml(
if known_broken_specs_encountered:
tty.error("This pipeline generated hashes known to be broken on develop:")
display_broken_spec_messages(broken_specs_url, known_broken_specs_encountered)
tty.die()

if not rebuild_everything:
sys.exit(1)

with open(output_file, "w") as outf:
outf.write(syaml.dump_config(sorted_output, default_flow_style=True))
@@ -1575,6 +1588,19 @@ def push_mirror_contents(env, specfile_path, mirror_url, sign_binaries):
raise inst


def remove_other_mirrors(mirrors_to_keep, scope=None):
"""Remove all mirrors from the given config scope, the exceptions being
any listed in mirrors_to_keep, which is a list of mirror urls.
"""
mirrors_to_remove = []
for name, mirror_url in spack.config.get("mirrors", scope=scope).items():
if mirror_url not in mirrors_to_keep:
mirrors_to_remove.append(name)

for mirror_name in mirrors_to_remove:
spack.mirror.remove(mirror_name, scope)


def copy_files_to_artifacts(src, artifacts_dir):
"""
Copy file(s) to the given artifacts directory
12 changes: 12 additions & 0 deletions lib/spack/spack/cmd/bootstrap.py
@@ -5,6 +5,7 @@
from __future__ import print_function

import os.path
import platform
import shutil
import tempfile

@@ -73,6 +74,8 @@ def _add_scope_option(parser):
def setup_parser(subparser):
sp = subparser.add_subparsers(dest="subcommand")

sp.add_parser("now", help="Spack ready, right now!")

status = sp.add_parser("status", help="get the status of Spack")
status.add_argument(
"--optional",
@@ -404,6 +407,14 @@ def write_metadata(subdir, metadata):
print(instructions)


def _now(args):
with spack.bootstrap.ensure_bootstrap_configuration():
spack.bootstrap.ensure_clingo_importable_or_raise()
spack.bootstrap.ensure_gpg_in_path_or_raise()
if platform.system().lower() == "linux":
spack.bootstrap.ensure_patchelf_in_path_or_raise()


def bootstrap(parser, args):
callbacks = {
"status": _status,
@@ -417,5 +428,6 @@ def bootstrap(parser, args):
"add": _add,
"remove": _remove,
"mirror": _mirror,
"now": _now,
}
callbacks[args.subcommand](args)