merge with upstream (#7)
* py-execnet: 1.9.0 (spack#33282)

* py-execnet: 1.9.0

* bounds

* add version 1.12.3 of parallel-netcdf (spack#33286)

* Add checksum for py-psutil 5.9.2 (spack#33139)

* gitlab ci: Print better information about broken specs (spack#33124)

When a pipeline generation job is automatically failed because it
generated jobs for specs known to be broken on develop, print better
information about the broken specs that were encountered.  Include
at a minimum the hash and the url of the job whose failure caused it
to be put on the broken specs list in the first place.
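As a sketch of the reporting described above — at minimum the spec's hash plus the URL of the job that put it on the broken-specs list — the message for one entry could be assembled like this. The helper name and parameters are illustrative, not Spack's actual code:

```python
# Illustrative helper (not Spack's implementation): build the
# human-readable line for one entry on the broken-specs list, given at
# minimum the spec's hash and the URL of the job that broke it.
def describe_broken_spec(spec_hash, job_url, job_name=None, stack_name=None):
    # Prefix with the job/package name and a short hash when available.
    item = "{0}/{1}".format(job_name, spec_hash[:7]) if job_name else spec_hash
    if stack_name:
        item = "{0} (in stack {1})".format(item, stack_name)
    return "{0} was reported broken here: {1}".format(item, job_url)
```

Optional fields degrade gracefully: with no job name the bare hash is shown, mirroring the minimum information the commit message promises.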

* meson: remove slash in path (spack#33292)

* Add checksum for py-gitpython 3.1.27 (spack#33285)

* Add checksum for py-gitpython 3.1.27

* Update package.py

* Update var/spack/repos/builtin/packages/py-gitpython/package.py

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>

* UPC++/GASNet-EX 2022.9.0 update (spack#33277)

* gasnet: Add new release hash
* upcxx: Add new release hash
* gasnet: misc updates
* upcxx: misc updates

* Fix pika@0.9.0 sha (spack#33307)

* hip@5.2.0 onwards: set prefix properly (spack#33257)

* hip-set-prefix-rocm5.2.0-onwards

* Update var/spack/repos/builtin/packages/hip/package.py

Update description

Co-authored-by: Satish Balay <balay@mcs.anl.gov>

* petsc,py-petsc4py,slepc,py-slepc4py: add version 3.18.0 (spack#32938)

* petsc,py-petsc4py,slepc,py-slepc4py: add version 3.18.0

* workaround for dealii build failure [with petsc version check]

* pism: add compatibility fix for petsc@3.18

* add in hipsolver dependency

* py-libensemble: updating package for v0.9.3 (spack#33298)

* commit updating py-libensemble package for 0.9.3

* removed commented-out lines

* Add checksum for py-oauthlib 3.2.1 (spack#33201)

* Add checksum for py-oauthlib 3.2.1

* Update package.py

* [@spackbot] updating style on behalf of iarspider

* Update package.py

* Update package.py

Co-authored-by: iarspider <iarspider@users.noreply.github.com>

* ninja: New version 1.11.1 (spack#33215)

* seacas: update to latest release (spack#33330)

Add checksum for latest tag/release

* gptl: new version 8.1.1; use the correct mpi fortran compiler (spack#33235)

* gptl: new version 8.1.1; use the correct `mpifc`
* add `F90` and `$F77`

* glib: add 2.74.0 and 2.72.4 (spack#33332)

* depfile: update docs (spack#33279)

* rocksdb: add 7.7.3 (spack#33341)

* Add checksum for py-seaborn 0.12.0 (spack#33145)

* Add checksum for py-seaborn 0.12.0

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of iarspider

* Update package.py

Co-authored-by: iarspider <iarspider@users.noreply.github.com>

* CI: allow multiple matches to combine tags (spack#32290)

Currently "spack ci generate" chooses the first matching entry in
gitlab-ci:mappings to fill attributes for a generated build-job,
requiring that the entire configuration matrix is listed out
explicitly. This unfortunately causes significant problems in
environments with large configuration spaces, for example the
environment in spack#31598 (spack.yaml) supports 5 operating systems,
3 architectures and 130 packages with explicit size requirements,
resulting in 1300 lines of configuration YAML.

This patch adds a configuration option to the gitlab-ci schema called
"match_behavior"; when it is set to "merge", all matching entries
are applied in order to the final build-job, allowing a few entries
to cover an entire matrix of configurations.

The default for "match_behavior" is "first", which behaves as before
this commit (only the runner attributes of the first match are used).

In addition, match entries may now include a "remove-attributes"
configuration, which allows matches to remove tags that have been
aggregated by prior matches. This only makes sense to use with
"match_behavior:merge". You can combine "runner-attributes" with
"remove-attributes" to effectively override prior tags.
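A minimal Python sketch of these semantics (illustrative names only; the real logic lives in Spack's `_find_matching_config`): each matching entry first strips any tags listed under "remove-attributes", then merges in its "runner-attributes", and with "match_behavior: first" processing stops at the first match:

```python
# Illustrative sketch of "match_behavior" semantics, not Spack's code.
def resolve_runner_attributes(spec_features, mappings, match_behavior="first"):
    """Fold matching mapping entries into a single runner-attributes dict."""
    result = {"tags": []}
    matched = False
    for mapping in mappings:
        # Stand-in for spec.satisfies(match_string) in real Spack.
        if not any(m in spec_features for m in mapping["match"]):
            continue
        matched = True
        # "remove-attributes" strips tags aggregated by earlier matches.
        for tag in mapping.get("remove-attributes", {}).get("tags", []):
            while tag in result["tags"]:
                result["tags"].remove(tag)
        result["tags"].extend(mapping.get("runner-attributes", {}).get("tags", []))
        if match_behavior == "first":
            break  # legacy behavior: only the first match contributes
    return result if matched else None
```

With "first", only the earliest matching entry's tags survive; with "merge", later entries can both add tags and override earlier ones via "remove-attributes".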

* meson: update OneAPI compiler support patch (spack#33293)

* py-tensorflow: fix zlib (spack#33349)

* py-tensorflow: fix zlib

* [@spackbot] updating style on behalf of haampie

Co-authored-by: haampie <haampie@users.noreply.github.com>

* py-meson-python: add new versions (spack#33294)

* tasmanian: disable openmp by default (spack#33345)

* octopus: upgrade to 12.1 (spack#33343)

* py-sphinx: add v5.3 and v5.2 (spack#33356)

* py-setuptools: add v65.5.0 (spack#33353)

* libblastrampoline: Add versions 5.1.1, 5.2.0 (spack#33352)

* nextflow: add v20.10.0 (spack#33354)

* sdl2: add v2.0.22 and v2.24.1 (spack#33351)

* mariadb-c-client: add 3.3.2, 3.2.7, 3.1.18, 3.0.10 (spack#33335)

* py-tensorflow-hub: zlib, again. (spack#33359)

* Add checksum for py-sniffio 1.3.0 (spack#32975)

* py-numpy: add v1.23.4 (spack#33260)

* py-jupyterlab-pygments: install from wheel to avoid cyclic dependency (spack#33278)

* py-jupyterlab-pygments: avoid cyclic dependency

* Fix style

* Update package.py

* Update package.py

* [@spackbot] updating style on behalf of iarspider

* Update package.py

* Flake-8

* fix

Co-authored-by: iarspider <iarspider@users.noreply.github.com>

* installer.py: show timers for binary install (spack#33305)

Print a message of the form
```
Fetch: mm:ss.  Build: mm:ss.  Total: mm:ss
```
when installing from buildcache. 

Previously this only happened for source builds.
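For illustration only (this helper is hypothetical, not the code from installer.py), elapsed seconds can be rendered in the mm:ss shape used by that message:

```python
# Hypothetical helper: format elapsed seconds as mm:ss, the shape used
# in the timer message above.
def mmss(seconds):
    minutes, secs = divmod(int(seconds), 60)
    return "{0:02d}:{1:02d}".format(minutes, secs)

# e.g. assemble the full line for a 12 s fetch and a 95 s build
line = "Fetch: {0}.  Build: {1}.  Total: {2}".format(mmss(12), mmss(95), mmss(12 + 95))
```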

* Add checksum for py-astroid 2.12.7, py-astroid 2.12.10, py-setuptools 62.6.0, py-wrapt 1.14.1, py-pylint 2.15.0 (spack#32976)

* Add checksum for py-astroid 2.12.7, py-setuptools 62.6.0

* Also add checksum for py-wrapt

* Update package.py

* Update package.py (spack#57)

* Update package.py

* Update package.py

* Update package.py

* gitlab ci: Do not force protected build jobs to run on aws runners (spack#33314)

* installer.py: fix/test get_deptypes (spack#33363)

Fixing an oversight in spack#32537

`get_deptypes` should depend on new `package/dependencies_cache_only`
props.

* Add checksum for py-grpcio-tools 1.48.1 (spack#33358)

* Add checksum for py-prompt-toolkit 3.0.31 (spack#33362)

Co-authored-by: Harmen Stoppels <harmenstoppels@gmail.com>
Co-authored-by: Jim Edwards <jedwards@ucar.edu>
Co-authored-by: iarspider <iarspider@gmail.com>
Co-authored-by: Scott Wittenburg <scott.wittenburg@kitware.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
Co-authored-by: Dan Bonachea <dobonachea@lbl.gov>
Co-authored-by: Auriane R <48684432+aurianer@users.noreply.github.com>
Co-authored-by: eugeneswalker <38933153+eugeneswalker@users.noreply.github.com>
Co-authored-by: Satish Balay <balay@mcs.anl.gov>
Co-authored-by: John-Luke Navarro <33131245+jlnav@users.noreply.github.com>
Co-authored-by: iarspider <iarspider@users.noreply.github.com>
Co-authored-by: Erik Schnetter <schnetter@gmail.com>
Co-authored-by: Greg Sjaardema <gsjaardema@gmail.com>
Co-authored-by: WuK <i@wu-kan.cn>
Co-authored-by: Michael Kuhn <michael.kuhn@ovgu.de>
Co-authored-by: Jonathon Anderson <17242663+blue42u@users.noreply.github.com>
Co-authored-by: haampie <haampie@users.noreply.github.com>
Co-authored-by: Miroslav Stoyanov <30537612+mkstoyanov@users.noreply.github.com>
Co-authored-by: Hans Fangohr <fangohr@users.noreply.github.com>
Co-authored-by: Mosè Giordano <giordano@users.noreply.github.com>
Co-authored-by: Diego Alvarez <dialvarezs@gmail.com>
Co-authored-by: Wouter Deconinck <wdconinc@gmail.com>
1 parent bd90ff8 commit e552c78
Showing 71 changed files with 814 additions and 177 deletions.
46 changes: 37 additions & 9 deletions lib/spack/docs/environments.rst
@@ -986,7 +986,7 @@ A typical workflow is as follows:
spack env create -d .
spack -e . add perl
spack -e . concretize
spack -e . env depfile > Makefile
spack -e . env depfile -o Makefile
make -j64
This generates a ``Makefile`` from a concretized environment in the
@@ -999,7 +999,6 @@ load, even when packages are built in parallel.
By default the following phony convenience targets are available:

- ``make all``: installs the environment (default target);
- ``make fetch-all``: only fetch sources of all packages;
- ``make clean``: cleans files used by make, but does not uninstall packages.

.. tip::
@@ -1009,8 +1008,17 @@ By default the following phony convenience targets are available:
printed orderly per package install. To get synchronized output with colors,
use ``make -j<N> SPACK_COLOR=always --output-sync=recurse``.

The following advanced example shows how generated targets can be used in a
``Makefile``:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Specifying dependencies on generated ``make`` targets
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

An interesting question is how to include generated ``Makefile``\s in your own
``Makefile``\s. This comes up when you want to install an environment that provides
executables required in a command for a make target of your own.

The example below shows how to accomplish this: the ``env`` target specifies
the generated ``spack/env`` target as a prerequisite, meaning that the environment
gets installed and is available for use in the ``env`` target.

.. code:: Makefile
@@ -1036,11 +1044,10 @@ The following advanced example shows how generated targets can be used in a
include env.mk
endif
When ``make`` is invoked, it first "remakes" the missing include ``env.mk``
from its rule, which triggers concretization. When done, the generated target
``spack/env`` is available. In the above example, the ``env`` target uses this generated
target as a prerequisite, meaning that it can make use of the installed packages in
its commands.
This works as follows: when ``make`` is invoked, it first "remakes" the missing
include ``env.mk`` as there is a target for it. This triggers concretization of
the environment and makes spack output ``env.mk``. At that point the
generated target ``spack/env`` becomes available through ``include env.mk``.

As it is typically undesirable to remake ``env.mk`` as part of ``make clean``,
the include is conditional.
@@ -1051,3 +1058,24 @@ the include is conditional.
the ``--make-target-prefix`` flag and use the non-phony target
``<target-prefix>/env`` as prerequisite, instead of the phony target
``<target-prefix>/all``.

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Building a subset of the environment
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

The generated ``Makefile``\s contain install targets for each spec. Given the hash
of a particular spec, you can use the ``.install/<hash>`` target to install the
spec with its dependencies. There is also ``.install-deps/<hash>`` to *only* install
its dependencies. This can be useful when certain flags should only apply to
dependencies. Below we show a use case where a spec is installed with verbose
output (``spack install --verbose``) while its dependencies are installed silently:

.. code:: console

   $ spack env depfile -o Makefile --make-target-prefix my_env
   # Install dependencies in parallel, only show a log on error.
   $ make -j16 my_env/.install-deps/<hash> SPACK_INSTALL_FLAGS=--show-log-on-error
   # Install the root spec with verbose output.
   $ make -j16 my_env/.install/<hash> SPACK_INSTALL_FLAGS=--verbose
2 changes: 1 addition & 1 deletion lib/spack/spack/build_systems/meson.py
@@ -75,7 +75,7 @@ class MesonPackage(PackageBase):
@property
def archive_files(self):
"""Files to archive for packages based on Meson"""
return [os.path.join(self.build_directory, "meson-logs/meson-log.txt")]
return [os.path.join(self.build_directory, "meson-logs", "meson-log.txt")]

@property
def root_mesonlists_dir(self):
111 changes: 94 additions & 17 deletions lib/spack/spack/ci.py
@@ -4,6 +4,7 @@
# SPDX-License-Identifier: (Apache-2.0 OR MIT)

import base64
import codecs
import copy
import json
import os
@@ -396,6 +397,14 @@ def _spec_matches(spec, match_string):
return spec.satisfies(match_string)


def _remove_attributes(src_dict, dest_dict):
if "tags" in src_dict and "tags" in dest_dict:
# For 'tags', we remove any tags that are listed for removal
for tag in src_dict["tags"]:
while tag in dest_dict["tags"]:
dest_dict["tags"].remove(tag)


def _copy_attributes(attrs_list, src_dict, dest_dict):
for runner_attr in attrs_list:
if runner_attr in src_dict:
@@ -429,19 +438,23 @@ def _find_matching_config(spec, gitlab_ci):

_copy_attributes(overridable_attrs, gitlab_ci, runner_attributes)

ci_mappings = gitlab_ci["mappings"]
for ci_mapping in ci_mappings:
matched = False
only_first = gitlab_ci.get("match_behavior", "first") == "first"
for ci_mapping in gitlab_ci["mappings"]:
for match_string in ci_mapping["match"]:
if _spec_matches(spec, match_string):
matched = True
if "remove-attributes" in ci_mapping:
_remove_attributes(ci_mapping["remove-attributes"], runner_attributes)
if "runner-attributes" in ci_mapping:
_copy_attributes(
overridable_attrs, ci_mapping["runner-attributes"], runner_attributes
)
return runner_attributes
else:
return None
break
if matched and only_first:
break

return runner_attributes
return runner_attributes if matched else None


def _pkg_name_from_spec_label(spec_label):
@@ -865,7 +878,7 @@ def generate_gitlab_ci_yaml(
# For spack pipelines "public" and "protected" are reserved tags
tags = _remove_reserved_tags(tags)
if spack_pipeline_type == "spack_protected_branch":
tags.extend(["aws", "protected"])
tags.extend(["protected"])
elif spack_pipeline_type == "spack_pull_request":
tags.extend(["public"])

@@ -1021,9 +1034,7 @@ def generate_gitlab_ci_yaml(
continue

if broken_spec_urls is not None and release_spec_dag_hash in broken_spec_urls:
known_broken_specs_encountered.append(
"{0} ({1})".format(release_spec, release_spec_dag_hash)
)
known_broken_specs_encountered.append(release_spec_dag_hash)

# Only keep track of these if we are copying rebuilt cache entries
if spack_buildcache_copy:
@@ -1286,6 +1297,7 @@ def generate_gitlab_ci_yaml(
"SPACK_JOB_TEST_DIR": rel_job_test_dir,
"SPACK_LOCAL_MIRROR_DIR": rel_local_mirror_dir,
"SPACK_PIPELINE_TYPE": str(spack_pipeline_type),
"SPACK_CI_STACK_NAME": os.environ.get("SPACK_CI_STACK_NAME", "None"),
}

if remote_mirror_override:
@@ -1343,13 +1355,9 @@ def generate_gitlab_ci_yaml(
sorted_output = {"no-specs-to-rebuild": noop_job}

if known_broken_specs_encountered:
error_msg = (
"Pipeline generation failed due to the presence of the "
"following specs that are known to be broken in develop:\n"
)
for broken_spec in known_broken_specs_encountered:
error_msg += "* {0}\n".format(broken_spec)
tty.die(error_msg)
tty.error("This pipeline generated hashes known to be broken on develop:")
display_broken_spec_messages(broken_specs_url, known_broken_specs_encountered)
tty.die()

with open(output_file, "w") as outf:
outf.write(syaml.dump_config(sorted_output, default_flow_style=True))
@@ -2060,6 +2068,75 @@ def create_buildcache(**kwargs):
push_mirror_contents(env, json_path, pipeline_mirror_url, sign_binaries)


def write_broken_spec(url, pkg_name, stack_name, job_url, pipeline_url, spec_dict):
"""Given a url to write to and the details of the failed job, write an entry
in the broken specs list.
"""
tmpdir = tempfile.mkdtemp()
file_path = os.path.join(tmpdir, "broken.txt")

broken_spec_details = {
"broken-spec": {
"job-name": pkg_name,
"job-stack": stack_name,
"job-url": job_url,
"pipeline-url": pipeline_url,
"concrete-spec-dict": spec_dict,
}
}

try:
with open(file_path, "w") as fd:
fd.write(syaml.dump(broken_spec_details))
web_util.push_to_url(
file_path,
url,
keep_original=False,
extra_args={"ContentType": "text/plain"},
)
except Exception as err:
# If there is an S3 error (e.g., access denied or connection
# error), the first non boto-specific class in the exception
# hierarchy is Exception. Just print a warning and return
msg = "Error writing to broken specs list {0}: {1}".format(url, err)
tty.warn(msg)
finally:
shutil.rmtree(tmpdir)


def read_broken_spec(broken_spec_url):
"""Read data from broken specs file located at the url, return as a yaml
object.
"""
try:
_, _, fs = web_util.read_from_url(broken_spec_url)
except (URLError, web_util.SpackWebError, HTTPError):
tty.warn("Unable to read broken spec from {0}".format(broken_spec_url))
return None

broken_spec_contents = codecs.getreader("utf-8")(fs).read()
return syaml.load(broken_spec_contents)


def display_broken_spec_messages(base_url, hashes):
"""Fetch the broken spec file for each of the hashes under the base_url and
print a message with some details about each one.
"""
broken_specs = [(h, read_broken_spec(url_util.join(base_url, h))) for h in hashes]
for spec_hash, broken_spec in [tup for tup in broken_specs if tup[1]]:
details = broken_spec["broken-spec"]
if "job-name" in details:
item_name = "{0}/{1}".format(details["job-name"], spec_hash[:7])
else:
item_name = spec_hash

if "job-stack" in details:
item_name = "{0} (in stack {1})".format(item_name, details["job-stack"])

msg = " {0} was reported broken here: {1}".format(item_name, details["job-url"])
tty.msg(msg)


def run_standalone_tests(**kwargs):
"""Run stand-alone tests on the current spec.
39 changes: 9 additions & 30 deletions lib/spack/spack/cmd/ci.py
@@ -7,7 +7,6 @@
import os
import shutil
import sys
import tempfile

import llnl.util.filesystem as fs
import llnl.util.tty as tty
@@ -19,7 +18,6 @@
import spack.environment as ev
import spack.hash_types as ht
import spack.mirror
import spack.util.spack_yaml as syaml
import spack.util.url as url_util
import spack.util.web as web_util

@@ -285,6 +283,7 @@ def ci_rebuild(args):
spack_pipeline_type = get_env_var("SPACK_PIPELINE_TYPE")
remote_mirror_override = get_env_var("SPACK_REMOTE_MIRROR_OVERRIDE")
remote_mirror_url = get_env_var("SPACK_REMOTE_MIRROR_URL")
spack_ci_stack_name = get_env_var("SPACK_CI_STACK_NAME")

# Construct absolute paths relative to current $CI_PROJECT_DIR
ci_project_dir = get_env_var("CI_PROJECT_DIR")
@@ -547,34 +546,14 @@ def ci_rebuild(args):
dev_fail_hash = job_spec.dag_hash()
broken_spec_path = url_util.join(broken_specs_url, dev_fail_hash)
tty.msg("Reporting broken develop build as: {0}".format(broken_spec_path))
tmpdir = tempfile.mkdtemp()
empty_file_path = os.path.join(tmpdir, "empty.txt")

broken_spec_details = {
"broken-spec": {
"job-url": get_env_var("CI_JOB_URL"),
"pipeline-url": get_env_var("CI_PIPELINE_URL"),
"concrete-spec-dict": job_spec.to_dict(hash=ht.dag_hash),
}
}

try:
with open(empty_file_path, "w") as efd:
efd.write(syaml.dump(broken_spec_details))
web_util.push_to_url(
empty_file_path,
broken_spec_path,
keep_original=False,
extra_args={"ContentType": "text/plain"},
)
except Exception as err:
# If there is an S3 error (e.g., access denied or connection
# error), the first non boto-specific class in the exception
# hierarchy is Exception. Just print a warning and return
msg = "Error writing to broken specs list {0}: {1}".format(broken_spec_path, err)
tty.warn(msg)
finally:
shutil.rmtree(tmpdir)
spack_ci.write_broken_spec(
broken_spec_path,
job_spec_pkg_name,
spack_ci_stack_name,
get_env_var("CI_JOB_URL"),
get_env_var("CI_PIPELINE_URL"),
job_spec.to_dict(hash=ht.dag_hash),
)

# We generated the "spack install ..." command to "--keep-stage", copy
# any logs from the staging directory to artifacts now