climbfuji opened this issue on Oct 1, 2022 · 4 comments
Labels: Epic (for planning and administration), INFRA (JEDI Infrastructure), NAVY (United States Naval Research Lab), NOAA-EMC, OAR-EPIC (NOAA Oceanic and Atmospheric Research and Earth Prediction Innovation Center)
Is your feature request related to a problem? Please describe.
Installing spack environments is slow, because everything gets compiled from source.
Describe the solution you'd like
spack binary caches allow installed packages to be cached locally and reused whenever the (unique) hash for the package matches. This can greatly reduce the time to install environments, from hours to minutes, depending on how many packages have changed.
Additional context
The binary caches serve as a second local cache, in addition to the spack mirrors (see #364) to reduce network traffic and speed up the install process.
Test build caching based on 1.4.0 installations on the following systems (by creating the cache based on unified-env, install a new copy wherever, and time it):
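The test procedure described above can be sketched roughly as follows. The mirror name, cache path, and environment names are placeholders, and the exact buildcache subcommands and flags may differ between spack versions, so treat this as an illustrative outline rather than a verified recipe:

```shell
# Assumed path and mirror name -- adjust for the target system.
CACHE=file:///path/to/local/build-cache

# Register the local cache as a mirror and populate it from an
# existing unified-env installation:
spack env activate unified-env
spack mirror add local-cache "$CACHE"
spack buildcache push --unsigned local-cache

# On a fresh copy of the environment, time an install that pulls
# binaries from the cache instead of compiling from source:
spack env activate unified-env-copy
spack mirror add local-cache "$CACHE"
time spack install --cache-only --no-check-signature
```

Comparing the timed install against a from-source build of the same environment gives the speedup figure requested above.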
We made significant progress on this in the first quarter of 2023. We are now using (simplified versions of) build caches for the CI tests running on self-hosted GitHub runners, which brought the build time for the entire unified-dev environment down from over 6 hours to less than 30 minutes.
There are a few remaining issues related to package relocation and the use of `config:install_tree:padded_length:VALUE`. We currently avoid these in the CI tests by always building in the same directory (i.e., using the same environment name), so that we do not have to pad the install path when creating build caches.
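For reference, the padding mentioned above is set in spack's config scope with something like the following (the root path and length value here are illustrative). Padding the install prefix to a fixed length leaves room to rewrite paths when a cached binary is relocated into a shorter prefix:

```yaml
config:
  install_tree:
    root: /path/to/installs
    # Pad install prefixes to a fixed length so binaries built here
    # can be relocated into shorter paths when pulled from a cache.
    padded_length: 128
```

The trade-off is that padded paths make the install tree dependent on this setting, which is why always building under the same environment name sidesteps the issue in CI.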
I don't think it's realistic to have all of this fleshed out in time to roll it out on the HPCs and user systems, with a hierarchical tree of binary caches in place (local, upstream on S3, ...). I am therefore deferring completion of this task to the next quarter / spack-stack release 1.4.0 by the end of June 2023.
@ulmononian @mark-a-potts per our discussion today, can you test creating/using build caches on, say, Hera and Cheyenne, and add those systems to the list at the top of this issue? The test installation doesn't need to be in an official location for now, as I think the next step will be to nail down locations.
This is going to be addressed in the next few weeks as part of our automation efforts; the issue here covers essentially all supported platforms, not just the CI runner.