
Install CUDA and CUDA-Samples via the bot #381

Closed · 13 commits

Conversation


@ocaisa ocaisa commented Nov 9, 2023

Requires #368


eessi-bot bot commented Nov 9, 2023

Instance eessi-bot-mc-aws is configured to build:

  • arch x86_64/generic for repo eessi-hpc.org-2023.06-compat
  • arch x86_64/generic for repo eessi-hpc.org-2023.06-software
  • arch x86_64/intel/haswell for repo eessi-hpc.org-2023.06-compat
  • arch x86_64/intel/haswell for repo eessi-hpc.org-2023.06-software
  • arch x86_64/intel/skylake_avx512 for repo eessi-hpc.org-2023.06-compat
  • arch x86_64/intel/skylake_avx512 for repo eessi-hpc.org-2023.06-software
  • arch x86_64/amd/zen2 for repo eessi-hpc.org-2023.06-compat
  • arch x86_64/amd/zen2 for repo eessi-hpc.org-2023.06-software
  • arch x86_64/amd/zen3 for repo eessi-hpc.org-2023.06-compat
  • arch x86_64/amd/zen3 for repo eessi-hpc.org-2023.06-software
  • arch aarch64/generic for repo eessi-hpc.org-2023.06-compat
  • arch aarch64/generic for repo eessi-hpc.org-2023.06-software
  • arch aarch64/neoverse_n1 for repo eessi-hpc.org-2023.06-compat
  • arch aarch64/neoverse_n1 for repo eessi-hpc.org-2023.06-software
  • arch aarch64/neoverse_v1 for repo eessi-hpc.org-2023.06-compat
  • arch aarch64/neoverse_v1 for repo eessi-hpc.org-2023.06-software

@ocaisa ocaisa mentioned this pull request Nov 9, 2023

ocaisa commented Nov 9, 2023

Unfortunately this is tripping over the issue with removing software from the overlay:

  File "/home/rocky/software-layer/eb_hooks.py", line 380, in post_package_cuda
    os.remove(source)
PermissionError: [Errno 13] Permission denied: '/cvmfs/pilot.eessi-hpc.org/versions/2023.06/software/linux/x86_64/intel/skylake_avx512/software/CUDA/12.1.1/version.json'
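The traceback above comes from an unconditional os.remove() in the post-package hook. A minimal sketch of how such a cleanup step could tolerate read-only paths (the function name and surrounding logic are assumptions, not the actual eb_hooks.py code):

```python
import os


def remove_if_possible(path):
    """Try to remove a file, tolerating read-only overlays.

    Sketch only: the real eb_hooks.py logic may differ. Returns True if
    the file is gone afterwards, False if removal was denied.
    """
    try:
        os.remove(path)
        return True
    except PermissionError:
        # e.g. files living on a read-only CVMFS mount (Errno 13)
        print(f"WARNING: could not remove {path} (permission denied), skipping")
        return False
    except FileNotFoundError:
        # already gone, nothing left to do
        return True
```

Logging and skipping (rather than crashing) would let the rest of the packaging step complete even when the overlay refuses the removal.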

Co-authored-by: Kenneth Hoste <kenneth.hoste@ugent.be>
@boegel boegel changed the base branch from 2023.06 to pilot.eessi-hpc.org-2023.06 November 21, 2023 21:20
Co-authored-by: Kenneth Hoste <kenneth.hoste@ugent.be>

ocaisa commented Nov 29, 2023

To actually test the final build you will need to have the drivers in place. The snippet below should work, or at least give you an idea of what is required (PR in the works):

# Change directory to a location under host_injections
mkdir -p /cvmfs/pilot.eessi-hpc.org/host_injections/nvidia/host
cd /cvmfs/pilot.eessi-hpc.org/host_injections/nvidia/host

# Gather libraries on the host (_must_ be host ldconfig)
ldconfig -p | awk '{print $NF}' > libs.txt
# Allow for the fact that we may be in a container
ls /.singularity.d/libs/* >> libs.txt

# Link the relevant libraries
curl -O https://raw.githubusercontent.com/apptainer/apptainer/main/etc/nvliblist.conf
grep '.so$' nvliblist.conf | xargs -i grep {} libs.txt | xargs -i ln -s {}
# Record the host NVIDIA driver version in the dir
nvidia-smi --query-gpu=driver_version --format=csv,noheader | tail -n1 > version.txt

# Make latest symlink for NVIDIA drivers
cd ..
ln -s host latest

# Make sure the libraries can be found by the EESSI linker
source /cvmfs/pilot.eessi-hpc.org/versions/2023.06/init/bash
host_injection_linker_dir=${EESSI_EPREFIX/versions/host_injections}
mkdir -p $host_injection_linker_dir
cd $host_injection_linker_dir
ln -s /cvmfs/pilot.eessi-hpc.org/host_injections/nvidia/latest lib 
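The library-linking step above can be wrapped in a small function so it is easier to test in isolation. This is a sketch under assumptions: the function name is made up, and the escaped dot in the grep pattern (`\.so$`) tightens the original `'.so$'`, which would also match any character before "so".

```shell
#!/bin/bash
# Sketch: link every host library matching a name from nvliblist.conf into
# the current directory. 'nvliblist.conf' lists library names (one per
# line), 'libs.txt' lists full paths found on the host.
link_nvidia_libs() {
    local conf="$1" libs="$2"
    # For every *.so name in the config, symlink every matching host path
    # here, using the path's basename as the link name (as 'ln -s' does).
    grep '\.so$' "$conf" | while read -r name; do
        grep "$name" "$libs" | while read -r path; do
            ln -sf "$path" "$(basename "$path")"
        done
    done
}
```

Run from inside the host_injections target directory, e.g. `link_nvidia_libs nvliblist.conf libs.txt`.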


boegel commented Nov 29, 2023

function find_host_ldconfig () {
    if [ ! -z "${EESSI_HOST_LDCONFIG}" ]; then
        echo "${EESSI_HOST_LDCONFIG}"
    else
        if [ -f /sbin/ldconfig ]; then
            echo /sbin/ldconfig
        elif [ -f /usr/sbin/ldconfig ]; then
            echo /usr/sbin/ldconfig
        else
            echo "This is weird, you should set EESSI_HOST_LDCONFIG (and open a support issue)" >&2
            exit 1
        fi
    fi
}

on system without CUDA:

$ nvidia_smi_out=$(nvidia-smi --query-gpu=driver_version --format=csv,noheader 2> nvidia-smi.errors)
$ echo $?
127
$ cat nvidia-smi.errors
-bash: nvidia-smi: command not found
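Building on that observation, detecting the missing command up front avoids the raw exit code 127. A sketch under assumptions (the function name and messages are made up, not the final PR implementation):

```shell
#!/bin/bash
# Sketch: report the NVIDIA driver version, failing gracefully when
# nvidia-smi is absent (the exit-code-127 scenario above).
get_driver_version() {
    if ! command -v nvidia-smi > /dev/null 2>&1; then
        echo "nvidia-smi not found; is the NVIDIA driver installed?" >&2
        return 1
    fi
    # Systems with multiple GPUs report one line per GPU; keep just one
    nvidia-smi --query-gpu=driver_version --format=csv,noheader | tail -n1
}
```

The caller can then branch on the return code instead of parsing stderr.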

@casparvl

I'm in the EESSI container and ran everything in #381 (comment) except the nvidia-smi. That still gives me:

 $ nvidia-smi
NVIDIA-SMI couldn't find libnvidia-ml.so library in your system. Please make sure that the NVIDIA Display Driver is properly installed and present in your system.
Please also try adding directory that contains libnvidia-ml.so to your system PATH.

I now have:

[EESSI pilot 2023.06] $ ls -al /cvmfs/pilot.eessi-hpc.org/host_injections/nvidia/latest
lrwxrwxrwx 1 casparl casparl 4 Nov 29 15:06 /cvmfs/pilot.eessi-hpc.org/host_injections/nvidia/latest -> host
[EESSI pilot 2023.06] $ ls -al /cvmfs/pilot.eessi-hpc.org/host_injections/nvidia/latest/
total 39
drwxr-x--- 2 casparl casparl  4096 Nov 29 15:06 .
drwxr-x--- 3 casparl casparl  4096 Nov 29 15:06 ..
lrwxrwxrwx 1 casparl casparl    30 Nov 29 15:06 libEGL.so -> /.singularity.d/libs/libEGL.so
lrwxrwxrwx 1 casparl casparl    32 Nov 29 15:06 libEGL.so.1 -> /.singularity.d/libs/libEGL.so.1
lrwxrwxrwx 1 casparl casparl    39 Nov 29 15:06 libEGL_nvidia.so.0 -> /.singularity.d/libs/libEGL_nvidia.so.0
lrwxrwxrwx 1 casparl casparl    29 Nov 29 15:06 libGL.so -> /.singularity.d/libs/libGL.so
lrwxrwxrwx 1 casparl casparl    31 Nov 29 15:06 libGL.so.1 -> /.singularity.d/libs/libGL.so.1
lrwxrwxrwx 1 casparl casparl    36 Nov 29 15:06 libGLESv1_CM.so -> /.singularity.d/libs/libGLESv1_CM.so
lrwxrwxrwx 1 casparl casparl    38 Nov 29 15:06 libGLESv1_CM.so.1 -> /.singularity.d/libs/libGLESv1_CM.so.1
lrwxrwxrwx 1 casparl casparl    45 Nov 29 15:06 libGLESv1_CM_nvidia.so.1 -> /.singularity.d/libs/libGLESv1_CM_nvidia.so.1
lrwxrwxrwx 1 casparl casparl    33 Nov 29 15:06 libGLESv2.so -> /.singularity.d/libs/libGLESv2.so
lrwxrwxrwx 1 casparl casparl    35 Nov 29 15:06 libGLESv2.so.2 -> /.singularity.d/libs/libGLESv2.so.2
lrwxrwxrwx 1 casparl casparl    42 Nov 29 15:06 libGLESv2_nvidia.so.2 -> /.singularity.d/libs/libGLESv2_nvidia.so.2
lrwxrwxrwx 1 casparl casparl    30 Nov 29 15:06 libGLX.so -> /.singularity.d/libs/libGLX.so
lrwxrwxrwx 1 casparl casparl    32 Nov 29 15:06 libGLX.so.0 -> /.singularity.d/libs/libGLX.so.0
lrwxrwxrwx 1 casparl casparl    39 Nov 29 15:06 libGLX_nvidia.so.0 -> /.singularity.d/libs/libGLX_nvidia.so.0
lrwxrwxrwx 1 casparl casparl    37 Nov 29 15:06 libGLdispatch.so -> /.singularity.d/libs/libGLdispatch.so
lrwxrwxrwx 1 casparl casparl    39 Nov 29 15:06 libGLdispatch.so.0 -> /.singularity.d/libs/libGLdispatch.so.0
lrwxrwxrwx 1 casparl casparl    35 Nov 29 15:06 libOpenCL.so.1 -> /.singularity.d/libs/libOpenCL.so.1
lrwxrwxrwx 1 casparl casparl    33 Nov 29 15:06 libOpenGL.so -> /.singularity.d/libs/libOpenGL.so
lrwxrwxrwx 1 casparl casparl    35 Nov 29 15:06 libOpenGL.so.0 -> /.singularity.d/libs/libOpenGL.so.0
lrwxrwxrwx 1 casparl casparl    31 Nov 29 15:06 libcuda.so -> /.singularity.d/libs/libcuda.so
lrwxrwxrwx 1 casparl casparl    33 Nov 29 15:06 libcuda.so.1 -> /.singularity.d/libs/libcuda.so.1
lrwxrwxrwx 1 casparl casparl    34 Nov 29 15:06 libnvcuvid.so -> /.singularity.d/libs/libnvcuvid.so
lrwxrwxrwx 1 casparl casparl    36 Nov 29 15:06 libnvcuvid.so.1 -> /.singularity.d/libs/libnvcuvid.so.1
lrwxrwxrwx 1 casparl casparl    37 Nov 29 15:06 libnvidia-cfg.so -> /.singularity.d/libs/libnvidia-cfg.so
lrwxrwxrwx 1 casparl casparl    39 Nov 29 15:06 libnvidia-cfg.so.1 -> /.singularity.d/libs/libnvidia-cfg.so.1
lrwxrwxrwx 1 casparl casparl    47 Nov 29 15:06 libnvidia-egl-wayland.so.1 -> /.singularity.d/libs/libnvidia-egl-wayland.so.1
lrwxrwxrwx 1 casparl casparl    52 Nov 29 15:06 libnvidia-eglcore.so.535.104.12 -> /.singularity.d/libs/libnvidia-eglcore.so.535.104.12
lrwxrwxrwx 1 casparl casparl    40 Nov 29 15:06 libnvidia-encode.so -> /.singularity.d/libs/libnvidia-encode.so
lrwxrwxrwx 1 casparl casparl    42 Nov 29 15:06 libnvidia-encode.so.1 -> /.singularity.d/libs/libnvidia-encode.so.1
lrwxrwxrwx 1 casparl casparl    37 Nov 29 15:06 libnvidia-fbc.so -> /.singularity.d/libs/libnvidia-fbc.so
lrwxrwxrwx 1 casparl casparl    39 Nov 29 15:06 libnvidia-fbc.so.1 -> /.singularity.d/libs/libnvidia-fbc.so.1
lrwxrwxrwx 1 casparl casparl    51 Nov 29 15:06 libnvidia-glcore.so.535.104.12 -> /.singularity.d/libs/libnvidia-glcore.so.535.104.12
lrwxrwxrwx 1 casparl casparl    49 Nov 29 15:06 libnvidia-glsi.so.535.104.12 -> /.singularity.d/libs/libnvidia-glsi.so.535.104.12
lrwxrwxrwx 1 casparl casparl    54 Nov 29 15:06 libnvidia-glvkspirv.so.535.104.12 -> /.singularity.d/libs/libnvidia-glvkspirv.so.535.104.12
lrwxrwxrwx 1 casparl casparl    49 Nov 29 15:06 libnvidia-gtk3.so.535.104.12 -> /.singularity.d/libs/libnvidia-gtk3.so.535.104.12
lrwxrwxrwx 1 casparl casparl    36 Nov 29 15:06 libnvidia-ml.so -> /.singularity.d/libs/libnvidia-ml.so
lrwxrwxrwx 1 casparl casparl    38 Nov 29 15:06 libnvidia-ml.so.1 -> /.singularity.d/libs/libnvidia-ml.so.1
lrwxrwxrwx 1 casparl casparl    42 Nov 29 15:06 libnvidia-opencl.so.1 -> /.singularity.d/libs/libnvidia-opencl.so.1
lrwxrwxrwx 1 casparl casparl    47 Nov 29 15:06 libnvidia-opticalflow.so.1 -> /.singularity.d/libs/libnvidia-opticalflow.so.1
lrwxrwxrwx 1 casparl casparl    48 Nov 29 15:06 libnvidia-ptxjitcompiler.so -> /.singularity.d/libs/libnvidia-ptxjitcompiler.so
lrwxrwxrwx 1 casparl casparl    50 Nov 29 15:06 libnvidia-ptxjitcompiler.so.1 -> /.singularity.d/libs/libnvidia-ptxjitcompiler.so.1
lrwxrwxrwx 1 casparl casparl    51 Nov 29 15:06 libnvidia-rtcore.so.535.104.12 -> /.singularity.d/libs/libnvidia-rtcore.so.535.104.12
lrwxrwxrwx 1 casparl casparl    48 Nov 29 15:06 libnvidia-tls.so.535.104.12 -> /.singularity.d/libs/libnvidia-tls.so.535.104.12
-rw-r----- 1 casparl casparl 10984 Nov 29 15:06 libs.txt
-rw-r----- 1 casparl casparl  1383 Nov 29 15:06 nvliblist.conf

@casparvl

Note that

$ LD_LIBRARY_PATH=/cvmfs/pilot.eessi-hpc.org/host_injections/nvidia/latest/:$LD_LIBRARY_PATH nvidia-smi
Wed Nov 29 15:15:16 2023
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.104.12             Driver Version: 535.104.12   CUDA Version: 12.2     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA A100-SXM4-40GB          On  | 00000000:31:00.0 Off |                  Off |
| N/A   29C    P0              50W / 400W |      4MiB / 40960MiB |      0%      Default |
|                                         |                      |             Disabled |
+-----------------------------------------+----------------------+----------------------+
|   1  NVIDIA A100-SXM4-40GB          On  | 00000000:32:00.0 Off |                  Off |
| N/A   29C    P0              50W / 400W |      4MiB / 40960MiB |      0%      Default |
|                                         |                      |             Disabled |
+-----------------------------------------+----------------------+----------------------+
|   2  NVIDIA A100-SXM4-40GB          On  | 00000000:CA:00.0 Off |                  Off |
| N/A   29C    P0              49W / 400W |      4MiB / 40960MiB |      0%      Default |
|                                         |                      |             Disabled |
+-----------------------------------------+----------------------+----------------------+
|   3  NVIDIA A100-SXM4-40GB          On  | 00000000:E3:00.0 Off |                  Off |
| N/A   28C    P0              49W / 400W |      4MiB / 40960MiB |      0%      Default |
|                                         |                      |             Disabled |
+-----------------------------------------+----------------------+----------------------+

+---------------------------------------------------------------------------------------+
| Processes:                                                                            |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=======================================================================================|
|  No running processes found                                                           |
+---------------------------------------------------------------------------------------+

Works fine. I'm a bit confused: how is our runtime linker supposed to pick up on things from /cvmfs/pilot.eessi-hpc.org/host_injections/nvidia/latest? Shouldn't we do something to make it look there?


ocaisa commented Nov 29, 2023

I updated the last line of the script in my previous comment; it should have been lib and not libs. Extracting the relevant part:

# Make sure the libraries can be found by the EESSI linker
source /cvmfs/pilot.eessi-hpc.org/versions/2023.06/init/bash
host_injection_linker_dir=${EESSI_EPREFIX/versions/host_injections}
mkdir -p $host_injection_linker_dir
cd $host_injection_linker_dir
ln -s /cvmfs/pilot.eessi-hpc.org/host_injections/nvidia/latest lib 

$host_injection_linker_dir/lib is a trusted directory for our linker (so any libraries found in there are added to the defaults)
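A quick sanity check along these lines could confirm that a given driver library is actually reachable through the trusted directory, following the symlink chain on the way. This helper is hypothetical (name and output format are assumptions):

```shell
#!/bin/bash
# Sketch: verify that a library resolves through the trusted lib dir.
# '-e' follows symlinks, so a dangling link in the chain also fails.
check_trusted_lib() {
    local dir="$1" lib="$2"
    if [ -e "${dir}/${lib}" ]; then
        echo "OK: ${lib} reachable via ${dir}"
    else
        echo "MISSING: ${lib} not found under ${dir}" >&2
        return 1
    fi
}
```

For example: `check_trusted_lib "$host_injection_linker_dir/lib" libcuda.so.1`.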

@casparvl

Yeah, I used your code from after your last edit, so that isn't the issue here:

$ ls -al $host_injection_linker_dir/lib
lrwxrwxrwx 1 casparl casparl 56 Nov 29 15:07 /cvmfs/pilot.eessi-hpc.org/host_injections/2023.06/compat/linux/x86_64/lib -> /cvmfs/pilot.eessi-hpc.org/host_injections/nvidia/latest

$host_injection_linker_dir/lib is a trusted directory for our linker (so any libraries found in there are added to the defaults)

Where / how is that arranged? (then I can check if for me somehow that is not correct)


ocaisa commented Nov 29, 2023

If you want to use nvidia-smi with our linker, you need to make a copy and set its interpreter:

cp /usr/bin/nvidia-smi .
patchelf --set-interpreter $EESSI_EPREFIX/lib64/ld-linux-x86-64.so.2 ./nvidia-smi
./nvidia-smi

You don't have to do that though, you can just load the CUDA-Samples module and try to run deviceQuery


casparvl commented Nov 29, 2023

Hm,

$ ld --verbose | grep SEARCH_DIR | tr -s ' ;' \\012
SEARCH_DIR("/cvmfs/pilot.eessi-hpc.org/versions/2023.06/compat/linux/x86_64/usr/x86_64-pc-linux-gnu/lib64")
SEARCH_DIR("/cvmfs/pilot.eessi-hpc.org/versions/2023.06/compat/linux/x86_64/usr/lib64/binutils/x86_64-pc-linux-gnu/2.4064")
SEARCH_DIR("/cvmfs/pilot.eessi-hpc.org/versions/2023.06/compat/linux/x86_64/usr/local/lib64")
SEARCH_DIR("/cvmfs/pilot.eessi-hpc.org/versions/2023.06/compat/linux/x86_64/lib64")
SEARCH_DIR("/cvmfs/pilot.eessi-hpc.org/versions/2023.06/compat/linux/x86_64/usr/lib64")
SEARCH_DIR("/cvmfs/pilot.eessi-hpc.org/versions/2023.06/compat/linux/x86_64/usr/x86_64-pc-linux-gnu/lib")
SEARCH_DIR("/cvmfs/pilot.eessi-hpc.org/versions/2023.06/compat/linux/x86_64/usr/lib64/binutils/x86_64-pc-linux-gnu/2.40")
SEARCH_DIR("/cvmfs/pilot.eessi-hpc.org/versions/2023.06/compat/linux/x86_64/usr/local/lib")
SEARCH_DIR("/cvmfs/pilot.eessi-hpc.org/versions/2023.06/compat/linux/x86_64/lib")
SEARCH_DIR("/cvmfs/pilot.eessi-hpc.org/versions/2023.06/compat/linux/x86_64/usr/lib")

and

[EESSI pilot 2023.06] $ echo $host_injection_linker_dir/lib
/cvmfs/pilot.eessi-hpc.org/host_injections/2023.06/compat/linux/x86_64/lib
$ echo $host_injection_linker_dir/lib
/cvmfs/pilot.eessi-hpc.org/host_injections/2023.06/compat/linux/x86_64/lib
[EESSI pilot 2023.06] $ ls $host_injection_linker_dir/lib
libEGL.so           libGLESv1_CM.so.1         libGLX.so.0         libOpenGL.so.0    libnvidia-cfg.so.1               libnvidia-fbc.so.1                 libnvidia-ml.so.1               libnvidia-tls.so.535.104.12
libEGL.so.1         libGLESv1_CM_nvidia.so.1  libGLX_nvidia.so.0  libcuda.so        libnvidia-egl-wayland.so.1       libnvidia-glcore.so.535.104.12     libnvidia-opencl.so.1           libs.txt
libEGL_nvidia.so.0  libGLESv2.so              libGLdispatch.so    libcuda.so.1      libnvidia-eglcore.so.535.104.12  libnvidia-glsi.so.535.104.12       libnvidia-opticalflow.so.1      nvliblist.conf
libGL.so            libGLESv2.so.2            libGLdispatch.so.0  libnvcuvid.so     libnvidia-encode.so              libnvidia-glvkspirv.so.535.104.12  libnvidia-ptxjitcompiler.so
libGL.so.1          libGLESv2_nvidia.so.2     libOpenCL.so.1      libnvcuvid.so.1   libnvidia-encode.so.1            libnvidia-gtk3.so.535.104.12       libnvidia-ptxjitcompiler.so.1
libGLESv1_CM.so     libGLX.so                 libOpenGL.so        libnvidia-cfg.so  libnvidia-fbc.so                 libnvidia-ml.so                    libnvidia-rtcore.so.535.104.12

Actually looks perfectly fine. It seems to be on the search path for the linker. But... of course nvidia-smi isn't using the linker from the compat layer:

[EESSI pilot 2023.06] $ readelf -a $(which nvidia-smi) | grep interp
  [ 1] .interp           PROGBITS         0000000000400238  00000238
      [Requesting program interpreter: /lib64/ld-linux-x86-64.so.2]

I guess for software that is actually built in the software layer, this wouldn't be an option.

@casparvl

If you want to use nvidia-smi with our linker, you need to make a copy and set its interpreter:

cp /usr/bin/nvidia-smi .
patchelf --set-interpreter $EESSI_EPREFIX/lib64/ld-linux-x86-64.so.2 ./nvidia-smi
./nvidia-smi

You don't have to do that though, you can just load the CUDA-Samples module and try to run deviceQuery

Yes, I came to this conclusion as well. Unfortunately, my writable overlay has disappeared (again), and with it all its contents. So no more deviceQuery available. This issue is really, really annoying XD I'll try to resume the container and see if that brings it back without having to rebuild...

@casparvl

Oh, by the way, it does mean that if your script is supposed to work in the EESSI container (and we probably want that), you'll have to copy nvidia-smi and patchelf its interpreter before you run it with

nvidia-smi --query-gpu=driver_version --format=csv,noheader | tail -n1 > version.txt


ocaisa commented Nov 29, 2023

No, not really, you can just do:

LD_LIBRARY_PATH=/.singularity/libs nvidia-smi --query-gpu=driver_version --format=csv,noheader | tail -n1 > version.txt


casparvl commented Nov 29, 2023

True, and much easier / more foolproof. I meant: just remember to include that support for running it in the container when you make the PR for it :)

Oh, and it seems the LD_LIBRARY_PATH needs to end in a colon (:):

[EESSI pilot 2023.06] $ LD_LIBRARY_PATH=/.singularity/libs nvidia-smi --query-gpu=driver_version --format=csv,noheader
NVIDIA-SMI couldn't find libnvidia-ml.so library in your system. Please make sure that the NVIDIA Display Driver is properly installed and present in your system.
Please also try adding directory that contains libnvidia-ml.so to your system PATH.
[EESSI pilot 2023.06] $ LD_LIBRARY_PATH=/.singularity/libs: nvidia-smi --query-gpu=driver_version --format=csv,noheader
535.104.12
535.104.12
535.104.12
535.104.12

@casparvl left a comment

LGTM! But we're hitting the annoying rate limit again on the CI... We'll have to wait for a bit before we try again, I guess :(


ocaisa commented Dec 2, 2023

This actually needs to be built by the bot, but it won't work without first using the script to install CUDA under host_injections.


ocaisa commented Dec 21, 2023

GPU support implemented with #434

@ocaisa ocaisa closed this Dec 21, 2023
TopRichard pushed a commit to TopRichard/bot-software-layer1 that referenced this pull request May 29, 2024