
Failed to pull singularity image #482

Closed
l0ka opened this issue Oct 28, 2020 · 12 comments
Labels
question Further information is requested

Comments

@l0ka

l0ka commented Oct 28, 2020

Hi, I'm experiencing an error related to Singularity images. I ran the pipeline as described in #481.
Here's the log:

executor >  pbs (4)
[98/20b915] process > INPUT_CHECK:SAMPLESHEET_CHE... [100%] 1 of 1, cached: 1 ✔
[6d/35856c] process > CAT_FASTQ (tumor_R1)           [100%] 1 of 1, cached: 1 ✔
[ca/abf284] process > FASTQC_UMITOOLS_TRIMGALORE:... [100%] 1 of 1, cached: 1 ✔
[0f/3f31b3] process > FASTQC_UMITOOLS_TRIMGALORE:... [100%] 1 of 1, cached: 1 ✔
[24/9fa24a] process > SORTMERNA (tumor_R1)           [100%] 1 of 1, cached: 1 ✔
[6f/4b3d46] process > ALIGN_STAR:STAR_ALIGN (tumo... [100%] 1 of 1, cached: 1 ✔
[64/800680] process > ALIGN_STAR:BAM_SORT_SAMTOOL... [100%] 1 of 1, cached: 1 ✔
[e4/9c4a76] process > ALIGN_STAR:BAM_SORT_SAMTOOL... [100%] 1 of 1, cached: 1 ✔
[d0/65f8df] process > ALIGN_STAR:BAM_SORT_SAMTOOL... [100%] 1 of 1, cached: 1 ✔
[83/c5e2e3] process > ALIGN_STAR:BAM_SORT_SAMTOOL... [100%] 1 of 1, cached: 1 ✔
[ee/87f4d3] process > ALIGN_STAR:BAM_SORT_SAMTOOL... [100%] 1 of 1, cached: 1 ✔
[-        ] process > MULTIQC_CUSTOM_FAIL_MAPPED     -
[-        ] process > PRESEQ_LCEXTRAP (tumor_R1)     -
[47/fd4a28] process > MARK_DUPLICATES_PICARD:PICA... [100%] 1 of 1, cached: 1 ✔
[-        ] process > MARK_DUPLICATES_PICARD:SAMT... -
[-        ] process > MARK_DUPLICATES_PICARD:BAM_... -
[-        ] process > MARK_DUPLICATES_PICARD:BAM_... -
[-        ] process > MARK_DUPLICATES_PICARD:BAM_... -
[-        ] process > STRINGTIE                      -
[-        ] process > SUBREAD_FEATURECOUNTS (tumo... -
[-        ] process > FEATURECOUNTS_MERGE_COUNTS     -
[-        ] process > DESEQ2_QC_FEATURECOUNTS        -
[-        ] process > SUBREAD_FEATURECOUNTS_BIOTY... -
[-        ] process > MULTIQC_CUSTOM_BIOTYPE         -
[98/bd1c69] process > GET_CHROM_SIZES (genome.fa)    [100%] 1 of 1, cached: 1 ✔
[-        ] process > BEDTOOLS_GENOMECOV             -
[-        ] process > UCSC_BEDGRAPHTOBIGWIG          -
[-        ] process > QUALIMAP_RNASEQ                -
[-        ] process > DUPRADAR                       -
[-        ] process > RSEQC:RSEQC_BAMSTAT            -
[-        ] process > RSEQC:RSEQC_INNERDISTANCE      -
[-        ] process > RSEQC:RSEQC_INFEREXPERIMENT    -
[-        ] process > RSEQC:RSEQC_JUNCTIONANNOTATION -
[-        ] process > RSEQC:RSEQC_JUNCTIONSATURATION -
[-        ] process > RSEQC:RSEQC_READDISTRIBUTION   -
[-        ] process > RSEQC:RSEQC_READDUPLICATION    -
[-        ] process > MULTIQC_CUSTOM_STRAND_CHECK    -
[9d/5ed2e5] process > QUANTIFY_SALMON:SALMON_QUAN... [100%] 1 of 1, cached: 1 ✔
[82/3d44d9] process > QUANTIFY_SALMON:SALMON_TX2G... [100%] 1 of 1, cached: 1 ✔
[e6/b6f1aa] process > QUANTIFY_SALMON:SALMON_TXIM... [100%] 1 of 1, cached: 1 ✔
[dd/716235] process > QUANTIFY_SALMON:SALMON_MERG... [100%] 1 of 1, cached: 1 ✔
[a0/ee9f13] process > QUANTIFY_SALMON:SALMON_SE_G... [100%] 1 of 1, cached: 1 ✔
[54/61202a] process > QUANTIFY_SALMON:SALMON_SE_T... [100%] 1 of 1, cached: 1 ✔
[f7/b599e6] process > DESEQ2_QC_SALMON               [100%] 1 of 1, cached: 1 ✔
[-        ] process > GET_SOFTWARE_VERSIONS          -
[-        ] process > MULTIQC                        -
-[nf-core/rnaseq] [PASS] STAR 5% mapped threshold: 55.72% - tumor_R1.
Pulling Singularity image docker://quay.io/biocontainers/rseqc:3.0.1--py37h516909a_1 [cache dbs/singularity_cache/quay.io-biocontainers-rseqc-3.0.1--py37h516909a_1.img]
Pulling Singularity image docker://quay.io/biocontainers/bedtools:2.29.2--hc088bd4_0 [cache dbs/singularity_cache/quay.io-biocontainers-bedtools-2.29.2--hc088bd4_0.img]
Pulling Singularity image docker://quay.io/biocontainers/stringtie:2.1.4--h7e0af3c_0 [cache dbs/singularity_cache/quay.io-biocontainers-stringtie-2.1.4--h7e0af3c_0.img]
Pulling Singularity image docker://quay.io/biocontainers/bioconductor-dupradar:1.18.0--r40_1 [cache dbs/singularity_cache/quay.io-biocontainers-bioconductor-dupradar-1.18.0--r40_1.img]
Pulling Singularity image docker://quay.io/biocontainers/qualimap:2.2.2d--1 [cache dbs/singularity_cache/quay.io-biocontainers-qualimap-2.2.2d--1.img]
Error executing process > 'BEDTOOLS_GENOMECOV (tumor_R1)'

Caused by:
  Failed to pull singularity image
  command: singularity pull  --name quay.io-biocontainers-bedtools-2.29.2--hc088bd4_0.img docker://quay.io/biocontainers/bedtools:2.29.2--hc088bd4_0 > /dev/null
  status : 255
  message:
    INFO:    Converting OCI blobs to SIF format
    FATAL:   While making image from oci registry: error fetching image to cache: while building SIF from layers: unable to create new build: while ensuring correct compression algorithm: while reading test squashfs: open /scratch/27815062.moab/bundle-temp-868906504/squashfs-gzip-comp-test-098449530: no such file or directory

I've seen this issue happen in almost every run I've tried.
Thanks in advance!

@drpatelh
Member

Hi @l0ka, it may be because your home directory is full. There should be a ~/.singularity directory there that could be filling up. Can you delete that and try again? It may also be easier to troubleshoot these failures on the #rnaseq channel on the nf-core Slack workspace.
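Roughly speaking, the check and cleanup would look something like this (a sketch only; the exact cache command depends on your Singularity version):

  # Check how full the home filesystem is and how big the Singularity cache has grown
  df -h "$HOME"
  du -sh ~/.singularity

  # Remove the cached/temporary image data; it is simply regenerated on the next pull
  rm -rf ~/.singularity
  # (on Singularity 3.x you can also use `singularity cache clean`, which may prompt for confirmation)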

@drpatelh drpatelh added the question Further information is requested label Oct 28, 2020
@AhmedMohamed1993

Hi @drpatelh, this worked for me once I moved to a directory with more storage. Thanks!

@drpatelh
Member

drpatelh commented Nov 5, 2020

Great, thanks for letting me know @AhmedMohamed1993! I think that is the solution here, so I will close 👍

@drpatelh drpatelh closed this as completed Nov 5, 2020
@l0ka
Author

l0ka commented Nov 5, 2020

Hi @drpatelh, sorry for the late reply, I've been very busy these days.
Anyway, I'm still experiencing this issue occasionally. As you can see from #481, I set the cacheDir parameter (specifically, to a folder outside my /home directory, on the HPC cluster storage, where there are plenty of PBs available), and all pipeline runs are performed outside /home. Additionally, if I rerun the pipeline with the -resume flag it sometimes works fine and sometimes fails with the very same error.
Over the next few days I'm planning to analyze a lot of samples, so I will have more precise details and will let you know.
Thank you!

@drpatelh
Member

drpatelh commented Nov 5, 2020

Hi @l0ka. No worries 🙂 So the cacheDir parameter controls where the final images are stored, but if you are downloading Docker containers and using -profile singularity then they need to be converted to Singularity images. This creates a bunch of temp files in ~/.singularity that eventually fill up your home directory and start generating errors like the one you reported above. The easiest solution is to soft-link that directory to somewhere you have enough space.
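As an illustration, the soft-link approach is something like this (the target path is just a placeholder for any location with plenty of space):

  # Move the existing Singularity directory off the home filesystem...
  mv ~/.singularity /path/with/space/.singularity

  # ...and leave a symlink behind so everything keeps writing to the new location
  ln -s /path/with/space/.singularity ~/.singularity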

It's odd though that you say it's an intermittent issue...in theory you shouldn't be able to download any more images if your home directory is full. Anyway, I definitely think this is a local issue.

@l0ka
Author

l0ka commented Nov 5, 2020

The ~/.singularity directory is already symlinked outside /home; I'm sorry I forgot to mention it before. That's also why I found this odd. Since you're sure it cannot be a "pipeline-related" issue, the only thing I can think of is some problem with the storage/folders. I'll check and keep you posted if I run into this issue again.
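As a quick sanity check (sketch only), this is the kind of thing I run to confirm where the symlink points and how much space is free there:

  # Confirm where ~/.singularity actually points and how much space is free on that filesystem
  ls -ld ~/.singularity
  df -h "$(readlink -f ~/.singularity)"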

@l0ka
Author

l0ka commented Nov 9, 2020

Hi @drpatelh, I finally had time to inspect the results. I ran the pipeline on 33 samples, submitting one job (PBS system) per sample so that each run has its own log file. Out of the 33 runs, 18 were OK and 15 failed. Among the 15 failed runs:

  • 1 failed because a FastQ file was corrupted;
  • 1 failed with this message:
    Unknown error accessing project nf-core/rnaseq -- Repository may be corrupted: /home/people/aluloc/.nextflow/assets/nf-core/rnaseq
  • 1 failed because it exceeded the walltime limit;
  • 1 failed because of PICARD_MARKDUPLICATES:
Error executing process > 'RNASEQ:MARK_DUPLICATES_PICARD:PICARD_MARKDUPLICATES (tumor_R1)'

Caused by:
  Process `RNASEQ:MARK_DUPLICATES_PICARD:PICARD_MARKDUPLICATES (tumor_R1)` terminated with an error exit status (1)

Command executed:

  picard \
      -Xmx36g \
      MarkDuplicates \
      ASSUME_SORTED=true REMOVE_DUPLICATES=false VALIDATION_STRINGENCY=LENIENT TMP_DIR=tmp \
      INPUT=tumor_R1.sorted.bam \
      OUTPUT=tumor_R1.markdup.sorted.bam \
      METRICS_FILE=tumor_R1.markdup.sorted.MarkDuplicates.metrics.txt
  
  echo $(picard MarkDuplicates --version 2>&1) | grep -o 'Version:.*' | cut -f2- -d: > picard.version.txt

Command exit status:
  1

Command output:
  (empty)

Command error:
  INFO	2020-11-06 14:12:37	OpticalDuplicateFinder	compared         1,000 ReadEnds to others.  Elapsed time: 00:00:00s.  Time for last 1,000:    0s.  Last read position: 0:17,788
  INFO	2020-11-06 14:12:37	OpticalDuplicateFinder	Large duplicate set. size = 1244
  INFO	2020-11-06 14:12:37	OpticalDuplicateFinder	compared         1,000 ReadEnds to others.  Elapsed time: 00:00:00s.  Time for last 1,000:    0s.  Last read position: 0:29,767
  INFO	2020-11-06 14:12:37	OpticalDuplicateFinder	Large duplicate set. size = 1957
  INFO	2020-11-06 14:12:37	OpticalDuplicateFinder	compared         1,000 ReadEnds to others.  Elapsed time: 00:00:00s.  Time for last 1,000:    0s.  Last read position: 0:16,387
  INFO	2020-11-06 14:12:37	OpticalDuplicateFinder	Large duplicate set. size = 1350
  INFO	2020-11-06 14:12:37	OpticalDuplicateFinder	compared         1,000 ReadEnds to others.  Elapsed time: 00:00:00s.  Time for last 1,000:    0s.  Last read position: 0:7,075
  INFO	2020-11-06 14:12:37	OpticalDuplicateFinder	Large duplicate set. size = 1264
  INFO	2020-11-06 14:12:37	OpticalDuplicateFinder	compared         1,000 ReadEnds to others.  Elapsed time: 00:00:00s.  Time for last 1,000:    0s.  Last read position: 0:28,429
  INFO	2020-11-06 14:12:37	OpticalDuplicateFinder	Large duplicate set. size = 1197
  INFO	2020-11-06 14:12:37	OpticalDuplicateFinder	compared         1,000 ReadEnds to others.  Elapsed time: 00:00:00s.  Time for last 1,000:    0s.  Last read position: 0:20,654
  INFO	2020-11-06 14:12:37	OpticalDuplicateFinder	Large duplicate set. size = 1313
  INFO	2020-11-06 14:12:37	OpticalDuplicateFinder	compared         1,000 ReadEnds to others.  Elapsed time: 00:00:00s.  Time for last 1,000:    0s.  Last read position: 0:18,069
  INFO	2020-11-06 14:12:38	OpticalDuplicateFinder	Large duplicate set. size = 1294
  INFO	2020-11-06 14:12:38	OpticalDuplicateFinder	compared         1,000 ReadEnds to others.  Elapsed time: 00:00:00s.  Time for last 1,000:    0s.  Last read position: 0:14,687
  INFO	2020-11-06 14:12:38	OpticalDuplicateFinder	Large duplicate set. size = 1021
  INFO	2020-11-06 14:12:38	OpticalDuplicateFinder	compared         1,000 ReadEnds to others.  Elapsed time: 00:00:00s.  Time for last 1,000:    0s.  Last read position: 0:1,642
  INFO	2020-11-06 14:12:38	OpticalDuplicateFinder	Large duplicate set. size = 1011
  INFO	2020-11-06 14:12:38	OpticalDuplicateFinder	compared         1,000 ReadEnds to others.  Elapsed time: 00:00:00s.  Time for last 1,000:    0s.  Last read position: 0:14,705
  INFO	2020-11-06 14:12:38	OpticalDuplicateFinder	Large duplicate set. size = 1404
  INFO	2020-11-06 14:12:38	OpticalDuplicateFinder	compared         1,000 ReadEnds to others.  Elapsed time: 00:00:00s.  Time for last 1,000:    0s.  Last read position: 0:28,791
  INFO	2020-11-06 14:12:38	OpticalDuplicateFinder	Large duplicate set. size = 1556
  INFO	2020-11-06 14:12:38	OpticalDuplicateFinder	compared         1,000 ReadEnds to others.  Elapsed time: 00:00:00s.  Time for last 1,000:    0s.  Last read position: 0:21,766
  INFO	2020-11-06 14:12:38	OpticalDuplicateFinder	Large duplicate set. size = 1027
  INFO	2020-11-06 14:12:38	OpticalDuplicateFinder	compared         1,000 ReadEnds to others.  Elapsed time: 00:00:00s.  Time for last 1,000:    0s.  Last read position: 0:14,308
  INFO	2020-11-06 14:12:40	MarkDuplicates	Traversing fragment information and detecting duplicates.
  INFO	2020-11-06 14:12:46	MarkDuplicates	Sorting list of duplicate records.
  INFO	2020-11-06 14:12:49	MarkDuplicates	After generateDuplicateIndexes freeMemory: 18014140976; totalMemory: 27699183616; maxMemory: 38654705664
  INFO	2020-11-06 14:12:49	MarkDuplicates	Marking 54115970 records as duplicates.
  INFO	2020-11-06 14:12:49	MarkDuplicates	Found 150959 optical duplicate clusters.
  INFO	2020-11-06 14:12:49	MarkDuplicates	Reads are assumed to be ordered by: coordinate
  INFO	2020-11-06 14:14:20	MarkDuplicates	Written    10,000,000 records.  Elapsed time: 00:01:30s.  Time for last 10,000,000:   90s.  Last read position: 2:216,299,484
  INFO	2020-11-06 14:15:52	MarkDuplicates	Written    20,000,000 records.  Elapsed time: 00:03:02s.  Time for last 10,000,000:   92s.  Last read position: 6:32,132,427
  INFO	2020-11-06 14:17:20	MarkDuplicates	Written    30,000,000 records.  Elapsed time: 00:04:31s.  Time for last 10,000,000:   88s.  Last read position: 9:9,442,111
  .command.run: line 38: /dev/fd/62: No such file or directory
  INFO	2020-11-06 14:18:49	MarkDuplicates	Written    40,000,000 records.  Elapsed time: 00:05:59s.  Time for last 10,000,000:   88s.  Last read position: 12:94,962,998
  INFO	2020-11-06 14:20:06	MarkDuplicates	Written    50,000,000 records.  Elapsed time: 00:07:16s.  Time for last 10,000,000:   77s.  Last read position: 14:50,053,433
  INFO	2020-11-06 14:21:25	MarkDuplicates	Written    60,000,000 records.  Elapsed time: 00:08:35s.  Time for last 10,000,000:   78s.  Last read position: 14:50,320,475
  INFO	2020-11-06 14:22:47	MarkDuplicates	Written    70,000,000 records.  Elapsed time: 00:09:57s.  Time for last 10,000,000:   81s.  Last read position: 16:84,600,295
  INFO	2020-11-06 14:24:10	MarkDuplicates	Written    80,000,000 records.  Elapsed time: 00:11:20s.  Time for last 10,000,000:   83s.  Last read position: 22:38,879,746
  INFO	2020-11-06 14:24:43	MarkDuplicates	Writing complete. Closing input iterator.
  INFO	2020-11-06 14:24:43	MarkDuplicates	Duplicate Index cleanup.
  INFO	2020-11-06 14:24:43	MarkDuplicates	Getting Memory Stats.
  INFO	2020-11-06 14:24:44	MarkDuplicates	Before output close freeMemory: 132199224; totalMemory: 142606336; maxMemory: 38654705664
  INFO	2020-11-06 14:24:45	MarkDuplicates	Closed outputs. Getting more Memory Stats.
  INFO	2020-11-06 14:24:45	MarkDuplicates	After output close freeMemory: 49317928; totalMemory: 58720256; maxMemory: 38654705664
  [Fri Nov 06 14:24:45 GMT 2020] picard.sam.markduplicates.MarkDuplicates done. Elapsed time: 17.59 minutes.
  Runtime.totalMemory()=58720256
  .command.run: line 166: kill: (18008) - No such process
  INFO:    Cleaning up image...
  • 1 failed because of QUALIMAP_RNASEQ:
Error executing process > 'RNASEQ:QUALIMAP_RNASEQ (tumor_R1)'

Caused by:
  Process `RNASEQ:QUALIMAP_RNASEQ (tumor_R1)` terminated with an error exit status (1)

Command executed:

  unset DISPLAY
  mkdir tmp
  export _JAVA_OPTIONS=-Djava.io.tmpdir=./tmp
  qualimap \
      --java-mem-size=36G \
      rnaseq \
       \
      -bam tumor_R1.markdup.sorted.bam \
      -gtf genes.gtf \
      -p strand-specific-reverse \
      -pe \
      -outdir tumor_R1
  
  echo $(qualimap 2>&1) | sed 's/^.*QualiMap v.//; s/Built.*$//' > qualimap.version.txt

Command exit status:
  1

Command output:
  (empty)

Command error:
  INFO:    Convert SIF file to sandbox...
  ERROR  : Failed to create user namespace: user namespace disabled
  • 3 failed because of TRIMGALORE:
Error executing process > 'RNASEQ:FASTQC_UMITOOLS_TRIMGALORE:TRIMGALORE (tumor_R1)'

Caused by:
  Process `RNASEQ:FASTQC_UMITOOLS_TRIMGALORE:TRIMGALORE (tumor_R1)` terminated with an error exit status (1)

Command executed:

  [ ! -f  tumor_R1_1.fastq.gz ] && ln -s tumor_R1_1.merged.fastq.gz tumor_R1_1.fastq.gz
  [ ! -f  tumor_R1_2.fastq.gz ] && ln -s tumor_R1_2.merged.fastq.gz tumor_R1_2.fastq.gz
  trim_galore \
      --fastqc \
      --cores 4 \
      --paired \
      --gzip \
       \
       \
       \
       \
      tumor_R1_1.fastq.gz \
      tumor_R1_2.fastq.gz
  echo $(trim_galore --version 2>&1) | sed 's/^.*version //; s/Last.*$//' > trimgalore.version.txt

Command exit status:
  1

Command output:
  (empty)

Command error:
  126	58	0.8	1	18 40
  127	59	0.8	1	19 40
  128	61	0.8	1	21 40
  129	104	0.8	1	21 83
  130	315	0.8	1	110 205
  131	125	0.8	1	34 91
  132	1452	0.8	1	458 994
  133	86	0.8	1	20 66
  134	37	0.8	1	10 27
  135	36	0.8	1	12 24
  136	19	0.8	1	3 16
  137	26	0.8	1	4 22
  138	26	0.8	1	3 23
  139	31	0.8	1	1 30
  140	15	0.8	1	2 13
  141	20	0.8	1	1 19
  142	14	0.8	1	3 11
  143	18	0.8	1	3 15
  144	20	0.8	1	5 15
  145	16	0.8	1	2 14
  146	36	0.8	1	0 36
  147	19	0.8	1	7 12
  148	51	0.8	1	4 47
  149	70	0.8	1	7 63
  150	304	0.8	1	11 293
  151	4321	0.8	1	76 4245
  
  RUN STATISTICS FOR INPUT FILE: tumor_R1_2.fastq.gz
  =============================================
  50998158 sequences processed in total
  The length threshold of paired-end sequences gets evaluated later on (in the validation step)
  
  Validate paired-end files tumor_R1_1_trimmed.fq.gz and tumor_R1_2_trimmed.fq.gz
  file_1: tumor_R1_1_trimmed.fq.gz, file_2: tumor_R1_2_trimmed.fq.gz
  
  pigz: error while loading shared libraries: libz.so.1: cannot open shared object file: No such file or directory
  
  >>>>> Now validing the length of the 2 paired-end infiles: tumor_R1_1_trimmed.fq.gz and tumor_R1_2_trimmed.fq.gz <<<<<
  pigz: error while loading shared libraries: libz.so.1: cannot open shared object file: No such file or directory
  Writing validated paired-end Read 1 reads to tumor_R1_1_val_1.fq.gz
  Writing validated paired-end Read 2 reads to tumor_R1_2_val_2.fq.gz
  
  Total number of sequences analysed: 0
  
  Number of sequence pairs removed because at least one read was shorter than the length cutoff (20 bp): 0 (N/A%)
  pigz: error while loading shared libraries: libz.so.1: cannot open shared object file: No such file or directory
  Died at /usr/local/bin/trim_galore line 2009.
  pigz: error while loading shared libraries: libz.so.1: cannot open shared object file: No such file or directory
  .command.run: line 166: kill: (1962) - No such process
  INFO:    Cleaning up image...
  • 7 failed because of TRIMGALORE_FASTQC:
Error executing process > 'RNASEQ:FASTQC_UMITOOLS_TRIMGALORE:FASTQC (tumor_R1)'

Caused by:
  Process `RNASEQ:FASTQC_UMITOOLS_TRIMGALORE:FASTQC (tumor_R1)` terminated with an error exit status (1)

Command executed:

  [ ! -f  tumor_R1_1.fastq.gz ] && ln -s tumor_R1_1.merged.fastq.gz tumor_R1_1.fastq.gz
  [ ! -f  tumor_R1_2.fastq.gz ] && ln -s tumor_R1_2.merged.fastq.gz tumor_R1_2.fastq.gz
  fastqc --quiet --threads 6 tumor_R1_1.fastq.gz tumor_R1_2.fastq.gz
  fastqc --version | sed -e "s/FastQC v//g" > fastqc.version.txt

Command exit status:
  1

Command output:
  (empty)

Command error:
  INFO:    Convert SIF file to sandbox...
  ERROR  : Failed to create user namespace: user namespace disabled

All jobs were launched at the very same time, allocating the same resources (CPUs, RAM, walltime) and using the same config file. Notably, for some of the failed samples I had previously obtained successful runs when I was testing the pipeline (using the very same code, config, options, etc.).
I'll try to rerun the pipeline with the -resume option for the failed samples and see what happens.

@l0ka
Author

l0ka commented Nov 9, 2020

Update: rerunning the pipeline with the -resume option caused the run to abort, exiting with the same error messages as before. I have now completely cleared the ~/.singularity cache folder (it was "only" 611 MB and it is symlinked to another location, so there is definitely no space problem), relaunched the pipeline from the beginning, and it seems to be running fine.

@drpatelh
Member

drpatelh commented Nov 9, 2020

Yeah, sorry @l0ka. I tried to wipe my cache and start again, and I had all sorts of intermittent issues where the pipeline failed to pull different containers... 😏 I thought the container hosting etc. would be much more stable than this, but we will just have to deal with it on a case-by-case basis. I am going to add some docs to the main README of the pipeline to suggest using NXF_SINGULARITY_CACHEDIR or singularity.cacheDir when running with Singularity. At least that means you only need to go through the hassle of pulling these containers once, and you can then re-use them for future runs. Thank you for all of your feedback!
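In practice that looks something like this (the path below is a placeholder; pick any location with enough space that is visible from your compute nodes):

  # Option 1: environment variable, e.g. in ~/.bashrc or your job submission script
  export NXF_SINGULARITY_CACHEDIR=/path/with/space/singularity_cache

  # Option 2: the equivalent setting in a Nextflow config file passed with -c
  # singularity.cacheDir = '/path/with/space/singularity_cache'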

@drpatelh
Member

drpatelh commented Nov 9, 2020

More docs added in #490

@l0ka
Author

l0ka commented Nov 9, 2020

Ok, cool, now I'm relieved 😌 At least we know what to do when it fails to pull containers.
Thanks for the support and good luck with the new release!! Can't wait to try it 😏

@wushyer

wushyer commented Nov 19, 2021

Hi, I ran into this problem too, but deleting the Singularity directory did not help. Could you check it? Thanks.

N E X T F L O W ~ version 21.10.1
Launching rnaseq/main.nf [elated_ramanujan] - revision: bb0fa33a13


nf-core/rnaseq v3.4

Core Nextflow options
runName : elated_ramanujan
containerEngine : singularity
container : nfcore-rnaseq.simg
launchDir : /scratch-cbe/users/shuangyang.wu/H1/rnaseq
workDir : /scratch-cbe/users/shuangyang.wu/H1/rnaseq/work
projectDir : /scratch-cbe/users/shuangyang.wu/H1/rnaseq/rnaseq
userName : shuangyang.wu
profile : singularity
configFiles : /scratch-cbe/users/shuangyang.wu/H1/rnaseq/rnaseq/nextflow.config

Input/output options
input : sample.csv

Reference genome options
genome : TAIR10
fasta : s3://ngi-igenomes/igenomes/Arabidopsis_thaliana/Ensembl/TAIR10/Sequence/WholeGenomeFasta/genome.fa
gtf : s3://ngi-igenomes/igenomes/Arabidopsis_thaliana/Ensembl/TAIR10/Annotation/Genes/genes.gtf
gene_bed : s3://ngi-igenomes/igenomes/Arabidopsis_thaliana/Ensembl/TAIR10/Annotation/Genes/genes.bed
star_index : s3://ngi-igenomes/igenomes/Arabidopsis_thaliana/Ensembl/TAIR10/Sequence/STARIndex/

Alignment options
skip_markduplicates: true

!! Only displaying parameters that differ from the pipeline defaults !!

If you use nf-core/rnaseq for your analysis please cite:


executor > local (1)
[- ] process > NFCORE_RNASEQ:RNASEQ:PREPARE_GENOME:GTF_GENE_FILTER -
[- ] process > NFCORE_RNASEQ:RNASEQ:PREPARE_GENOME:RSEM_PREPAREREFERENCE_TRAN... -
[9a/ec55f5] process > NFCORE_RNASEQ:RNASEQ:PREPARE_GENOME:GET_CHROM_SIZES (genome.fa) [100%] 1 of 1, failed: 1 ✘
[- ] process > NFCORE_RNASEQ:RNASEQ:INPUT_CHECK:SAMPLESHEET_CHECK -
[- ] process > NFCORE_RNASEQ:RNASEQ:CAT_FASTQ -
[- ] process > NFCORE_RNASEQ:RNASEQ:FASTQC_UMITOOLS_TRIMGALORE:FASTQC -
[- ] process > NFCORE_RNASEQ:RNASEQ:FASTQC_UMITOOLS_TRIMGALORE:TRIMGALORE -
[- ] process > NFCORE_RNASEQ:RNASEQ:ALIGN_STAR:STAR_ALIGN -
[- ] process > NFCORE_RNASEQ:RNASEQ:ALIGN_STAR:BAM_SORT_SAMTOOLS:SAMTOOLS_SORT -
[- ] process > NFCORE_RNASEQ:RNASEQ:ALIGN_STAR:BAM_SORT_SAMTOOLS:SAMTOOLS_INDEX -
[- ] process > NFCORE_RNASEQ:RNASEQ:ALIGN_STAR:BAM_SORT_SAMTOOLS:BAM_STATS_SA... -
[- ] process > NFCORE_RNASEQ:RNASEQ:ALIGN_STAR:BAM_SORT_SAMTOOLS:BAM_STATS_SA... -
[- ] process > NFCORE_RNASEQ:RNASEQ:ALIGN_STAR:BAM_SORT_SAMTOOLS:BAM_STATS_SA... -
[- ] process > NFCORE_RNASEQ:RNASEQ:QUANTIFY_STAR_SALMON:SALMON_QUANT -
[- ] process > NFCORE_RNASEQ:RNASEQ:QUANTIFY_STAR_SALMON:SALMON_TX2GENE -
[- ] process > NFCORE_RNASEQ:RNASEQ:QUANTIFY_STAR_SALMON:SALMON_TXIMPORT -
[- ] process > NFCORE_RNASEQ:RNASEQ:QUANTIFY_STAR_SALMON:SALMON_SE_GENE -
[- ] process > NFCORE_RNASEQ:RNASEQ:QUANTIFY_STAR_SALMON:SALMON_SE_GENE_LENGT... -
[- ] process > NFCORE_RNASEQ:RNASEQ:QUANTIFY_STAR_SALMON:SALMON_SE_GENE_SCALED -
[- ] process > NFCORE_RNASEQ:RNASEQ:QUANTIFY_STAR_SALMON:SALMON_SE_TRANSCRIPT -
[- ] process > NFCORE_RNASEQ:RNASEQ:DESEQ2_QC_STAR_SALMON -
[- ] process > NFCORE_RNASEQ:RNASEQ:MULTIQC_TSV_FAIL_MAPPED -
[- ] process > NFCORE_RNASEQ:RNASEQ:PRESEQ_LCEXTRAP -
[- ] process > NFCORE_RNASEQ:RNASEQ:STRINGTIE -
[- ] process > NFCORE_RNASEQ:RNASEQ:SUBREAD_FEATURECOUNTS -
[- ] process > NFCORE_RNASEQ:RNASEQ:MULTIQC_CUSTOM_BIOTYPE -
[- ] process > NFCORE_RNASEQ:RNASEQ:BEDTOOLS_GENOMECOV -
[- ] process > NFCORE_RNASEQ:RNASEQ:BEDGRAPH_TO_BIGWIG_FORWARD:UCSC_BEDCLIP -
[- ] process > NFCORE_RNASEQ:RNASEQ:BEDGRAPH_TO_BIGWIG_FORWARD:UCSC_BEDGRAPHT... -
[- ] process > NFCORE_RNASEQ:RNASEQ:BEDGRAPH_TO_BIGWIG_REVERSE:UCSC_BEDCLIP -
[- ] process > NFCORE_RNASEQ:RNASEQ:BEDGRAPH_TO_BIGWIG_REVERSE:UCSC_BEDGRAPHT... -
[- ] process > NFCORE_RNASEQ:RNASEQ:QUALIMAP_RNASEQ -
[- ] process > NFCORE_RNASEQ:RNASEQ:DUPRADAR -
[- ] process > NFCORE_RNASEQ:RNASEQ:RSEQC:RSEQC_BAMSTAT -
[- ] process > NFCORE_RNASEQ:RNASEQ:RSEQC:RSEQC_INNERDISTANCE -
[- ] process > NFCORE_RNASEQ:RNASEQ:RSEQC:RSEQC_INFEREXPERIMENT -
[- ] process > NFCORE_RNASEQ:RNASEQ:RSEQC:RSEQC_JUNCTIONANNOTATION -
[- ] process > NFCORE_RNASEQ:RNASEQ:RSEQC:RSEQC_JUNCTIONSATURATION -
[- ] process > NFCORE_RNASEQ:RNASEQ:RSEQC:RSEQC_READDISTRIBUTION -
[- ] process > NFCORE_RNASEQ:RNASEQ:RSEQC:RSEQC_READDUPLICATION -
[- ] process > NFCORE_RNASEQ:RNASEQ:MULTIQC_TSV_STRAND_CHECK -
[- ] process > NFCORE_RNASEQ:RNASEQ:CUSTOM_DUMPSOFTWAREVERSIONS -
[- ] process > NFCORE_RNASEQ:RNASEQ:MULTIQC -
Pulling Singularity image https://depot.galaxyproject.org/singularity/python:3.8.3 [cache /scratch-cbe/users/shuangyang.wu/H1/rnaseq/work/singularity/depot.galaxyproject.org-singularity-python-3.8.3.img]
Execution cancelled -- Finishing pending tasks before exit
WARN: Singularity cache directory has not been defined -- Remote image will be stored in the path: /scratch-cbe/users/shuangyang.wu/H1/rnaseq/work/singularity -- Use env variable NXF_SINGULARITY_CACHEDIR to specify a different location
Error executing process > 'NFCORE_RNASEQ:RNASEQ:PREPARE_GENOME:GET_CHROM_SIZES (genome.fa)'

Caused by:
Process NFCORE_RNASEQ:RNASEQ:PREPARE_GENOME:GET_CHROM_SIZES (genome.fa) terminated with an error exit status (255)

Command executed:

  samtools \
      faidx \
      genome.fa

  cut -f 1,2 genome.fa.fai > genome.fa.sizes

  cat <<-END_VERSIONS > versions.yml
  GET_CHROM_SIZES:
      samtools: $(echo $(samtools --version 2>&1) | sed 's/^.*samtools //; s/Using.*$//')
  END_VERSIONS

Command exit status:
255

Command output:
(empty)

Command error:
INFO: Converting SIF file to temporary sandbox...
FATAL: while extracting /scratch-cbe/users/shuangyang.wu/H1/rnaseq/work/singularity/depot.galaxyproject.org-singularity-samtools-1.10--h9402c20_2.img: root filesystem extraction failed: extract command failed: ERROR : Failed to create user namespace: user namespace disabled
: exit status 1

Work dir:
/scratch-cbe/users/shuangyang.wu/H1/rnaseq/work/9a/ec55f5af0ff3ff2e8e2cc1803bf51e

Tip: you can replicate the issue by changing to the process work dir and entering the command bash .command.run

Unexpected error [AbortedException]

-- Check script 'rnaseq/./workflows/rnaseq.nf' at line: 603 or see '.nextflow.log' file for more details
