STARsolo 2.7.3a and 2.7.4a segmentation fault (core dumped). #936
Comments
Hi James, this looks like a bug. Could you run the same example without the --twopassMode Basic, to see if it's the 2nd pass that causes the problem? Cheers, Alex
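For reference, the diagnostic run suggested here is the same invocation with only the two-pass flag dropped; a minimal sketch, with placeholder paths standing in for the pipeline's actual inputs:

# Hypothetical re-run of the failing command without --twopassMode Basic,
# to check whether the 2nd pass is what triggers the seg-fault.
STAR --genomeDir STAR_index_Hg38 \
     --readFilesIn SRR10587810_R2_merged.fastq.gz SRR10587810_R1_merged.fastq.gz \
     --readFilesCommand zcat \
     --runThreadN 8 \
     --outSAMtype BAM SortedByCoordinate \
     --soloType Droplet \
     --soloCBwhitelist V3_whitelist.txt \
     --soloUMIlen 12 \
     --soloFeatures Gene \
     --outFileNamePrefix SRR10587810_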
Hi Alex,
Thanks for getting back to me. Sure, I will do that this weekend! I branched off an older commit of my pipeline that I know STAR runs on, and it worked fine again with both Hg38 and Hg19. One of the updates I had made to the pipeline was the automatic retrieval and unzipping of barcode whitelists. I noticed that one of the gunzip commands was creating an empty text file instead of unzipping as planned. I've fixed this and plan to run the pipeline again tomorrow. Do you think an empty barcode whitelist could be the cause of the seg fault? I'll let you know how it goes, and if it works I'll post the solution on the GitHub issue.
Kind regards,
James
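A minimal guard of the kind described above, assuming a bash step and a hypothetical V3_whitelist.txt.gz path (not the pipeline's actual file names): decompress explicitly and stop the pipeline if the resulting whitelist is empty.

# Unzip the barcode whitelist; -k keeps the .gz, -f overwrites a stale copy.
gunzip -kf V3_whitelist.txt.gz
# Refuse to launch STARsolo against an empty or missing whitelist.
if [ ! -s V3_whitelist.txt ]; then
    echo "ERROR: barcode whitelist V3_whitelist.txt is empty or missing" >&2
    exit 1
fi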
Hi James, I just checked that an empty whitelist does indeed cause a seg-fault. Hopefully, this will resolve the problem. I will patch STAR to exit with an error in this case. Cheers
Hi Alex, indeed, fixing the barcode whitelist text file solved the issue! Thanks! Kind regards, James
Hi James, I added a check for an empty whitelist in 2.7.5a; it will now throw an error. Cheers
Hi Alex, I hope you are well!
I am attempting to create a Snakemake pipeline to benchmark scRNA-seq cell type annotations. STARsolo is the aligner I am using for the 10x Chromium V3/V2 paired-end input FASTQ files. When aligning to the Hg19 human genome everything worked perfectly! However, the paper I am trying to benchmark against used reference genome Hg38. One of the V3 datasets that completed successfully before is now causing a segmentation fault (core dumped) error at the 'started mapping' stage when using Hg38.
Links to reference files:
genome: ftp://ftp.ensembl.org/pub/release-100/fasta/homo_sapiens/dna/Homo_sapiens.GRCh38.dna.primary_assembly.fa.gz
genes: ftp://ftp.ensembl.org/pub/release-100/gtf/homo_sapiens/Homo_sapiens.GRCh38.100.gtf.gz
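A minimal fetch-and-unpack sketch for these two files (output file names are assumptions; STAR's genomeGenerate step expects uncompressed FASTA and GTF):

# Download the Ensembl release-100 GRCh38 primary assembly and annotation.
wget -O Hg38_genome.fa.gz 'ftp://ftp.ensembl.org/pub/release-100/fasta/homo_sapiens/dna/Homo_sapiens.GRCh38.dna.primary_assembly.fa.gz'
wget -O Hg38_gtf.gtf.gz 'ftp://ftp.ensembl.org/pub/release-100/gtf/homo_sapiens/Homo_sapiens.GRCh38.100.gtf.gz'
# STAR does not read gzipped references, so decompress before genomeGenerate.
gunzip Hg38_genome.fa.gz Hg38_gtf.gtf.gz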
Generating genome indices command:
"STAR --runMode genomeGenerate --runThreadN '{threads}' --sjdbGTFfile '{input.ref_gtf}' --genomeDir '{params.outdir}' " "--genomeFastaFiles '{input.ref_genome}' --limitGenomeGenerateRAM 31000000000 --genomeSAsparseD 3 --genomeSAindexNbases 14 " "--genomeChrBinNbits 18 --outFileNamePrefix '{params.outdir}_' --outTmpDir '{params.tmp_dir}'"
Running STARsolo command:
"STAR --genomeDir '{params.index_dir}' --sjdbGTFfile '{input.ref_gtf}' --readFilesIn '{input.cDNA_reads}' '{input.barcode_reads}' " "--runThreadN '{threads}' --twopassMode Basic --outWigType bedGraph --outSAMtype BAM SortedByCoordinate --limitBAMsortRAM 30000000000 " "--readFilesCommand zcat --runDirPerm '{params.run_dir_perm}' --outFileNamePrefix '{params.outdir}_' --soloType Droplet --soloCBwhitelist '{params.bc_whitelist}' " "--soloUMIlen '{params.UMI_len}' --outTmpDir '{params.tmp_dir}' --soloFeatures Gene"
STAR_index_Hg38_Log.out.txt
SRR10587810_Log.out.txt
Please find attached the logs for 2.7.3a, as I know this version was successful with Hg19. I have tried both 2.7.3a and 2.7.4a (after regenerating the genome index) and both result in the same error at the same point. The machine I initially ran on had 8 cores and 31 GB of RAM. I have tried on the same machine leaving 3 cores free, but no luck. Additionally, I have run on a 16-core machine with 64 GB of RAM and still got the same error. I have ~90 GB of free space on the machine after genome index generation and the dataset is 13 GB in total, so I do not think insufficient space for temporary files is the cause.
An example of the error output:
Jun 08 22:03:03 ..... inserting junctions into the genome indices
Jun 08 22:04:15 ..... started mapping
/bin/bash: line 1: 2419 Segmentation fault (core dumped) STAR --genomeDir '/home/ubuntu/workspace/scRNA-seq-benchmarking/Snakemake/Snakemake-scRNAseq-Output/STAR_indices/STAR_index_Hg38' --sjdbGTFfile '/home/ubuntu/workspace/scRNA-seq-benchmarking/Snakemake/Snakemake-scRNAseq-Output/reference_files/Hg38/Hg38_gtf.gtf' --readFilesIn '/home/ubuntu/workspace/scRNA-seq-benchmarking/Snakemake/Snakemake-scRNAseq-Output/SRR10587810/fastq_merged_lanes/SRR10587810_R2_merged.fastq.gz' '/home/ubuntu/workspace/scRNA-seq-benchmarking/Snakemake/Snakemake-scRNAseq-Output/SRR10587810/fastq_merged_lanes/SRR10587810_R1_merged.fastq.gz' --runThreadN '14' --twopassMode Basic --outWigType bedGraph --outSAMtype BAM SortedByCoordinate --readFilesCommand zcat --runDirPerm 'All_RWX' --outFileNamePrefix '/home/ubuntu/workspace/scRNA-seq-benchmarking/Snakemake/Snakemake-scRNAseq-Output/SRR10587810/STAR/STARsolo_output/SRR10587810_' --soloType Droplet --soloCBwhitelist '/home/ubuntu/workspace/scRNA-seq-benchmarking/Snakemake/Snakemake-scRNAseq-Output/reference_files/barcode_whitelists/V3_whitelist.txt' --soloUMIlen '12' --outTmpDir '/home/ubuntu/workspace/scRNA-seq-benchmarking/Snakemake/Snakemake-scRNAseq-Output/SRR10587810/STAR/STARsolo_output/tmp' --soloCellFilter None --soloFeatures Gene
Thanks in advance!
Kind regards,
James