Unable to download .h5ad data #254
@miguelreinacampos this is indeed concerning. Can you try using the files from figshare instead? https://figshare.com/projects/Tabula_Muris_Senis/64982
This error also occurs when trying to download the BAM files. It seems that they have been moved to DEEP_ARCHIVE storage, so they cannot be downloaded directly.
I wanted to add that I am also unable to download anything off of AWS. I am getting "ERROR 403: Forbidden." when copying the AWS link and using wget, and an AccessDenied error when using the aws command-line tool.
I got exactly the same error when using the aws command-line tool on an Ubuntu system. The command 'aws s3 cp --no-sign-request "s3://czb-tabula-muris/TM_droplet_mat.rds" ./ --storage-class STANDARD --recursive --force-glacier-transfer' did not work either.
Hi there, I was wondering whether anyone has managed to download the "complete count files as sparse matrices in .rds format for easy loading into R" (TM_facs_mat.rds and TM_droplet_mat.rds), or their h5ad equivalents (https://s3.amazonaws.com/czbiohub-tabula-muris/TM_droplet_mat.h5ad). I see that @aopisco kindly suggested https://figshare.com/projects/Tabula_Muris_Senis/64982 as an alternative, and that, per the closed issue #245, she confirmed that TMS contains TM as the 3-month timepoint. However, when exploring either the corresponding "working" AWS bucket (https://s3.console.aws.amazon.com/s3/buckets/czb-tabula-muris-senis?region=us-west-2&tab=objects) or the figshare project above, those specific matrix files do not appear to be available (unless I have overlooked them, in which case I apologize). Would @aopisco or @jamestwebber be able to kindly point out which files in the working AWS bucket are equivalent to those matrices? Again, apologies if I've missed anything, and thanks in advance for your attention! All the best, Antonio.
Angela and I are no longer at the Biohub, so we don't have any more permissions than you these days. Going to ping @AhmetCanSolak as someone who is still working there and might at least know the right person to contact these days. |
Hi James,
many thanks for your prompt reply and help! Sorry for the hassle!
Cheers,
Antonio
Hello,
I would like to access the .h5ad data files for analysis with Python.
I tried downloading the .h5ad files from the AWS bucket, but these files are on Glacier-class storage, and I don't have permission to initiate a retrieval.
I followed the links mentioned here:
"You can download complete count files as sparse matrices using anndata's h5ad file format for use in Python here and here. You can process the resulting AnnData object using, for instance, Scanpy."
But both links return an S3 error response instead of the file:

  <Error>
    <Code>InvalidObjectState</Code>
    <Message>The operation is not valid for the object's storage class</Message>
    <StorageClass>DEEP_ARCHIVE</StorageClass>
    <RequestId>35XDHRHT4AEQJWB8</RequestId>
    <HostId>ukU3SU4rWfcmR/skYBWH79JQ5g6QPeytgdozy9gPyF8PoyuQxlOc9qGAGUqHdkpWLOv2N5Wyu7A=</HostId>
  </Error>
Thank you so much in advance for your help!
Miguel Reina-Campos
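For anyone scripting bulk downloads against this bucket, the error body above follows the standard S3 <Error> XML schema, so it can be detected programmatically before attempting a large transfer. A minimal sketch using only the Python standard library (the helper name parse_s3_error is my own, and the inline XML is just a reproduction of the response above):

```python
import xml.etree.ElementTree as ET

def parse_s3_error(body: str) -> dict:
    """Parse an S3 <Error> XML body into a dict of its child elements."""
    root = ET.fromstring(body)
    return {child.tag: (child.text or "") for child in root}

# The error body returned for the archived .h5ad objects.
error_xml = """<Error>
  <Code>InvalidObjectState</Code>
  <Message>The operation is not valid for the object's storage class</Message>
  <StorageClass>DEEP_ARCHIVE</StorageClass>
</Error>"""

info = parse_s3_error(error_xml)
if info.get("Code") == "InvalidObjectState":
    # DEEP_ARCHIVE objects must be restored by the bucket owner
    # (a RestoreObject request) before anyone can GET them.
    print(f"Object is archived in {info.get('StorageClass')}; "
          "it must be restored by the bucket owner before download.")
```

Note that --force-glacier-transfer alone cannot fix this: an object in DEEP_ARCHIVE must first be restored via a RestoreObject request, which requires permissions that anonymous (--no-sign-request) access does not have.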