From 42b89fa0f50df589447842a3cb167e0c2fa119d8 Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Wed, 20 Dec 2023 16:10:27 -0600 Subject: [PATCH 01/31] Fix format --- docs/13_upload.md | 9 +++++---- 1 file changed, 5 insertions(+), 4 deletions(-) diff --git a/docs/13_upload.md b/docs/13_upload.md index cdafa31e..2014c88e 100644 --- a/docs/13_upload.md +++ b/docs/13_upload.md @@ -90,10 +90,11 @@ two different servers differ slightly. dandi validate . dandi upload - Note that the `organize` steps should not be used if you are preparing a BIDS dataset with the NWB files. - Uploading to the development server is controlled via `-i` option, e.g. - `dandi upload -i dandi-staging`. - Note that validation is also done during `upload`, but ensuring compliance using `validate` prior upload helps avoid interruptions of the lengthier upload process due to validation failures. + - Note that the `organize` steps should not be used if you are preparing a BIDS dataset with the NWB files. + - Uploading to the development server is controlled via `-i` option, e.g. + `dandi upload -i dandi-staging`. + - Note that validation is also done during `upload`, but ensuring compliance using `validate` prior upload helps avoid interruptions of the lengthier upload process due to validation failures. + 6. Add metadata by visiting your Dandiset landing page: `https://dandiarchive.org/dandiset//draft` and clicking on the `METADATA` link. 
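The patch above stresses running `dandi validate` before `dandi upload`, since validation failures otherwise interrupt a lengthy transfer partway through. A minimal sketch of that guard, assuming the `dandi` CLI is on `PATH`; the `validate_then_upload` helper name is illustrative, not part of the CLI:

```python
import shutil
import subprocess

def validate_then_upload(dandiset_dir, staging=False):
    """Run `dandi validate` first and upload only if it passes.

    Validating up front avoids aborting a long upload partway through
    on an issue that `dandi validate` would have caught locally.
    """
    if shutil.which("dandi") is None:
        print("dandi CLI not found; install it with `pip install -U dandi`")
        return False
    if subprocess.run(["dandi", "validate", "."], cwd=dandiset_dir).returncode != 0:
        print("validation failed; fix the reported issues before uploading")
        return False
    cmd = ["dandi", "upload"]
    if staging:
        cmd += ["-i", "dandi-staging"]  # target the development server instead
    return subprocess.run(cmd, cwd=dandiset_dir).returncode == 0
```

Note that `upload` re-validates anyway; the point of the explicit pre-check is only to fail fast.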
From ae8c777908c9a0dd62a5cad584e817e791ba1b18 Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Wed, 20 Dec 2023 16:47:06 -0600 Subject: [PATCH 02/31] Change logo link --- mkdocs.yml | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/mkdocs.yml b/mkdocs.yml index eca2fe2a..80cb84d3 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -54,3 +54,7 @@ markdown_extensions: plugins: - search - open-in-new-tab + +# Customize theme +extra: + - homepage: https://dandiarchive.org \ No newline at end of file From cbccee5022ae6a17ae3c71e4118de6d2fee3594d Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Wed, 20 Dec 2023 16:49:48 -0600 Subject: [PATCH 03/31] Fix syntax --- mkdocs.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/mkdocs.yml b/mkdocs.yml index 80cb84d3..d0714ba3 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -57,4 +57,4 @@ plugins: # Customize theme extra: - - homepage: https://dandiarchive.org \ No newline at end of file + homepage: https://dandiarchive.org \ No newline at end of file From d49a65eaede2aea31c5eb0a076a0a74dc4632dde Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Wed, 20 Dec 2023 19:58:01 -0600 Subject: [PATCH 04/31] Update upload instructions --- docs/13_upload.md | 31 ++++++++++++++++++++++++------- 1 file changed, 24 insertions(+), 7 deletions(-) diff --git a/docs/13_upload.md b/docs/13_upload.md index 2014c88e..7c510dda 100644 --- a/docs/13_upload.md +++ b/docs/13_upload.md @@ -51,7 +51,7 @@ two different servers differ slightly. Click `NEW DANDISET` in the Web application (top right corner) after logging in. After you provide a name and description, the dataset identifier will be created; we will call this ``. -1. NWB format: +1. NWB format 1. Convert your data to NWB 2.1+ in a local folder. Let's call this ``. We suggest beginning the conversion process using only a small amount of data so that common issues may be spotted earlier in the process. This step can be complex depending on your data. 
@@ -62,12 +62,13 @@ two different servers differ slightly. converters](https://bids.neuroimaging.io/benefits.html#converters) if you are preparing a BIDS dataset containing NWB files. Feel free to [reach out to us for help](https://github.com/dandi/helpdesk/discussions). - 2. Check your files for [NWB Best Practices](https://nwbinspector.readthedocs.io/en/dev/best_practices/best_practices_index.html) by installing + + 1. Check your files for [NWB Best Practices](https://nwbinspector.readthedocs.io/en/dev/best_practices/best_practices_index.html) by installing the [NWBInspector](https://nwbinspector.readthedocs.io/en/dev/user_guide/user_guide_index.html) (`pip install -U nwbinspector`) and running nwbinspector --config dandi - 3. Thoroughly read the NWBInspector report and try to address as many issues as possible. **DANDI will prevent validation and upload of any issues + 1. Thoroughly read the NWBInspector report and try to address as many issues as possible. **DANDI will prevent validation and upload of any issues labeled as level 'CRITICAL' or above when using the `--config dandi` option.** See ["Validation Levels for NWB Files"](./135_validation.md) for more information about validation criteria for @@ -79,9 +80,10 @@ two different servers differ slightly. nwbinspector --config dandi --report-file-path .txt - 4. Once your files are confirmed to adhere to the Best Practices, perform an official validation of the NWB files by running: `dandi validate --ignore DANDI.NO_DANDISET_FOUND `. + 1. Once your files are confirmed to adhere to the Best Practices, perform an official validation of the NWB files by running: `dandi validate --ignore DANDI.NO_DANDISET_FOUND `. **If you are having trouble with validation, make sure the conversions were run with the most recent version of `dandi`, `PyNWB` and `MatNWB`.** - 5. Now, prepare and fully validate again within the dandiset folder used for upload: + + 1. 
Now, prepare and fully validate again within the Dandiset folder used for upload: dandi download https://dandiarchive.org/dandiset//draft cd @@ -94,11 +96,26 @@ two different servers differ slightly. - Uploading to the development server is controlled via `-i` option, e.g. `dandi upload -i dandi-staging`. - Note that validation is also done during `upload`, but ensuring compliance using `validate` prior upload helps avoid interruptions of the lengthier upload process due to validation failures. + - If you have an issue using the `dandi` CLI, see the [Dandi Debugging section](./15_debugging.md). - 6. Add metadata by visiting your Dandiset landing page: + 1. Add metadata by visiting your Dandiset landing page: `https://dandiarchive.org/dandiset//draft` and clicking on the `METADATA` link. -If you have an issue using the Python CLI, see the [Dandi Debugging section](./15_debugging.md). +1. BIDS format + 1. Once your files are confirmed to adhere to the best practices, perform an official validation of the BIDS files by running: `dandi validate --ignore DANDI.NO_DANDISET_FOUND `. + + + 1. Fully validate again within the Dandiset folder used for upload: + dandi validate . + dandi upload + + - Uploading to the development server is controlled via `-i` option, e.g. + `dandi upload -i dandi-staging`. + - Note that validation is also done during `upload`, but ensuring compliance using `validate` prior upload helps avoid interruptions of the lengthier upload process due to validation failures. + - If you have an issue using the `dandi` CLI, see the [Dandi Debugging section](./15_debugging.md). + + 1. Add metadata by visiting your Dandiset landing page: + `https://dandiarchive.org/dandiset//draft` and clicking on the `METADATA` link. 
## Storing Access Credentials From 0d7f061c1806ba3f36c26e98b677c08ac2b57460 Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Wed, 20 Dec 2023 20:59:59 -0600 Subject: [PATCH 05/31] Update credentials section --- docs/13_upload.md | 83 +++++++++++++++++++++++++---------------------- 1 file changed, 44 insertions(+), 39 deletions(-) diff --git a/docs/13_upload.md b/docs/13_upload.md index 7c510dda..4c7b20c9 100644 --- a/docs/13_upload.md +++ b/docs/13_upload.md @@ -119,42 +119,47 @@ two different servers differ slightly. ## Storing Access Credentials -By default, the DANDI CLI looks for an API key in the `DANDI_API_KEY` -environment variable. To set this on Linux or macOS, run - -```bash -export DANDI_API_KEY=personal-key-value -``` -*Note that there are no spaces around the "=". - -If this is not set, the CLI will look up the API -key using the [keyring](https://github.com/jaraco/keyring) library, which -supports numerous backends, including the system keyring, an encrypted keyfile, -and a plaintext (unencrypted) keyfile. - -- You can store your API key where the `keyring` library can find it by using - the `keyring` program: Run `keyring set dandi-api-dandi key` and enter the - API key when asked for the password for `key` in `dandi-api-dandi`. - -- You can set the backend the `keyring` library uses either by setting the - `PYTHON_KEYRING_BACKEND` environment variable or by filling in [the `keyring` - library's configuration file](https://github.com/jaraco/keyring#configuring). - IDs for the available backends can be listed by running `keyring --list`. If - no backend is specified in this way, the library will use the available - backend with the highest priority. - -If the API key isn't stored in either the `DANDI_API_KEY` environment variable -or in the keyring, the CLI will prompt you to enter the API key, and then it -will store it in the keyring. 
This may cause you to be prompted further; you -may be asked to enter a password to encrypt/decrypt the keyring, or you may be -asked by your OS to confirm whether to give the DANDI CLI access to the -keyring. - -- If the DANDI CLI encounters an error while attempting to fetch the API key - from the default keyring backend, it will fall back to using an encrypted - keyfile (the `keyrings.alt.file.EncryptedKeyring` backend). If the keyfile - does not already exist, the CLI will ask you for confirmation; if you answer - "yes," the `keyring` configuration file (if it does not already exist; see - above) will be configured to use `EncryptedKeyring` as the default backend. - If you answer "no," the CLI will exit with an error, and you must store the - API key somewhere accessible to the CLI on your own. +There are three options for storing your DANDI access credentials. + +1. `DANDI_API_KEY` Environment Variable + + - By default, the DANDI CLI looks for an API key in the `DANDI_API_KEY` + environment variable. To set this on Linux or macOS, run: + + export DANDI_API_KEY=personal-key-value + + - Note that there are no spaces around the "=". + +1. `keyring` Library + - If the `DANDI_API_KEY` environment variable is not set, the CLI will look up the API + key using the [keyring](https://github.com/jaraco/keyring) library, which + supports numerous backends, including the system keyring, an encrypted keyfile, + and a plaintext (unencrypted) keyfile. + + - You can store your API key where the `keyring` library can find it by using + the `keyring` program: Run `keyring set dandi-api-dandi key` and enter the + API key when asked for the password for `key` in `dandi-api-dandi`. + + - You can set the backend the `keyring` library uses either by setting the + `PYTHON_KEYRING_BACKEND` environment variable or by filling in [the `keyring` + library's configuration file](https://github.com/jaraco/keyring#configuring). 
+ IDs for the available backends can be listed by running `keyring --list`. If + no backend is specified in this way, the library will use the available + backend with the highest priority. + +1. Manual Password Entry + - If the API key isn't stored in either the `DANDI_API_KEY` environment variable + or in the keyring, the CLI will prompt you to enter the API key, and then it + will store it in the keyring. This may cause you to be prompted further; you + may be asked to enter a password to encrypt/decrypt the keyring, or you may be + asked by your OS to confirm whether to give the DANDI CLI access to the + keyring. + + - If the DANDI CLI encounters an error while attempting to fetch the API key + from the default keyring backend, it will fall back to using an encrypted + keyfile (the `keyrings.alt.file.EncryptedKeyring` backend). If the keyfile + does not already exist, the CLI will ask you for confirmation; if you answer + "yes," the `keyring` configuration file (if it does not already exist; see + above) will be configured to use `EncryptedKeyring` as the default backend. + If you answer "no," the CLI will exit with an error, and you must store the + API key somewhere accessible to the CLI on your own. From 2e5ee55957367fa33ddca330cec70ce3c7671926 Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Thu, 21 Dec 2023 00:09:49 -0600 Subject: [PATCH 06/31] Update upload instructions --- docs/13_upload.md | 25 ++++++++++++++++++++++--- 1 file changed, 22 insertions(+), 3 deletions(-) diff --git a/docs/13_upload.md b/docs/13_upload.md index 4c7b20c9..320fb2df 100644 --- a/docs/13_upload.md +++ b/docs/13_upload.md @@ -51,6 +51,7 @@ two different servers differ slightly. Click `NEW DANDISET` in the Web application (top right corner) after logging in. After you provide a name and description, the dataset identifier will be created; we will call this ``. + 1. NWB format 1. Convert your data to NWB 2.1+ in a local folder. Let's call this ``. 
We suggest beginning the conversion process using only a small amount of data so that common issues may be spotted earlier in the process. @@ -80,8 +81,11 @@ two different servers differ slightly. nwbinspector --config dandi --report-file-path .txt - 1. Once your files are confirmed to adhere to the Best Practices, perform an official validation of the NWB files by running: `dandi validate --ignore DANDI.NO_DANDISET_FOUND `. - **If you are having trouble with validation, make sure the conversions were run with the most recent version of `dandi`, `PyNWB` and `MatNWB`.** + 1. Once your files are confirmed to adhere to the Best Practices, perform an official validation of the NWB files by running: + + dandi validate --ignore DANDI.NO_DANDISET_FOUND + + - If you are having trouble with validation, make sure the conversions were run with the most recent version of `dandi`, `PyNWB` and `MatNWB`.** 1. Now, prepare and fully validate again within the Dandiset folder used for upload: @@ -102,8 +106,23 @@ two different servers differ slightly. `https://dandiarchive.org/dandiset//draft` and clicking on the `METADATA` link. 1. BIDS format - 1. Once your files are confirmed to adhere to the best practices, perform an official validation of the BIDS files by running: `dandi validate --ignore DANDI.NO_DANDISET_FOUND `. + 1. Convert your data to BIDS format in a local folder. Let's call this ``. + We suggest beginning the conversion process using only a small amount of data so that common issues may be spotted earlier in the process. + This step can be complex depending on your data. + [BIDS converters](https://bids.neuroimaging.io/benefits.html#converters) + automates conversion to BIDS from a variety of popular formats and the [BIDS Specification](https://bids-specification.readthedocs.io/) provides more information on the BIDS standard. + Feel free to [reach out to us for help](https://github.com/dandi/helpdesk/discussions). + + 1. 
Once your files are confirmed to adhere to the best practices, perform an official validation of the BIDS files by running: + + dandi validate --ignore DANDI.NO_DANDISET_FOUND + + 1. Download the Dandiset folder used for upload: + + dandi download https://dandiarchive.org/dandiset//draft + cd + 1. Move your `` (i.e. BIDS organized files) into the Dandiset folder. 1. Fully validate again within the Dandiset folder used for upload: dandi validate . From c19b6d14174fe8596da9006f01028ce58171a05f Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Thu, 21 Dec 2023 09:04:07 -0600 Subject: [PATCH 07/31] Update docs/13_upload.md Co-authored-by: Yaroslav Halchenko --- docs/13_upload.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/13_upload.md b/docs/13_upload.md index 320fb2df..5c886e8f 100644 --- a/docs/13_upload.md +++ b/docs/13_upload.md @@ -110,7 +110,7 @@ two different servers differ slightly. We suggest beginning the conversion process using only a small amount of data so that common issues may be spotted earlier in the process. This step can be complex depending on your data. [BIDS converters](https://bids.neuroimaging.io/benefits.html#converters) - automates conversion to BIDS from a variety of popular formats and the [BIDS Specification](https://bids-specification.readthedocs.io/) provides more information on the BIDS standard. + automate conversion to BIDS from a variety of popular formats and the [BIDS Specification](https://bids-specification.readthedocs.io/) provides more information on the BIDS standard. Feel free to [reach out to us for help](https://github.com/dandi/helpdesk/discussions). 1. 
Once your files are confirmed to adhere to the best practices, perform an official validation of the BIDS files by running: From 148bb092a6ad1b5c30759324908adc4b747a92dc Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Thu, 21 Dec 2023 16:30:11 -0600 Subject: [PATCH 08/31] Update docs/13_upload.md Co-authored-by: Yaroslav Halchenko --- docs/13_upload.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/13_upload.md b/docs/13_upload.md index 5c886e8f..862b11e0 100644 --- a/docs/13_upload.md +++ b/docs/13_upload.md @@ -105,8 +105,8 @@ two different servers differ slightly. 1. Add metadata by visiting your Dandiset landing page: `https://dandiarchive.org/dandiset//draft` and clicking on the `METADATA` link. -1. BIDS format - 1. Convert your data to BIDS format in a local folder. Let's call this ``. +1. BIDS standard + 1. Convert your data to BIDS standard in a local folder. Let's call this ``. We suggest beginning the conversion process using only a small amount of data so that common issues may be spotted earlier in the process. This step can be complex depending on your data. [BIDS converters](https://bids.neuroimaging.io/benefits.html#converters) From 6e4c2ddfd821fb3d67f5055a9286229b65a31350 Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Thu, 21 Dec 2023 16:47:46 -0600 Subject: [PATCH 09/31] Update upload instructions --- docs/13_upload.md | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/docs/13_upload.md b/docs/13_upload.md index 862b11e0..a2ed7f85 100644 --- a/docs/13_upload.md +++ b/docs/13_upload.md @@ -96,10 +96,11 @@ two different servers differ slightly. dandi validate . dandi upload - - Note that the `organize` steps should not be used if you are preparing a BIDS dataset with the NWB files. + - The `dandi organize` steps should not be used if you are preparing a BIDS dataset with the NWB files. 
+ - Renaming files with `dandi organize` can be customized with the [--required-field](https://dandi.readthedocs.io/en/latest/cmdline/organize.html#cmdoption-required-field) option. - Uploading to the development server is controlled via `-i` option, e.g. `dandi upload -i dandi-staging`. - - Note that validation is also done during `upload`, but ensuring compliance using `validate` prior upload helps avoid interruptions of the lengthier upload process due to validation failures. + - Validation is also done during `upload`, but ensuring compliance using `validate` prior upload helps avoid interruptions of the lengthier upload process due to validation failures. - If you have an issue using the `dandi` CLI, see the [Dandi Debugging section](./15_debugging.md). 1. Add metadata by visiting your Dandiset landing page: From 880077f91d5757e7f8789c749aa2781a5e3625b1 Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Thu, 21 Dec 2023 16:50:01 -0600 Subject: [PATCH 10/31] Update heading --- docs/13_upload.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/13_upload.md b/docs/13_upload.md index a2ed7f85..41371b53 100644 --- a/docs/13_upload.md +++ b/docs/13_upload.md @@ -44,7 +44,7 @@ two different servers differ slightly. 3. Store your API key somewhere that the CLI can find it; see ["Storing Access Credentials"](#storing-access-credentials) below. -### **Data upload/management workflow** +### **Data upload** 1. Register a Dandiset to generate an identifier. You will be asked to enter basic metadata: a name (title) and description (abstract) for your dataset. 
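The two patches above distinguish the NWB path, which runs `dandi organize`, from the BIDS path, which must not, and note that `-i dandi-staging` targets the development server. A sketch of the resulting command sequences; the `upload_steps` helper and the `<dataset_id>`/`<source_folder>` placeholders are illustrative, not part of the CLI:

```python
def upload_steps(data_format, staging=False):
    """Return the CLI steps for preparing and uploading a Dandiset.

    `dandi organize` applies only to NWB data; a BIDS dataset is
    already organized, so that step is skipped for it.
    """
    if data_format not in ("nwb", "bids"):
        raise ValueError("expected 'nwb' or 'bids'")
    steps = [
        "dandi download https://dandiarchive.org/dandiset/<dataset_id>/draft",
        "cd <dataset_id>",
    ]
    if data_format == "nwb":
        steps.append("dandi organize <source_folder>/ -f dry")  # preview renames first
        steps.append("dandi organize <source_folder>/")
    steps.append("dandi validate .")
    # -i selects the server instance, e.g. the dandi-staging development server
    steps.append("dandi upload -i dandi-staging" if staging else "dandi upload")
    return steps
```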
From 273aa69ee402712f476839feac4847e2022ae802 Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Thu, 4 Jan 2024 15:40:38 -0600 Subject: [PATCH 11/31] Update docs/13_upload.md Co-authored-by: Yaroslav Halchenko --- docs/13_upload.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/13_upload.md b/docs/13_upload.md index 41371b53..1ad74b83 100644 --- a/docs/13_upload.md +++ b/docs/13_upload.md @@ -114,7 +114,7 @@ two different servers differ slightly. automate conversion to BIDS from a variety of popular formats and the [BIDS Specification](https://bids-specification.readthedocs.io/) provides more information on the BIDS standard. Feel free to [reach out to us for help](https://github.com/dandi/helpdesk/discussions). - 1. Once your files are confirmed to adhere to the best practices, perform an official validation of the BIDS files by running: + 1. Once your files are confirmed to adhere to the BIDS standard, perform an official validation of the BIDS files by running: dandi validate --ignore DANDI.NO_DANDISET_FOUND From afed1b6a4c190eef7f0a72d0621355ece6b1d75a Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Thu, 4 Jan 2024 15:50:39 -0600 Subject: [PATCH 12/31] Update BIDS validator --- docs/13_upload.md | 6 ++---- 1 file changed, 2 insertions(+), 4 deletions(-) diff --git a/docs/13_upload.md b/docs/13_upload.md index 1ad74b83..74436dd6 100644 --- a/docs/13_upload.md +++ b/docs/13_upload.md @@ -111,12 +111,10 @@ two different servers differ slightly. We suggest beginning the conversion process using only a small amount of data so that common issues may be spotted earlier in the process. This step can be complex depending on your data. [BIDS converters](https://bids.neuroimaging.io/benefits.html#converters) - automate conversion to BIDS from a variety of popular formats and the [BIDS Specification](https://bids-specification.readthedocs.io/) provides more information on the BIDS standard. 
+ automate conversion to BIDS from a variety of popular formats and the [BIDS specification](https://bids-specification.readthedocs.io/) provides more information on the BIDS standard. Feel free to [reach out to us for help](https://github.com/dandi/helpdesk/discussions). - 1. Once your files are confirmed to adhere to the BIDS standard, perform an official validation of the BIDS files by running: - - dandi validate --ignore DANDI.NO_DANDISET_FOUND + 1. Once your files are confirmed to adhere to the BIDS standard, perform an official validation of the BIDS files using the [BIDS validator](https://github.com/bids-standard/bids-validator). 1. Download the Dandiset folder used for upload: From 1c01a33c84f91284c6330e8ea60c72ade5d91a62 Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Wed, 17 Jan 2024 14:41:53 -0600 Subject: [PATCH 13/31] Rename files --- docs/{10_using_dandi.md => 10_user_guide.md} | 0 docs/{135_validation.md => 135_validate.md} | 0 docs/{15_debugging.md => 15_debug.md} | 0 3 files changed, 0 insertions(+), 0 deletions(-) rename docs/{10_using_dandi.md => 10_user_guide.md} (100%) rename docs/{135_validation.md => 135_validate.md} (100%) rename docs/{15_debugging.md => 15_debug.md} (100%) diff --git a/docs/10_using_dandi.md b/docs/10_user_guide.md similarity index 100% rename from docs/10_using_dandi.md rename to docs/10_user_guide.md diff --git a/docs/135_validation.md b/docs/135_validate.md similarity index 100% rename from docs/135_validation.md rename to docs/135_validate.md diff --git a/docs/15_debugging.md b/docs/15_debug.md similarity index 100% rename from docs/15_debugging.md rename to docs/15_debug.md From 276dd99a826d3886684ecc9e8e49433f338df23c Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Wed, 17 Jan 2024 14:45:07 -0600 Subject: [PATCH 14/31] Update navigation --- ...ect_structure.md => 20_developer_guide.md} | 0 mkdocs.yml | 19 ++++++++++--------- requirements.txt | 1 + 3 files changed, 11 insertions(+), 9 deletions(-) rename 
docs/{20_project_structure.md => 20_developer_guide.md} (100%) diff --git a/docs/20_project_structure.md b/docs/20_developer_guide.md similarity index 100% rename from docs/20_project_structure.md rename to docs/20_developer_guide.md diff --git a/mkdocs.yml b/mkdocs.yml index d0714ba3..e2aa243d 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -23,17 +23,17 @@ nav: - Introduction: "01_introduction.md" - Data Standards: "30_data_standards.md" - Data Licenses: "35_data_licenses.md" - - User Guide: - - Using DANDI: "10_using_dandi.md" - - Viewing Dandisets: "11_view.md" - - Downloading Data and Dandisets: "12_download.md" - - Creating Dandisets and Uploading Data: "13_upload.md" - - Validation Levels for NWB Files: "135_validation.md" - - Publishing Dandisets: "14_publish.md" - - Debugging: "15_debugging.md" + - User Guide: + - "10_user_guide.md" + - View Dandisets: "11_view.md" + - Download Dandisets: "12_download.md" + - Create Dandisets: "13_upload.md" + - Validate NWB Files: "135_validate.md" + - Publish Dandisets: "14_publish.md" + - Debug: "15_debug.md" - DANDI CLI and Python API: https://dandi.readthedocs.io - Developer Guide: - - Project Structure: "20_project_structure.md" + - "20_developer_guide.md" - Notes: "40_development.md" - REST API Swagger: https://api.dandiarchive.org/swagger - REST API Redoc: https://api.dandiarchive.org/redoc @@ -54,6 +54,7 @@ markdown_extensions: plugins: - search - open-in-new-tab + - section-index # Customize theme extra: diff --git a/requirements.txt b/requirements.txt index 49a1908c..c65718ae 100644 --- a/requirements.txt +++ b/requirements.txt @@ -1,3 +1,4 @@ mkdocs-material pymdown-extensions mkdocs-open-in-new-tab +mkdocs-section-index \ No newline at end of file From 0738b7863ec0ff68e43edf039b7ad67be5cc8a9e Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Wed, 17 Jan 2024 15:18:35 -0600 Subject: [PATCH 15/31] Update footer --- mkdocs.yml | 18 +++++++++++++++++- 1 file changed, 17 insertions(+), 1 deletion(-) diff --git 
a/mkdocs.yml b/mkdocs.yml index e2aa243d..cbe71285 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -58,4 +58,20 @@ plugins: # Customize theme extra: - homepage: https://dandiarchive.org \ No newline at end of file + homepage: https://dandiarchive.org + generator: false # Disable watermark + social: + - icon: fontawesome/brands/slack + link: https://dandiarchive.slack.com + name: Slack + - icon: fontawesome/solid/paper-plane + link: mailto:info@dandiarchive.org + - icon: fontawesome/brands/x-twitter + link: https://twitter.com/dandiarchive + name: X-Twitter + - icon: fontawesome/brands/github + link: https://github.com/dandi + name: GitHub + - icon: fontawesome/brands/youtube + link: https://www.youtube.com/@dandiarchive + name: YouTube \ No newline at end of file From 33f4687f93dde23f8e49b06ae13bc56438e41eda Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Wed, 17 Jan 2024 15:43:47 -0600 Subject: [PATCH 16/31] Update to imperative-style --- docs/10_user_guide.md | 4 +++- docs/11_view.md | 2 +- docs/12_download.md | 2 +- docs/135_validate.md | 3 ++- 4 files changed, 7 insertions(+), 4 deletions(-) diff --git a/docs/10_user_guide.md b/docs/10_user_guide.md index 864d9e14..b4993215 100644 --- a/docs/10_user_guide.md +++ b/docs/10_user_guide.md @@ -1,10 +1,11 @@ -# Using DANDI +# User Guide Overview DANDI allows you to work with stored neurophysiology data in multiple ways. You can search, view, and download files, all without registering for a DANDI account. As a registered user, you can also create these collections of data along with metadata and publish them to the DANDI platform. ## Dandisets + DANDI stores cellular neurophysiology data in Dandisets. A Dandiset is a collection of assets (files and their metadata) and metadata about the collection. @@ -44,6 +45,7 @@ alt="download_file_icon"/> to download a [Download](./12_download.md) section to install and use the DANDI Python client tool. 
### Next steps + Although anyone on the Internet can view and download public Dandisets, registered users can also create Dandisets, upload data, and publish the Dandiset to generate a DOI for it. diff --git a/docs/11_view.md b/docs/11_view.md index b9a55697..f959f38f 100644 --- a/docs/11_view.md +++ b/docs/11_view.md @@ -1,4 +1,4 @@ -# Viewing Dandisets +# View Dandisets ## Browse Dandisets diff --git a/docs/12_download.md b/docs/12_download.md index 3b3a13be..8bfdda0a 100644 --- a/docs/12_download.md +++ b/docs/12_download.md @@ -1,4 +1,4 @@ -# Downloading Data and Dandisets +# Download Data and Dandisets You can download the content of a Dandiset using the DANDI Web application (such a specific file) or entire Dandisets using the DANDI Python CLI. diff --git a/docs/135_validate.md b/docs/135_validate.md index 4b84b46b..2b3011e0 100644 --- a/docs/135_validate.md +++ b/docs/135_validate.md @@ -1,4 +1,4 @@ -# Validation Levels for NWB Files +# Validate NWB Files To be accepted by DANDI, NWB files must conform to criteria that are enforced via three levels of validation: ## NWB File Validation @@ -18,6 +18,7 @@ Inspector software as opposed to the NWB file. ## Missing DANDI Metadata DANDI has requirements for metadata beyond what is strictly required for NWB validation. The following metadata must be present in the NWB file for a successful upload to DANDI: + - You must define a `Subject` object. - The `Subject` object must have a `subject_id` attribute. - The `Subject` object must have a `species` attribute. This can either be the Latin binomial, e.g. 
"Mus musculus", or From 5576ecdc74e3c390ce6f601df13faa06dd55e46c Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Wed, 17 Jan 2024 15:46:00 -0600 Subject: [PATCH 17/31] Update doi --- docs/10_user_guide.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/10_user_guide.md b/docs/10_user_guide.md index b4993215..53b92de5 100644 --- a/docs/10_user_guide.md +++ b/docs/10_user_guide.md @@ -110,7 +110,7 @@ Note that `Dandihub` is not intended for significant computation, but provides a You can add the following statement to the methods section of your manuscript. > Data and associated metadata were uploaded to the DANDI archive [RRID:SCR_017571] using - the Python command line tool (https://doi.org/10.5281/zenodo.7041535). The data were first + the Python command line tool (https://doi.org/10.5281/zenodo.3692138) . The data were first converted into the NWB format (https://doi.org/10.1101/2021.03.13.435173) and organized into a BIDS-like (https://doi.org/10.1038/sdata.2016.44) structure. @@ -118,4 +118,4 @@ You can refer to DANDI using any of the following options: * Using an RRID [RRID:SCR_017571](https://scicrunch.org/scicrunch/Resources/record/nlx_144509-1/SCR_017571/resolver). -* Using the DANDI CLI reference: https://doi.org/10.5281/zenodo.7041535 +* Using the DANDI CLI reference: https://doi.org/10.5281/zenodo.3692138 From 7595e4d8ef91e1b19d9cbef319cf0e5b5831375b Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Wed, 17 Jan 2024 20:28:25 -0600 Subject: [PATCH 18/31] Move section --- docs/13_upload.md | 96 ++++++++++++++++++++++++----------------------- 1 file changed, 49 insertions(+), 47 deletions(-) diff --git a/docs/13_upload.md b/docs/13_upload.md index 74436dd6..cfc40e4c 100644 --- a/docs/13_upload.md +++ b/docs/13_upload.md @@ -42,9 +42,55 @@ two different servers differ slightly. pip install -U dandi 3. 
Store your API key somewhere that the CLI can find it; see ["Storing - Access Credentials"](#storing-access-credentials) below. + Access Credentials"](#store-access-credentials) below. + +### **Store Access Credentials** + +There are three options for storing your DANDI access credentials. + +1. `DANDI_API_KEY` Environment Variable + + - By default, the DANDI CLI looks for an API key in the `DANDI_API_KEY` + environment variable. To set this on Linux or macOS, run: + + export DANDI_API_KEY=personal-key-value + + - Note that there are no spaces around the "=". + +1. `keyring` Library + - If the `DANDI_API_KEY` environment variable is not set, the CLI will look up the API + key using the [keyring](https://github.com/jaraco/keyring) library, which + supports numerous backends, including the system keyring, an encrypted keyfile, + and a plaintext (unencrypted) keyfile. + + - You can store your API key where the `keyring` library can find it by using + the `keyring` program: Run `keyring set dandi-api-dandi key` and enter the + API key when asked for the password for `key` in `dandi-api-dandi`. + + - You can set the backend the `keyring` library uses either by setting the + `PYTHON_KEYRING_BACKEND` environment variable or by filling in [the `keyring` + library's configuration file](https://github.com/jaraco/keyring#configuring). + IDs for the available backends can be listed by running `keyring --list`. If + no backend is specified in this way, the library will use the available + backend with the highest priority. + +1. Manual Password Entry + - If the API key isn't stored in either the `DANDI_API_KEY` environment variable + or in the keyring, the CLI will prompt you to enter the API key, and then it + will store it in the keyring. This may cause you to be prompted further; you + may be asked to enter a password to encrypt/decrypt the keyring, or you may be + asked by your OS to confirm whether to give the DANDI CLI access to the + keyring. 
+ + - If the DANDI CLI encounters an error while attempting to fetch the API key + from the default keyring backend, it will fall back to using an encrypted + keyfile (the `keyrings.alt.file.EncryptedKeyring` backend). If the keyfile + does not already exist, the CLI will ask you for confirmation; if you answer + "yes," the `keyring` configuration file (if it does not already exist; see + above) will be configured to use `EncryptedKeyring` as the default backend. + If you answer "no," the CLI will exit with an error, and you must store the + API key somewhere accessible to the CLI on your own. -### **Data upload** 1. Register a Dandiset to generate an identifier. You will be asked to enter basic metadata: a name (title) and description (abstract) for your dataset. @@ -135,49 +181,5 @@ two different servers differ slightly. 1. Add metadata by visiting your Dandiset landing page: `https://dandiarchive.org/dandiset//draft` and clicking on the `METADATA` link. -## Storing Access Credentials - -There are three options for storing your DANDI access credentials. - -1. `DANDI_API_KEY` Environment Variable - - - By default, the DANDI CLI looks for an API key in the `DANDI_API_KEY` - environment variable. To set this on Linux or macOS, run: - - export DANDI_API_KEY=personal-key-value - - - Note that there are no spaces around the "=". +### **Upload data** -1. `keyring` Library - - If the `DANDI_API_KEY` environment variable is not set, the CLI will look up the API - key using the [keyring](https://github.com/jaraco/keyring) library, which - supports numerous backends, including the system keyring, an encrypted keyfile, - and a plaintext (unencrypted) keyfile. - - - You can store your API key where the `keyring` library can find it by using - the `keyring` program: Run `keyring set dandi-api-dandi key` and enter the - API key when asked for the password for `key` in `dandi-api-dandi`. 
- - - You can set the backend the `keyring` library uses either by setting the - `PYTHON_KEYRING_BACKEND` environment variable or by filling in [the `keyring` - library's configuration file](https://github.com/jaraco/keyring#configuring). - IDs for the available backends can be listed by running `keyring --list`. If - no backend is specified in this way, the library will use the available - backend with the highest priority. - -1. Manual Password Entry - - If the API key isn't stored in either the `DANDI_API_KEY` environment variable - or in the keyring, the CLI will prompt you to enter the API key, and then it - will store it in the keyring. This may cause you to be prompted further; you - may be asked to enter a password to encrypt/decrypt the keyring, or you may be - asked by your OS to confirm whether to give the DANDI CLI access to the - keyring. - - - If the DANDI CLI encounters an error while attempting to fetch the API key - from the default keyring backend, it will fall back to using an encrypted - keyfile (the `keyrings.alt.file.EncryptedKeyring` backend). If the keyfile - does not already exist, the CLI will ask you for confirmation; if you answer - "yes," the `keyring` configuration file (if it does not already exist; see - above) will be configured to use `EncryptedKeyring` as the default backend. - If you answer "no," the CLI will exit with an error, and you must store the - API key somewhere accessible to the CLI on your own. 
From 5f3d5ac40090bb1399021b0fac21a0b608b385a5 Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Wed, 17 Jan 2024 22:03:04 -0600 Subject: [PATCH 19/31] Update upload page --- docs/13_upload.md | 66 +++++++++++++++++++++++------------------------ 1 file changed, 33 insertions(+), 33 deletions(-) diff --git a/docs/13_upload.md b/docs/13_upload.md index cfc40e4c..c7e89bda 100644 --- a/docs/13_upload.md +++ b/docs/13_upload.md @@ -1,15 +1,12 @@ -# Creating Dandisets and Uploading Data +# Create Dandisets To create a new Dandiset and upload your data, you need to have a DANDI account. -## Create an Account on DANDI +## Create a DANDI Account -To create a DANDI account: - -1. [Create a GitHub account](https://github.com/) if you don't have one. -2. Using your GitHub account, [register a DANDI account](https://gui.dandiarchive.org/#/user/register). - -You will receive an email acknowledging activation of your account within 24 +1. To create a DANDI account, first [create a GitHub account](https://github.com/) if you don't have one. +1. Using your GitHub account, [register a DANDI account](https://gui.dandiarchive.org/#/user/register). +1. You will receive an email acknowledging activation of your account within 24 hours, after which you can log in to DANDI using GitHub by clicking the login button. @@ -28,7 +25,7 @@ the production server. The development server is primarily used by users learnin The below instructions will alert you to where the commands for interacting with these two different servers differ slightly. -### **Setup** +### **Install DANDI Client** 1. Log in to DANDI and copy your API key. Click on your user initials in the top-right corner after logging in. Production (dandiarchive.org) and staging (gui-staging.dandiarchive.org) servers @@ -91,12 +88,19 @@ There are three options for storing your DANDI access credentials. If you answer "no," the CLI will exit with an error, and you must store the API key somewhere accessible to the CLI on your own. 
+### **Register a Dandiset** + +Register a Dandiset to generate an identifier. + +1. After logging in on https://dandiarchive.org, click the `NEW DANDISET` button in the top right corner. +1. You will be asked to enter basic metadata for your dataset: + 1. Title (i.e. name) + 1. Description (i.e. abstract) + 1. License +1. The dataset identifier will be created; we will call this ``. + +### **Standardize data** -1. Register a Dandiset to generate an identifier. You will be asked to enter - basic metadata: a name (title) and description (abstract) for your dataset. - Click `NEW DANDISET` in the Web application (top right corner) after logging in. - After you provide a name and description, the dataset identifier will be created; - we will call this ``. 1. NWB format 1. Convert your data to NWB 2.1+ in a local folder. Let's call this ``. @@ -139,18 +143,10 @@ There are three options for storing your DANDI access credentials. cd dandi organize -f dry dandi organize - dandi validate . - dandi upload - The `dandi organize` steps should not be used if you are preparing a BIDS dataset with the NWB files. - Renaming files with `dandi organize` can be customized with the [--required-field](https://dandi.readthedocs.io/en/latest/cmdline/organize.html#cmdoption-required-field) option. - - Uploading to the development server is controlled via `-i` option, e.g. - `dandi upload -i dandi-staging`. - - Validation is also done during `upload`, but ensuring compliance using `validate` prior upload helps avoid interruptions of the lengthier upload process due to validation failures. - - If you have an issue using the `dandi` CLI, see the [Dandi Debugging section](./15_debugging.md). - - 1. Add metadata by visiting your Dandiset landing page: - `https://dandiarchive.org/dandiset//draft` and clicking on the `METADATA` link. + - If you have an issue using the `dandi` CLI, see the [Dandi Debugging section](./15_debug.md). 1. BIDS standard 1. 
Convert your data to BIDS standard in a local folder. Let's call this ``. @@ -169,17 +165,21 @@ There are three options for storing your DANDI access credentials. 1. Move your `` (i.e. BIDS organized files) into the Dandiset folder. - 1. Fully validate again within the Dandiset folder used for upload: - dandi validate . - dandi upload - - Uploading to the development server is controlled via `-i` option, e.g. - `dandi upload -i dandi-staging`. - - Note that validation is also done during `upload`, but ensuring compliance using `validate` prior upload helps avoid interruptions of the lengthier upload process due to validation failures. - - If you have an issue using the `dandi` CLI, see the [Dandi Debugging section](./15_debugging.md). +### **Upload data** - 1. Add metadata by visiting your Dandiset landing page: - `https://dandiarchive.org/dandiset//draft` and clicking on the `METADATA` link. +1. Validate the Dandiset folder, and begin upload: -### **Upload data** + dandi validate . + dandi upload + +1. Note: + 1. Upload to the development server with the `-i` option, e.g. + `dandi upload -i dandi-staging`. + 1. Validation is also done during `upload`, but ensuring compliance using `validate` prior to upload helps avoid interruptions of the lengthier upload process due to validation failures. + 1. If you have an issue using the `dandi` CLI, see the [Debug section](./15_debug.md). + +### **Add Dandiset metadata** +1. Add metadata by visiting your Dandiset landing page at + `https://dandiarchive.org/dandiset//draft` and clicking on the `METADATA` link. 
\ No newline at end of file From 80362ea79054fedaf74dbc731c3a648e9ef1368e Mon Sep 17 00:00:00 2001 From: bendichter Date: Thu, 7 Sep 2023 11:25:30 +0200 Subject: [PATCH 20/31] add docs on the hub --- docs/50_hub.md | 32 ++++++++++++++++++++++++++++++++ mkdocs.yml | 1 + 2 files changed, 33 insertions(+) create mode 100644 docs/50_hub.md diff --git a/docs/50_hub.md b/docs/50_hub.md new file mode 100644 index 00000000..6ec699fb --- /dev/null +++ b/docs/50_hub.md @@ -0,0 +1,32 @@ +# Using the DANDI Hub + +The DANDI Hub is a JupyterHub instance in the cloud to interact with the data stored in DANDI, and is free to use for exploratory analysis of data on DANDI. Note that DANDI Hub is not intended for significant computation, but provides a place to introspect Dandisets and to perform some analysis and visualization of data. + +## Registration + +To use the [DANDI Hub](http://hub.dandiarchive.org), you must first register for an account using the [DANDI website](http://dandiarchive.org). + +## Choosing a server option + +When you start up the DANDI Hub, you will be asked to select across a number of server options. For basic exploration, Tiny or Base would most likely be appropriate. The DANDI Hub also currently offers Medium and Large options, which have more available memory and compute power. The "T4 GPU inference" server comes with an associated T4 GPU, and is intended to be used for applications that require GPU for inference. We request that users of this server be considerate of their usage of the DANDI Hub as a free community resource. Training large deep neural networks is not appropriate. A "Base (MATLAB)" server is also available, which provides a MATLAB cloud installation. + +## Example notebooks + +The best way to share analyses on DANDI data is through the DANDI example notebooks. These notebooks are organized by `///`. 
Dandiset contributors are encouraged to use these notebooks to demonstrate how to read, analyze, and visualize the data, and how to produce figures from associated scientific publications. + +### Contributing an example notebook + +Notebooks can be submitted as a Pull Request to the [DANDI example-notebooks repository](https://github.com/catalystneuro/example-notebooks). + +#### Environment specification +Best practice is to include one or more notebooks alongside an environment.yml file, which provides a conda-style specification of the environment required to run the notebooks. +1. Create a new environment: `conda create -n -python ` +2. Use `conda install ` and `pip install ` to install the necessary dependencies until the notebook runs through successfully. +3. Export the environment: `conda env export > environment.yml`. + +See detailed instructions for creating a `environment.yml` file [here](https://conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html#sharing-an-environment). + +#### File organization +When constructing the Pull Request, ensure that you have the proper directory structure: `///`. If you share more than one notebook, also include a `README.md` file at `///README.md` explaining the purpose of each notebook and providing context and links to relevant publications. + +Once this Pull Requests is accepted, your contributed notebook will be available to all DANDI Hub users. 
\ No newline at end of file diff --git a/mkdocs.yml b/mkdocs.yml index cbe71285..af55504c 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -37,6 +37,7 @@ nav: - Notes: "40_development.md" - REST API Swagger: https://api.dandiarchive.org/swagger - REST API Redoc: https://api.dandiarchive.org/redoc + - DANDI Hub: "50_hub.md" - Terms and Policies: - Terms: "about/terms.md" - Policies: "about/policies.md" From eef8a6dd8274f2bf5232da6ddce14cb7cf80df91 Mon Sep 17 00:00:00 2001 From: bendichter Date: Thu, 7 Sep 2023 16:52:57 +0200 Subject: [PATCH 21/31] remove specific submission instructions, since they are now a PR on the example-notebooks repo --- docs/50_hub.md | 15 --------------- 1 file changed, 15 deletions(-) diff --git a/docs/50_hub.md b/docs/50_hub.md index 6ec699fb..df3ea70d 100644 --- a/docs/50_hub.md +++ b/docs/50_hub.md @@ -14,19 +14,4 @@ When you start up the DANDI Hub, you will be asked to select across a number of The best way to share analyses on DANDI data is through the DANDI example notebooks. These notebooks are organized by `///`. Dandiset contributors are encouraged to use these notebooks to demonstrate how to read, analyze, and visualize the data, and how to produce figures from associated scientific publications. -### Contributing an example notebook - -Notebooks can be submitted as a Pull Request to the [DANDI example-notebooks repository](https://github.com/catalystneuro/example-notebooks). - -#### Environment specification -Best practice is to include one or more notebooks alongside an environment.yml file, which provides a conda-style specification of the environment required to run the notebooks. -1. Create a new environment: `conda create -n -python ` -2. Use `conda install ` and `pip install ` to install the necessary dependencies until the notebook runs through successfully. -3. Export the environment: `conda env export > environment.yml`. 
- -See detailed instructions for creating a `environment.yml` file [here](https://conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html#sharing-an-environment). - -#### File organization -When constructing the Pull Request, ensure that you have the proper directory structure: `///`. If you share more than one notebook, also include a `README.md` file at `///README.md` explaining the purpose of each notebook and providing context and links to relevant publications. - Once this Pull Requests is accepted, your contributed notebook will be available to all DANDI Hub users. \ No newline at end of file From 4dab099a2c25041de18ba3f64b519086de5b3a32 Mon Sep 17 00:00:00 2001 From: Ben Dichter Date: Thu, 7 Sep 2023 06:25:35 -0400 Subject: [PATCH 22/31] Update docs/50_hub.md --- docs/50_hub.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/50_hub.md b/docs/50_hub.md index df3ea70d..e7cd7751 100644 --- a/docs/50_hub.md +++ b/docs/50_hub.md @@ -1,6 +1,6 @@ # Using the DANDI Hub -The DANDI Hub is a JupyterHub instance in the cloud to interact with the data stored in DANDI, and is free to use for exploratory analysis of data on DANDI. Note that DANDI Hub is not intended for significant computation, but provides a place to introspect Dandisets and to perform some analysis and visualization of data. +[DANDI Hub](http://hub.dandiarchive.org) is a JupyterHub instance in the cloud to interact with the data stored in DANDI, and is free to use for exploratory analysis of data on DANDI. Note that DANDI Hub is not intended for significant computation, but provides a place to introspect Dandisets and to perform some analysis and visualization of data. 
## Registration From 38e3662bc9a3467b45cbd9c222de6722768d0dae Mon Sep 17 00:00:00 2001 From: Ben Dichter Date: Mon, 18 Dec 2023 11:04:39 -0500 Subject: [PATCH 23/31] Update docs/50_hub.md Co-authored-by: Yaroslav Halchenko --- docs/50_hub.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/docs/50_hub.md b/docs/50_hub.md index e7cd7751..72d12aae 100644 --- a/docs/50_hub.md +++ b/docs/50_hub.md @@ -1,6 +1,7 @@ # Using the DANDI Hub -[DANDI Hub](http://hub.dandiarchive.org) is a JupyterHub instance in the cloud to interact with the data stored in DANDI, and is free to use for exploratory analysis of data on DANDI. Note that DANDI Hub is not intended for significant computation, but provides a place to introspect Dandisets and to perform some analysis and visualization of data. +[DANDI Hub](http://hub.dandiarchive.org) is a JupyterHub instance in the cloud to interact with the data stored in DANDI, and is free to use for exploratory analysis of data on DANDI. +Note that DANDI Hub is not intended for significant computation, but provides a place to introspect Dandisets and to perform some analysis and visualization of data. ## Registration From 89602074ab34caf6a542f4f99848b7a4a98d0b8c Mon Sep 17 00:00:00 2001 From: Ben Dichter Date: Mon, 18 Dec 2023 11:05:34 -0500 Subject: [PATCH 24/31] Update docs/50_hub.md Co-authored-by: Yaroslav Halchenko --- docs/50_hub.md | 8 +++++++- 1 file changed, 7 insertions(+), 1 deletion(-) diff --git a/docs/50_hub.md b/docs/50_hub.md index 72d12aae..5b72f3e8 100644 --- a/docs/50_hub.md +++ b/docs/50_hub.md @@ -9,7 +9,13 @@ To use the [DANDI Hub](http://hub.dandiarchive.org), you must first register for ## Choosing a server option -When you start up the DANDI Hub, you will be asked to select across a number of server options. For basic exploration, Tiny or Base would most likely be appropriate. The DANDI Hub also currently offers Medium and Large options, which have more available memory and compute power. 
The "T4 GPU inference" server comes with an associated T4 GPU, and is intended to be used for applications that require GPU for inference. We request that users of this server be considerate of their usage of the DANDI Hub as a free community resource. Training large deep neural networks is not appropriate. A "Base (MATLAB)" server is also available, which provides a MATLAB cloud installation. +When you start up the DANDI Hub, you will be asked to select across a number of server options. +For basic exploration, Tiny or Base would most likely be appropriate. +The DANDI Hub also currently offers Medium and Large options, which have more available memory and compute power. +The "T4 GPU inference" server comes with an associated T4 GPU, and is intended to be used for applications that require GPU for inference. +We request that users of this server be considerate of their usage of the DANDI Hub as a free community resource. +Training large deep neural networks is not appropriate. +A "Base (MATLAB)" server is also available, which provides a MATLAB cloud installation but you would be required to provide your own license. ## Example notebooks From 2603fe238ecf2183d8ea1171a9223129c15ccaf1 Mon Sep 17 00:00:00 2001 From: Ben Dichter Date: Mon, 18 Dec 2023 11:09:36 -0500 Subject: [PATCH 25/31] Update docs/50_hub.md Co-authored-by: Yaroslav Halchenko --- docs/50_hub.md | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/docs/50_hub.md b/docs/50_hub.md index 5b72f3e8..a19aefed 100644 --- a/docs/50_hub.md +++ b/docs/50_hub.md @@ -19,6 +19,8 @@ A "Base (MATLAB)" server is also available, which provides a MATLAB cloud instal ## Example notebooks -The best way to share analyses on DANDI data is through the DANDI example notebooks. These notebooks are organized by `///`. Dandiset contributors are encouraged to use these notebooks to demonstrate how to read, analyze, and visualize the data, and how to produce figures from associated scientific publications. 
+The best way to share analyses on DANDI data is through the DANDI example notebooks. +These notebooks are maintained in https://github.com/dandi/example-notebooks repository which provides more information about their organization. +Dandiset contributors are encouraged to use these notebooks to demonstrate how to read, analyze, and visualize the data, and how to produce figures from associated scientific publications. Once this Pull Requests is accepted, your contributed notebook will be available to all DANDI Hub users. \ No newline at end of file From d139b54ba882a7e460dea229cc30e1232cdd2784 Mon Sep 17 00:00:00 2001 From: Ben Dichter Date: Sat, 23 Dec 2023 15:23:01 -0500 Subject: [PATCH 26/31] Update docs/50_hub.md Co-authored-by: Yaroslav Halchenko --- docs/50_hub.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/50_hub.md b/docs/50_hub.md index a19aefed..5e584a9d 100644 --- a/docs/50_hub.md +++ b/docs/50_hub.md @@ -1,6 +1,6 @@ # Using the DANDI Hub -[DANDI Hub](http://hub.dandiarchive.org) is a JupyterHub instance in the cloud to interact with the data stored in DANDI, and is free to use for exploratory analysis of data on DANDI. +[DANDI Hub](http://hub.dandiarchive.org) is a [JupyterHub](https://jupyterhub.readthedocs.io) instance in the cloud to interact with the data stored in DANDI, and is free to use for exploratory analysis of data on DANDI. Note that DANDI Hub is not intended for significant computation, but provides a place to introspect Dandisets and to perform some analysis and visualization of data. 
## Registration From 311ef3383dff9612b903cd78b348378789de9053 Mon Sep 17 00:00:00 2001 From: Ben Dichter Date: Sat, 23 Dec 2023 15:23:35 -0500 Subject: [PATCH 27/31] Update docs/50_hub.md Co-authored-by: Kabilar Gunalan --- docs/50_hub.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/50_hub.md b/docs/50_hub.md index 5e584a9d..214cb3ba 100644 --- a/docs/50_hub.md +++ b/docs/50_hub.md @@ -20,7 +20,7 @@ A "Base (MATLAB)" server is also available, which provides a MATLAB cloud instal ## Example notebooks The best way to share analyses on DANDI data is through the DANDI example notebooks. -These notebooks are maintained in https://github.com/dandi/example-notebooks repository which provides more information about their organization. +These notebooks are maintained in the [dandi/example-notebooks](https://github.com/dandi/example-notebooks) repository which provides more information about their organization. Dandiset contributors are encouraged to use these notebooks to demonstrate how to read, analyze, and visualize the data, and how to produce figures from associated scientific publications. Once this Pull Requests is accepted, your contributed notebook will be available to all DANDI Hub users. \ No newline at end of file From 156d3a3b1dbb59520452c1370ad86ad3d19e9599 Mon Sep 17 00:00:00 2001 From: Ben Dichter Date: Sat, 23 Dec 2023 15:24:03 -0500 Subject: [PATCH 28/31] Update docs/50_hub.md Co-authored-by: Kabilar Gunalan --- docs/50_hub.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/50_hub.md b/docs/50_hub.md index 214cb3ba..4530abd0 100644 --- a/docs/50_hub.md +++ b/docs/50_hub.md @@ -23,4 +23,4 @@ The best way to share analyses on DANDI data is through the DANDI example notebo These notebooks are maintained in the [dandi/example-notebooks](https://github.com/dandi/example-notebooks) repository which provides more information about their organization. 
Dandiset contributors are encouraged to use these notebooks to demonstrate how to read, analyze, and visualize the data, and how to produce figures from associated scientific publications. -Once this Pull Requests is accepted, your contributed notebook will be available to all DANDI Hub users. \ No newline at end of file +Notebooks can be added and updated through a pull request to the [dandi/example-notebooks](https://github.com/dandi/example-notebooks) repository. Once the pull request is merged, your contributed notebook will be available to all DANDI Hub users. \ No newline at end of file From 5b6bb09b6d6c70a79f4ad9ae33614bb2f108103f Mon Sep 17 00:00:00 2001 From: Ben Dichter Date: Tue, 9 Jan 2024 09:00:27 -0500 Subject: [PATCH 29/31] Update docs/50_hub.md Co-authored-by: Yaroslav Halchenko --- docs/50_hub.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/docs/50_hub.md b/docs/50_hub.md index 4530abd0..304fd87e 100644 --- a/docs/50_hub.md +++ b/docs/50_hub.md @@ -23,4 +23,5 @@ The best way to share analyses on DANDI data is through the DANDI example notebo These notebooks are maintained in the [dandi/example-notebooks](https://github.com/dandi/example-notebooks) repository which provides more information about their organization. Dandiset contributors are encouraged to use these notebooks to demonstrate how to read, analyze, and visualize the data, and how to produce figures from associated scientific publications. -Notebooks can be added and updated through a pull request to the [dandi/example-notebooks](https://github.com/dandi/example-notebooks) repository. Once the pull request is merged, your contributed notebook will be available to all DANDI Hub users. \ No newline at end of file +Notebooks can be added and updated through a pull request to the [dandi/example-notebooks](https://github.com/dandi/example-notebooks) repository. +Once the pull request is merged, your contributed notebook will be available to all DANDI Hub users. 
\ No newline at end of file From 0a84d0161aeb65d366cd5ebfc74fa91246388da2 Mon Sep 17 00:00:00 2001 From: Ben Dichter Date: Tue, 9 Jan 2024 09:14:53 -0500 Subject: [PATCH 30/31] Update docs/50_hub.md Co-authored-by: Kabilar Gunalan --- docs/50_hub.md | 1 + 1 file changed, 1 insertion(+) diff --git a/docs/50_hub.md b/docs/50_hub.md index 304fd87e..4a523f5f 100644 --- a/docs/50_hub.md +++ b/docs/50_hub.md @@ -1,6 +1,7 @@ # Using the DANDI Hub [DANDI Hub](http://hub.dandiarchive.org) is a [JupyterHub](https://jupyterhub.readthedocs.io) instance in the cloud to interact with the data stored in DANDI, and is free to use for exploratory analysis of data on DANDI. +For instructions on how to navigate JupyterHub see this [YouTube tutorial](https://www.youtube.com/watch?v=5pf0_bpNbkw&t=09m20s). Note that DANDI Hub is not intended for significant computation, but provides a place to introspect Dandisets and to perform some analysis and visualization of data. ## Registration From 2891674e8c63513aad50bb6cc66527ae005fe0eb Mon Sep 17 00:00:00 2001 From: Kabilar Gunalan Date: Wed, 17 Jan 2024 22:09:03 -0600 Subject: [PATCH 31/31] [WIP] Update upload page --- docs/13_upload.md | 8 +++++++- 1 file changed, 7 insertions(+), 1 deletion(-) diff --git a/docs/13_upload.md b/docs/13_upload.md index c7e89bda..d4bb1ea4 100644 --- a/docs/13_upload.md +++ b/docs/13_upload.md @@ -101,8 +101,9 @@ Register a Dandiset to generate an identifier. ### **Standardize data** +1. Standardize your dataset -1. NWB format +1. NWB format or BIDS with NWB files 1. Convert your data to NWB 2.1+ in a local folder. Let's call this ``. We suggest beginning the conversion process using only a small amount of data so that common issues may be spotted earlier in the process. This step can be complex depending on your data. @@ -163,8 +164,13 @@ Register a Dandiset to generate an identifier. 
dandi download https://dandiarchive.org/dandiset//draft cd + dandi://…/dandiset.yaml + dandi download dandi://{text_dandiset.api.instance_id}/{dandiset_id}/dandiset.yaml 1. Move your `` (i.e. BIDS organized files) into the Dandiset folder. + 1. Check your files with the BIDS Validator and try to address as many issues as possible. + + dandi validate --ignore DANDI.NO_DANDISET_FOUND ### **Upload data**
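
---

Editor's note: `[PATCH 18/31]` above documents the order in which the DANDI CLI resolves access credentials — the `DANDI_API_KEY` environment variable first, then the `keyring` library (service `dandi-api-dandi`, entry `key`), and finally an interactive prompt. The following is only a hedged sketch of that documented lookup order, not dandi-cli's actual implementation; `resolve_api_key` and the stand-in lookup/prompt callables are hypothetical names introduced here for illustration.

```python
def resolve_api_key(environ, keyring_lookup, prompt):
    """Sketch of the credential-resolution order described in PATCH 18."""
    # 1. The DANDI_API_KEY environment variable wins if set.
    key = environ.get("DANDI_API_KEY")
    if key:
        return key
    # 2. Otherwise consult the keyring backend for the
    #    "dandi-api-dandi" service's "key" entry.
    key = keyring_lookup("dandi-api-dandi", "key")
    if key:
        return key
    # 3. Finally, fall back to prompting the user (the real CLI would
    #    then store the entered key back into the keyring).
    return prompt()


if __name__ == "__main__":
    # Stand-ins for a real keyring backend and an interactive prompt.
    keyring = {("dandi-api-dandi", "key"): "key-from-keyring"}
    lookup = lambda service, name: keyring.get((service, name))
    prompt = lambda: "key-from-prompt"

    print(resolve_api_key({"DANDI_API_KEY": "key-from-env"}, lookup, prompt))  # env var wins
    print(resolve_api_key({}, lookup, prompt))  # falls back to keyring
```

In practice this order means an exported `DANDI_API_KEY` silently overrides whatever `keyring set dandi-api-dandi key` stored, which is worth remembering when debugging authentication failures.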
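
Editor's note: `[PATCH 19/31]` recommends running `dandi organize -f dry` before `dandi organize` to preview how NWB files will be renamed from their metadata. As a rough illustration of that kind of metadata-driven renaming — `organized_name` is a hypothetical helper, not dandi-cli code; the real `organize` derives more entities and its output can be customized with the `--required-field` option mentioned in the patch — the naming scheme might be sketched as:

```python
def organized_name(metadata):
    """Illustrative sketch: rebuild an NWB filename from metadata
    entities such as subject and (optionally) session IDs."""
    parts = [f"sub-{metadata['subject_id']}"]  # subject entity is required
    session = metadata.get("session_id")
    if session:
        parts.append(f"ses-{session}")  # session entity is optional
    return "_".join(parts) + ".nwb"


if __name__ == "__main__":
    print(organized_name({"subject_id": "01", "session_id": "20240101"}))
    print(organized_name({"subject_id": "mouse1"}))
```

The dry run exists precisely because this renaming is lossy with respect to the original filenames: previewing it first lets you confirm the metadata-derived names are unambiguous before any files are moved.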