DOC: First documentation pass #383

Merged · 8 commits · Aug 29, 2024
3 changes: 1 addition & 2 deletions docs/community.md
@@ -1,8 +1,7 @@
## NiPreps Community
# NiPreps Community

Check out the [official NiPreps community page](https://www.nipreps.org/community/), where topics such as contributing, code of conduct, and licensing are outlined.


## NiBabies Coding Style

### Pre-commit
115 changes: 64 additions & 51 deletions docs/conf.py
@@ -6,51 +6,55 @@

import os
import sys
from datetime import datetime
from sphinx import __version__ as sphinxversion
from datetime import datetime, timezone

from packaging.version import Version, parse
from sphinx import __version__ as sphinxversion

import nibabies

# -- Path setup --------------------------------------------------------------
here = os.path.dirname(__file__)
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
sys.path.append(os.path.join(here, "sphinxext"))
sys.path.insert(0, os.path.join(here, "..", "wrapper"))
sys.path.append(os.path.join(here, 'sphinxext'))
sys.path.insert(0, os.path.join(here, '..', 'wrapper'))

from github_link import make_linkcode_resolve # this is only available after sphinxext to PATH
# this is only available after sphinxext to PATH
from github_link import make_linkcode_resolve # noqa: E402

# -- General configuration ---------------------------------------------------

# If your documentation needs a minimal Sphinx version, state it here.
needs_sphinx = "1.5.3"
needs_sphinx = '1.5.3'

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
"sphinx.ext.autodoc",
"sphinx.ext.doctest",
"sphinx.ext.intersphinx",
"sphinx.ext.coverage",
"sphinx.ext.mathjax",
"sphinx.ext.linkcode",
"sphinx.ext.napoleon",
"sphinxarg.ext", # argparse extension
"nipype.sphinxext.plot_workflow",
"myst_nb", # stop segregating rst/md
'sphinx.ext.autodoc',
'sphinx.ext.doctest',
'sphinx.ext.intersphinx',
'sphinx.ext.coverage',
'sphinx.ext.mathjax',
'sphinx.ext.linkcode',
'sphinx.ext.napoleon',
'sphinxcontrib.bibtex',
'sphinxarg.ext', # argparse extension
'nipype.sphinxext.plot_workflow',
'myst_parser', # allow markdown
# 'sphinx-togglebutton', # collapse admonitions
]

autodoc_mock_imports = [
"numpy",
"nibabel",
"nilearn"
]
if parse(sphinxversion) >= parse("1.7.0"):
autodoc_mock_imports += [
"pandas",
"nilearn",
"seaborn",
bibtex_bibfiles = ['../nibabies/data/boilerplate.bib']

autodoc_mock_imports = ['numpy', 'nibabel', 'nilearn']
if parse(sphinxversion) >= parse('1.7.0'):
    autodoc_mock_imports += [
'pandas',
'nilearn',
'seaborn',
]

# Add any paths that contain templates here, relative to this directory.
@@ -62,14 +66,33 @@
exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']


source_suffix = [".rst", ".md"]
source_suffix = ['.rst', '.md']

# -- Options for HTML output -------------------------------------------------

# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
html_theme = 'sphinx_rtd_theme'
html_theme = 'shibuya'
Contributor comment: Nice 🙂 +1 !!!!


# Options specific to theme
html_theme_options = {
'color_mode': 'light',
'dark_code': True,
'github_url': 'https://github.com/nipreps/nibabies',
'nav_links': [
{
'title': 'NiPreps Homepage',
'url': 'https://nipreps.org',
'external': True,
},
{
'title': 'Docker Hub',
'url': 'https://hub.docker.com/r/nipreps/nibabies',
'external': True,
},
],
}

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
@@ -83,40 +106,30 @@
# https://github.com/sphinx-contrib/napoleon/pull/10 is merged.
napoleon_use_param = False
napoleon_custom_sections = [
("Inputs", "Parameters"),
("Outputs", "Parameters"),
('Inputs', 'Parameters'),
('Outputs', 'Parameters'),
]

# -- MyST parameters ---------------------------------------------------------

myst_heading_anchors = 3
myst_enable_extensions = [
"colon_fence",
"substitution",
'colon_fence',
'substitution',
]

linkcode_resolve = make_linkcode_resolve("nibabies",
"https://github.com/nipreps/"
"nibabies/blob/{revision}/"
"{package}/{path}#L{lineno}")
linkcode_resolve = make_linkcode_resolve(
'nibabies',
'https://github.com/nipreps/' 'nibabies/blob/{revision}/' '{package}/{path}#L{lineno}',
)

project = "NiBabies"
author = "The NiPreps developers"
copyright = "2021-%s, %s" % (datetime.now().year, author)
project = 'NiBabies'
author = 'The NiPreps developers'

import nibabies
copyright = f'2021-{datetime.now(tz=timezone.utc).year}, {author}'

nibabies_ver = Version(nibabies.__version__)
release = "version" if nibabies_ver.is_prerelease else nibabies_ver.public

myst_substitutions = {
"release": release,
"version": str(nibabies_ver),
"dockerbuild": "docker pull nipreps/nibabies:{{ release }}",
"singbuild": (
"singularity build nibabies-{{ release }}.sif docker://nipreps/nibabies:{{ release }}"
),
}
release = 'version' if nibabies_ver.is_prerelease else nibabies_ver.public

# to avoid Python highlighting in literal text
highlight_language = "none"
highlight_language = 'none'
59 changes: 36 additions & 23 deletions docs/faqs.md
@@ -2,61 +2,74 @@

## Leveraging precomputed results

Whether manual intervention is required, or you want to reduce processing time, *NiBabies* allow the use of certain pre-computed files during processing, which can skip part of the workflow.
Support is limited to the following files:
- Anatomical mask in T1w or T2w space
- Antomical segmentation (aseg) in T1w or T2w space
Whether manual intervention is required or you simply want to break up processing, *NiBabies* can reuse previously computed files (produced either by NiBabies itself or by a third-party application), injecting them directly into the workflow.

:::{versionchanged} 24.0.0

In addition to the brain mask and anatomical segmentation, support was added for additional precomputed derivatives. To see which derivatives are supported, view [](outputs.md#anatomical-derivatives).
:::

To use pre-computed results, one or more [BIDS Derivatives](https://bids-specification.readthedocs.io/en/stable/05-derivatives/01-introduction.html#bids-derivatives) directories must be passed in to *NiBabies* using the `--derivatives` flag.
Derivative directories must include a [`dataset_description.json` and the required fields](https://bids-specification.readthedocs.io/en/stable/03-modality-agnostic-files.html#derived-dataset-and-pipeline-description).
Additionally, files must include the `space-T1w` or `space-T2w` key-value pair in the filenames, and a matching sidecar JSON file with the `SpatialReference` field defined.

A sample layout of a derivatives directory can be found below:

```
```bash
my_precomputed/
├── dataset_description.json
└── sub-01
└── anat
├── sub-01_desc-preproc_T2w.nii.gz
├── sub-01_space-T2w_desc-aseg_dseg.json
├── sub-01_space-T2w_desc-aseg_dseg.nii.gz
├── sub-01_space-T2w_desc-brain_mask.json
└── sub-01_space-T2w_desc-brain_mask.nii.gz
```

and the contents of the JSON files:
```
{"SpatialReference": "sub-01/anat/sub-01_T2w.nii.gz"}
```
In this example, `sub-01_desc-preproc_T2w.nii.gz` will be used as the T2w reference. The other files (the brain mask and segmentation) will be in the same space.

:::{warning}
If no anatomical reference is provided, the outputs must be in the same space as the raw anatomical data.
:::

The file referenced by `SpatialReference` will be used to ensure the raw data and the derivatives are aligned and in the same space.
:::{note}
If an aseg is provided, it will be used for surface generation.
:::
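
For instance, a run that injects these precomputed files could look like the sketch below (paths and the participant label are placeholders, and the positional arguments assume the standard BIDS-Apps calling convention of `bids_dir output_dir analysis_level`):

```bash
# Sketch only: adjust paths and options to your own data
nibabies /data/bids_root /data/derivatives participant \
    --participant-label 01 \
    --derivatives /data/my_precomputed
```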

## Multi-atlas segmentation with joint label fusion

By default, *NiBabies* will run [FSL FAST](https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/FAST) for tissue segmentation, and Infant FreeSurfer for segmentation labels.
However, you can instead use ANTs Joint Label Fusion to generate both, granted you provide multiple atlases with anatomicals / segmentations via the `--segmentation-atlases-dir` flag.

Alternatively, ANTs {abbr}`JLF (Joint Label Fusion)` can be used by providing a directory containing one or more templates, each composed of anatomical images and a segmentation. To pass in this directory, use the `--segmentation-atlases-dir` flag.
When using this approach, there are a few assumptions being made:

1. The anatomicals are brain masked.
1. The labeled segmentations follow the [FreeSurfer lookup table](https://surfer.nmr.mgh.harvard.edu/fswiki/FsTutorial/AnatomicalROI/FreeSurferColorLUT).
1. The segmentation labels adhere to the [FreeSurfer LUT](https://surfer.nmr.mgh.harvard.edu/fswiki/FsTutorial/AnatomicalROI/FreeSurferColorLUT).

Here is an example layout of what the `--segmentation-atlases-dir` flag expects:

```
$ tree JLF-templates
JLF-templates/
├── Template01
│   ├── Segmentation.nii.gz
│   ├── T1w_brain.nii.gz
│   └── T2w_brain.nii.gz
└── Template02
├── Segmentation.nii.gz
├── T1w_brain.nii.gz
└── T2w_brain.nii.gz
```bash
$ tree JLF-atlases

JLF-atlases/
├── dataset_description.json
├── participants.tsv
├── sub-01
│   ├── sub-01_desc-aseg_dseg.nii.gz
│   ├── [sub-01_T1w.json] * optional
│   ├── sub-01_T1w.nii.gz
│   ├── [sub-01_T2w.json] * optional
│   └── sub-01_T2w.nii.gz
├── sub-02
...
```
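
A corresponding run could then point *NiBabies* at that directory; a minimal sketch, assuming the same placeholder paths as above:

```bash
# Sketch only: enables ANTs JLF using the atlas directory laid out above
nibabies /data/bids_root /data/derivatives participant \
    --segmentation-atlases-dir /data/JLF-atlases
```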

## More context on releases

Like other *NiPreps*, *NiBabies* follows Calendar Versioning ([CalVer](https://calver.org/)), in the format `YY.MINOR.MICRO`.
In short, here is a quick heuristic on how new releases should be looked at:

1. If the `YY` or `MINOR` has changed, it is a feature release, with substantial changes to the workflow.
1. If the `YY.MINOR` matches the version you used, but the `MICRO` has changed, it is a bug-fix release.
Check the [release notes](https://github.com/nipreps/nibabies/releases) - if the fixes do not pertain to your data, there is no need to upgrade.
16 changes: 7 additions & 9 deletions docs/index.md
@@ -1,16 +1,14 @@
```{include} ../README.md
:relative-docs: docs/
:relative-images:
```

## Contents
# Table of Contents

```{toctree}
:maxdepth: 2
:maxdepth: 3

installation
usage
faqs
outputs
community
installation.md
usage.md
faqs.md
outputs.md
community.md
```
40 changes: 24 additions & 16 deletions docs/installation.md
@@ -1,44 +1,52 @@
# Installation

The latest release of *NiBabies* is {{ release }}.

To view all available releases, refer to the [NiBabies PyPI page](https://pypi.org/project/nibabies/#history).
There are two ways to install *NiBabies*:
- using container technologies; or
- within a manually prepared environment, also known as *bare-metal*.

## Container Installation

Given its extensive dependencies, the easiest way to get up and running with *NiBabies* is by using a container service, such as [Docker](https://www.docker.com/get-started) or [Singularity](https://sylabs.io/singularity/).
Given its extensive dependencies, the easiest way to get up and running with *NiBabies* is by using a container service, such as [Docker](https://www.docker.com/get-started) or [Apptainer](https://apptainer.org/).

### Working with Docker

Images are hosted on our [Docker Hub](https://hub.docker.com/r/nipreps/nibabies).
To pull an image, the specific version tag must be specified.
For example, to pull the first release in the 24.0.0 series, you can do:

:::{admonition} Example Docker build
:class: seealso

$ {{ dockerbuild }}
:::
```shell
docker pull nipreps/nibabies:24.0.0
```

There are also a few keyword tags, `latest` and `unstable`, that serve as special pointers.
`latest` points to the latest release (excluding any betas or release candidates).
`unstable` points to the most recent developmental change, and should only be used to test new features or fixes.

### Working with Singularity
:::{tip}
`latest` will pull the most recent release, but beware that the locally cached copy will not be updated until `docker pull` is run again. For this reason, it is recommended to pull using an explicit version tag.
:::
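
For example, to refresh a cached `latest` and confirm which release it actually contains (this assumes the image's entrypoint reports its version via `--version`, as is conventional for BIDS Apps):

```bash
# Re-pull the tag, then print the version baked into the image
docker pull nipreps/nibabies:latest
docker run --rm nipreps/nibabies:latest --version
```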

The easiest way to create a Singularity image is to build from the [Docker](#working-with-docker) images hosted online.
### Working with Apptainer (formerly Singularity)

:::{admonition} Example Singularity build
:class: seealso
Visit the [apptainer containers page](https://datasets.datalad.org/?dir=/repronim/containers/images/bids), courtesy of DataLad and ReproNim, to download pre-built images.

$ {{ singbuild }}
:::{tip}
Images are listed as `bids-nibabies--<version>.sing`, where `<version>` is the release tag.
:::

Otherwise, you can create an Apptainer image from the [Docker](#working-with-docker) images hosted online.

```bash
apptainer build nibabies-24.0.0.sif docker://nipreps/nibabies:24.0.0
```
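
Once built, the image is run much like the Docker image; a minimal sketch, assuming your data live under `/data` on the host:

```bash
# Bind the host data directory into the container and pass the standard
# BIDS-Apps positional arguments (bids_dir, output_dir, analysis_level)
apptainer run --cleanenv -B /data:/data \
    nibabies-24.0.0.sif \
    /data/bids_root /data/derivatives participant
```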

## Installing the nibabies-wrapper

The `nibabies-wrapper` is a lightweight Python tool to facilitate running `nibabies` within a container service.
To install or upgrade to the current release:
```
$ pip install --update nibabies-wrapper

```bash
pip install --upgrade nibabies-wrapper
```
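
As a rough sketch of what a wrapped run can look like (the `docker` subcommand and the arguments shown here are assumptions made for illustration; the usage page linked below documents the actual interface):

```bash
# Illustrative only: the wrapper builds and runs the underlying
# docker (or apptainer) command line on your behalf
nibabies-wrapper docker /data/bids_root /data/derivatives participant \
    --participant-label 01
```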

For further details, see [](usage.md#using-the-nibabies-wrapper).