Update docs #181

Merged · 5 commits · Nov 29, 2024
13 changes: 5 additions & 8 deletions docs/installation.rst
@@ -28,15 +28,12 @@ to make sure it is properly installed.
Basic installation
==================

This package requires Python 3.7+. Assuming you have the correct version of
Python installed, you can install ``neuromaps`` by opening a terminal and
running the following:
Currently, ``neuromaps`` works with Python 3.8+.
You can install stable versions of ``neuromaps`` from PyPI with ``pip install neuromaps``.
However, we recommend installing from the source repository to get the latest features and bug fixes.

.. .. code-block:: bash

.. pip install neuromaps

.. Alternatively, you can install the most up-to-date version from GitHub:
You can install ``neuromaps`` from the source repository with ``pip install git+https://github.com/netneurolab/neuromaps.git``
or by cloning the repository and installing from the local directory:

.. code-block:: bash

39 changes: 29 additions & 10 deletions docs/user_guide/nulls.rst
@@ -11,8 +11,12 @@ significance of the association between the tested maps. Enter: the

This module provides access to a variety of null models that can be used to
generate "null" brain maps that retain aspects of the spatial autocorrelation
of the original brain maps. (For a review of these models refer to `Markello &
Misic, 2021, NeuroImage <https://doi.org/10.1016/j.neuroimage.2021.118052>`_.)
of the original brain maps.

For a review of these models, please refer to
`Markello & Misic, 2021, NeuroImage <https://doi.org/10.1016/j.neuroimage.2021.118052>`_.
We also recommend watching `this recorded session <https://www.youtube.com/watch?v=6DjpNddINZ8>`_
from the OHBM 2024 Educational Course if you are new to this topic.

There are four available null models that can be used with voxel- and
vertex-wise data and eight null models that can be used with parcellated data.
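Whatever model you choose, the test the null maps support is the same: compare the empirical correlation between two brain maps against the distribution of correlations recomputed with each null realization. A minimal NumPy sketch of that comparison (stand-in random arrays rather than real maps or the ``neuromaps`` API; sizes and ``n_perm`` are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1234)

# Stand-ins for two brain maps of shape (n_vertices,) and a null
# ensemble of shape (n_vertices, n_perm), as a spatial null model
# would produce.
n_vertices, n_perm = 1000, 100
map_a = rng.standard_normal(n_vertices)
map_b = 0.5 * map_a + rng.standard_normal(n_vertices)  # correlated by construction
nulls = rng.standard_normal((n_vertices, n_perm))      # hypothetical null maps

def correlate(x, y):
    """Pearson correlation between two 1-D arrays."""
    return np.corrcoef(x, y)[0, 1]

r_emp = correlate(map_a, map_b)
r_null = np.array([correlate(nulls[:, i], map_b) for i in range(n_perm)])

# Two-tailed non-parametric p-value: the fraction of null correlations
# at least as extreme as the empirical one (+1 so that p is never 0).
p_val = (np.sum(np.abs(r_null) >= np.abs(r_emp)) + 1) / (n_perm + 1)
print(r_emp, p_val)
```

Note that the smallest attainable p-value with this estimator is ``1 / (n_perm + 1)``, so the number of permutations bounds how small a p-value you can report.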
@@ -161,6 +165,11 @@ non-significant.
Nulls for volumetric data
-------------------------

.. warning::
   Nulls for high-resolution volumetric data (especially at 1mm or 2mm resolution) can
   be **extremely** demanding (potentially days of compute time and hundreds of GB of
   storage). This is an inherent limitation of the original model that currently has no
   immediate workaround.

The majority of spatial nulls work best with data represented in one of the
surface-based coordinate systems. If you are working with data that are
represented in the MNI152 system you must use one of the following three null
@@ -186,12 +195,22 @@ You would call the functions in the same manner as above:
>>> print(nulls.shape)
(224705, 100)

However, this process will take much more time than for equivalent data
represented in a surface-based system, and will need to write the full distance
matrix to a temporary file (potentially many GB of disk space!). If possible,
we recommend masking your data (e.g., with a gray matter mask) before
generating nulls using this procedure.

Note that you can provide parcellation images for volumetric data as described
above! Simply pass the volumetric parcellation image to the ``parcellation``
keyword argument and the function will take care of the rest.
When working with volumetric data, please note some important computational
considerations. While the function supports both voxelwise and parcellated analyses,
processing high-resolution volumetric data (especially at 1mm or 2mm resolution) can
be **extremely** demanding. The calculations for voxelwise data can take several days
to complete even on high-performance computing nodes, and may require hundreds of GB
of temporary storage. This is an inherent limitation of the original model that
currently has no immediate workaround (see `BrainSMASH <https://github.com/murraylab/brainsmash>`_).
We welcome suggestions for improving this method's computational efficiency.

To make your analysis more tractable, we recommend you consider using parcellated
data instead of voxelwise analysis. Parcellation dramatically reduces both computation
time and storage requirements.

For voxelwise input, we recommend masking your data where possible (e.g., with a
gray matter mask) before generating nulls with this procedure. To use
parcellation images for volumetric data, simply pass the volumetric parcellation image
to the ``parcellation`` keyword argument and the function will take care of the rest.
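The masking step itself is simple; a minimal sketch using NumPy stand-ins for the image and mask arrays you would normally load with a neuroimaging library such as nibabel:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for a volumetric statistical map and a binary gray-matter
# mask on the same voxel grid (in practice, load both with nibabel and
# make sure they are in the same space and resolution first).
shape = (10, 12, 10)
data = rng.standard_normal(shape)
gm_mask = rng.random(shape) > 0.7  # hypothetical mask, ~30% of voxels

# Restrict the analysis to gray-matter voxels: the distance matrix the
# null model must build shrinks from (n_voxels, n_voxels) to
# (n_gm, n_gm), a large savings in both time and disk space.
masked = data[gm_mask]
print(data.size, masked.size)
```

The masked data are a 1-D array of only the retained voxels; keep the boolean mask around so you can scatter results back into the full volume afterwards.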