Merge pull request #338 from swaythe/eg_mod
Eg mod
swaythe authored Nov 30, 2020
2 parents c5161ba + a0dcb61 commit 53336b6
Showing 10 changed files with 19 additions and 273 deletions.
3 changes: 1 addition & 2 deletions .circleci/config.yml
@@ -63,9 +63,8 @@ jobs:
 echo 'export SPM_MCR="$SPM_ROOT_DIR/spm12.sh"' >> $BASH_ENV
 echo 'export PATH="/home/circleci/.local/bin:$PATH"' >> $BASH_ENV
 pip install --upgrade pip
-pip install scipy sklearn nibabel configobj coverage pytest -q
+pip install scipy sklearn nibabel nilearn configobj coverage pytest -q
 pip install matplotlib pandas nipype --ignore-installed -q
-git clone https://github.com/swaythe/nilearn.git; cd nilearn; pip install -e . -q; cd ..
 python setup.py install --user
 python -c "from pypreprocess import datasets; datasets.fetch_spm_auditory(); datasets.fetch_spm_multimodal_fmri(); datasets.fetch_fsl_feeds()"
 sudo chown -R $USER:$USER /home/circleci/
14 changes: 5 additions & 9 deletions README.rst
@@ -21,6 +21,7 @@ pypreprocess
 * support for precompiled SPM (besides the usual matlab-dependent flavor).

 pypreprocess relies on nipype's interfaces to SPM (both precompiled SPM and matlab-dependent SPM flavors). It also has pure-Python (no C extensions, no compiled code, just Python) modules and scripts for slice-timing correction, motion correction, coregistration, and smoothing, without need for nipype or matlab.
+It has been developed in the Linux environment and is tested with Ubuntu 16 and 18. No guarantees with other OSes.

 License
 =======
@@ -55,18 +56,15 @@ To begin with, you may also want to install the pre-compiled version of SPM (in

 Second, install the python packages pip, scipy, pytest, nibabel, sklearn, nipype, pandas, matplotlib, nilearn and configobj. If you have a python virtual environment, just run::

-    $ pip install scipy sklearn nibabel configobj coverage pytest matplotlib pandas nipype
-    $ git clone https://github.com/swaythe/nilearn.git; cd nilearn; pip install -e . ; cd ..
+    $ pip install scipy sklearn nibabel nilearn configobj coverage pytest matplotlib pandas nipype

 If not, make sure to install pip (run: 'sudo apt-get install python-pip'). If you want to install these locally, use the --user option::

-    $ pip install scipy sklearn nibabel configobj coverage pytest matplotlib pandas nipype --ignore-installed --user
-    $ git clone https://github.com/swaythe/nilearn.git; cd nilearn; pip install -e . --user; cd ..
+    $ pip install scipy sklearn nibabel nilearn configobj coverage pytest matplotlib pandas nipype --ignore-installed --user

 If you want to install these for all users, use sudo::

-    $ pip install scipy sklearn nibabel configobj coverage pytest matplotlib pandas nipype --ignore-installed
-    $ git clone https://github.com/swaythe/nilearn.git; cd nilearn; pip install -e . --user; cd ..
+    $ pip install scipy sklearn nibabel nilearn configobj coverage pytest matplotlib pandas nipype --ignore-installed

 Finally, install pypreprocess itself by running the following in the pypreprocess::

@@ -103,7 +101,7 @@ If you find nipype errors like "could not configure SPM", this is most likely th

 Layout of examples
 ==================
-We have written some examplary scripts for preprocessing some popular datasets.
+We have written some example scripts for preprocessing some popular datasets.
 The **examples** directory contains a set of scripts, each demoing an aspect of pypreprocessing. Some scripts even provide use-cases for the nipy-based GLM. The examples use publicly available sMRI and fMRI data. Data fetchers are based on the nilearn API.
 The main examples scripts can be summarized as follows:

@@ -119,8 +117,6 @@ More advanced examples

 * **examples/pipelining/nistats_glm_fsl_feeds_fmri.py**: demos preprocessing + first-level GLM on FSL FEEDS dataset using nistats python package.

-* **examples/pipelining/nipype_preproc_spm_nyu.py**: preprocessing of NYU resting-state dataset
-
 Examples using pure Python (no SPM, FSL, etc. required)
 -------------------------------------------------------
 * **examples/pure_python/slice_timing_demos.py, examples/pure_python/realign_demos.py, examples/pure_python/coreg_demos.py**: demos Slice-Timing Correction (STC), motion-correction, and coregistration on various datasets, using modules written in pure Python
3 changes: 2 additions & 1 deletion examples/easy_start/nipype_preproc_spm_auditory.py
@@ -13,7 +13,8 @@
 import pandas as pd
 from nilearn.glm.first_level.design_matrix import (make_first_level_design_matrix,
                                                    check_design_matrix)
-from nilearn.reporting import plot_design_matrix
+
+from nilearn.plotting.matrix_plotting import plot_design_matrix
 from nilearn.glm.first_level import FirstLevelModel
 import matplotlib.pyplot as plt

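The hunk above tracks a moved import: `plot_design_matrix` has lived under different nilearn modules across releases (`nilearn.reporting` here, `nilearn.plotting.matrix_plotting` after this commit). When a demo script has to survive such moves, one option is a fallback chain over candidate import paths. A hedged sketch — the `first_importable` helper is ours, not part of nilearn or pypreprocess, and it is demonstrated below with a stdlib path so it runs without nilearn installed:

```python
import importlib


def first_importable(*dotted_paths):
    """Return the first attribute importable from the given dotted paths.

    Useful when a function moves between library versions, as
    plot_design_matrix did across nilearn releases.
    """
    last_err = None
    for path in dotted_paths:
        module_name, _, attr = path.rpartition(".")
        try:
            return getattr(importlib.import_module(module_name), attr)
        except (ImportError, AttributeError) as err:
            last_err = err
    raise ImportError("none of %r could be imported" % (dotted_paths,)) from last_err


# stdlib demonstration: the first path does not exist, the second does
sqrt = first_importable("no_such_module.sqrt", "math.sqrt")
print(sqrt(9.0))  # 3.0
```

For nilearn itself the same pattern would be `first_importable("nilearn.plotting.matrix_plotting.plot_design_matrix", "nilearn.reporting.plot_design_matrix")`, trying the newer path first.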
2 changes: 1 addition & 1 deletion examples/easy_start/nipype_preproc_spm_haxby.py
@@ -27,7 +27,7 @@

 # fetch HAXBY dataset
 N_SUBJECTS = 2
-haxby_data = fetch_haxby(n_subjects=N_SUBJECTS)
+haxby_data = fetch_haxby(subjects=N_SUBJECTS)

 # set output dir
 OUTPUT_DIR = os.path.join(os.path.dirname(haxby_data.mask),
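This hunk adapts to a keyword rename in nilearn's fetcher (`n_subjects` became `subjects`). A script that must run against both old and new nilearn can pick the keyword from the fetcher's signature at runtime. A sketch under that assumption — the helper is hypothetical, and a stub stands in for `nilearn.datasets.fetch_haxby` since the real fetcher downloads data:

```python
import inspect


def call_with_subject_count(fetcher, n):
    """Call a dataset fetcher, passing the subject count under whichever
    keyword the installed version expects ('subjects' in newer nilearn,
    'n_subjects' in older releases)."""
    params = inspect.signature(fetcher).parameters
    key = "subjects" if "subjects" in params else "n_subjects"
    return fetcher(**{key: n})


# stub standing in for nilearn.datasets.fetch_haxby (new-style keyword)
def fake_fetch_haxby(subjects=2):
    return ["subj%d" % i for i in range(subjects)]


print(call_with_subject_count(fake_fetch_haxby, 2))  # ['subj0', 'subj1']
```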
27 changes: 0 additions & 27 deletions examples/pipelining/nipype_preproc_spm_nyu.py

This file was deleted.

206 changes: 0 additions & 206 deletions examples/pipelining/nyu_rest_preproc.ini

This file was deleted.

20 changes: 1 addition & 19 deletions examples/pure_python/coreg_demos.py
@@ -36,19 +36,6 @@ def _spm_auditory_factory():
     sd = fetch_spm_auditory()
     return sd.func[0], sd.anat

-
-def _abide_factory(institute="KKI"):
-    for scans in sorted(glob.glob(
-            "/home/elvis/CODE/datasets/ABIDE/%s_*/%s_*/scans" % (
-                institute, institute))):
-        subject_id = os.path.basename(os.path.dirname(os.path.dirname(scans)))
-        func = os.path.join(scans, "rest/resources/NIfTI/files/rest.nii")
-        anat = os.path.join(scans,
-                            "anat/resources/NIfTI/files/mprage.nii")
-
-        yield subject_id, func, anat
-
-
 def _nyu_rest_factory(session=1):
     from pypreprocess.nipype_preproc_spm_utils import SubjectData

@@ -79,7 +66,7 @@ def _nyu_rest_factory(session=1):
         # set subject output directory
         subject_data.output_dir = "/tmp/%s" % subject_id

-        subject_data.sanitize(deleteorient=True, niigz2nii=False)
+        subject_data.sanitize(deleteorient=False, niigz2nii=False)

         yield (subject_data.subject_id, subject_data.func[0],
                subject_data.anat)
@@ -91,8 +78,3 @@ def _nyu_rest_factory(session=1):
 for subject_id, func, anat in _nyu_rest_factory():
     print("%s +++NYU rest %s+++\r\n" % ("\t" * 5, subject_id))
     mem.cache(_run_demo)(func, anat)
-
-# ABIDE demo
-for subject_id, func, anat in _abide_factory():
-    print("%s +++ABIDE %s+++\r\n" % ("\t" * 5, subject_id))
-    mem.cache(_run_demo)(func, anat)
5 changes: 3 additions & 2 deletions examples/pure_python/pure_python_preproc_demo.py
@@ -10,7 +10,8 @@
 sd = fetch_spm_multimodal_fmri()
 sd.output_dir = "/tmp/sub001"
 sd.func = [sd.func1, sd.func2]
+sd.session_output_dirs = ["/tmp/sub001/session1", "/tmp/sub001/session2"]

 # preproc data
-do_subject_preproc(sd.__dict__, concat=False, coregister=True, stc=True,
-                   tsdiffana=True, realign=True, report=True, reslice=True)
+do_subject_preproc(sd, concat=False, coregister=True, stc=True,
+                   tsdiffana=True, realign=True, report=False, reslice=True)
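The added `session_output_dirs` line pins one output directory per functional session. The naming convention it follows can be made explicit with a small helper — hypothetical, not part of pypreprocess, shown only to document the `<output_dir>/sessionN` layout the demo assumes:

```python
def make_session_output_dirs(output_dir, n_sessions):
    """Build per-session output paths in the demo's layout:
    <output_dir>/session1, <output_dir>/session2, ..."""
    return ["%s/session%d" % (output_dir, i + 1) for i in range(n_sessions)]


print(make_session_output_dirs("/tmp/sub001", 2))
# ['/tmp/sub001/session1', '/tmp/sub001/session2']
```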
4 changes: 2 additions & 2 deletions examples/pure_python/slice_timing_demos.py
@@ -174,7 +174,7 @@ def demo_HRF(output_dir, n_slices=10,
     # create time values scaled at 1%
     timescale = .01
     n_timepoints = 24
-    time = np.linspace(0, n_timepoints, num=1 + (n_timepoints - 0) / timescale)
+    time = np.linspace(0, n_timepoints, num=int(1 + (n_timepoints - 0) / timescale))

     # create gamma functions
     n1 = 4
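The `int(...)` wrapper is the point of this hunk: under Python 3, `/` is true division, so `1 + (n_timepoints - 0) / timescale` evaluates to the float `2401.0`, and recent NumPy versions reject a non-integer `num`. A minimal reproduction of the failure and the fix:

```python
import numpy as np

n_timepoints, timescale = 24, .01
num = 1 + (n_timepoints - 0) / timescale   # 2401.0 -- a float under Python 3

try:
    np.linspace(0, n_timepoints, num=num)  # rejected by modern NumPy
except TypeError as err:
    print("float num rejected:", err)

# the fix from the diff: coerce to int before passing it as `num`
time = np.linspace(0, n_timepoints, num=int(num))
print(time.shape)  # (2401,)
```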
@@ -202,7 +202,7 @@ def compute_hrf(t):

     # sample the time and the signal
     freq = 100
-    TR = 3.
+    TR = 3
     acquisition_time = time[::TR * freq]
     n_scans = len(acquisition_time)

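The companion change (`TR = 3.` to `TR = 3`) matters for the same modern-NumPy reason: `TR * freq` is used as a slice step in `time[::TR * freq]`, and NumPy rejects float slice indices. A minimal sketch of the sampling step, using the same values as the demo:

```python
import numpy as np

time = np.linspace(0, 24, num=2401)   # 24 s sampled every 0.01 s
freq = 100                            # samples per second
TR = 3                                # repetition time in seconds; must be an
                                      # int, since TR * freq is a slice step

acquisition_time = time[::TR * freq]  # keep one sample every 3 s
print(len(acquisition_time))          # 9 scans: t = 0, 3, ..., 24
```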