nemo_logo

NeMo 2.1 - Network Modification Tool

Predict brain network disruption from a lesion mask. Original concept described in Kuceyeski 2013.

NEW! The cloud interface for this tool is available here: https://kuceyeski-wcm-web.s3.amazonaws.com/nemo/index.html


Contents

  1. Workflow overview
  2. Inputs
  3. Outputs
    1. Code examples for parsing outputs
  4. Website usage
  5. Requirements
  6. Details of tractography database
  7. Parcellations

Workflow overview

The general workflow for this tool consists of a database generation stage and a lesion disconnectivity stage:

  1. Tractography database generation: databaseprep/
    1. Compute whole-brain tractogram streamlines for 420 unrelated healthy subjects from the Human Connectome Project (See hcp_subjects_unrelated420_scfc.txt for list)
      • Both probabilistic (iFOD2+ACT) and deterministic (SD_STREAM) tractography are available
    2. Nonlinearly warp streamlines into a common reference space (MNI152 v6, eg: $FSLDIR/data/standard/MNI152_T1_1mm.nii.gz, or website/atlases/MNI152_T1_1mm_brain.nii.gz)
    3. Additional processing to facilitate efficient computation: run_full_database_prep.sh
  2. Lesion to disconnectivity mapping: nemo_lesion_to_chaco.py
    1. Given a lesion mask, identify all streamlines it intersects, then use the endpoints of those streamlines to determine the brain regions expected to show reduced connectivity
    2. Compute the ChaCo (Change in Connectivity) score, which is the ratio of (disrupted streamlines)/(total streamlines) for each voxel or ROI (chacovol), or voxel/ROI pair (chacoconn). 0=no disconnection. 1=complete disconnection (a toy sketch of this calculation follows below)
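
Below is a minimal toy sketch of the ChaCo ratio idea, using a made-up (voxels x streamlines) sparse matrix for a single reference subject. The variable names and sizes are illustrative assumptions; the actual implementation is nemo_lesion_to_chaco.py, which operates on the precomputed HCP tractography database.

#Toy sketch of the ChaCo ratio (illustrative only; see nemo_lesion_to_chaco.py for the real code)
import numpy as np
from scipy import sparse

#hypothetical binary (voxels x streamlines) matrix for one reference subject:
#entry (v,s)=1 if streamline s passes through voxel v
n_voxels, n_streamlines = 1000, 5000
voxel_by_streamline = sparse.random(n_voxels, n_streamlines, density=0.01,
                                    random_state=0, data_rvs=np.ones).tocsr()

#hypothetical binary lesion mask, flattened to match the voxel dimension
lesion_mask = np.zeros(n_voxels, dtype=bool)
lesion_mask[100:200] = True

#streamlines passing through any lesioned voxel are considered disrupted
disrupted = np.asarray(voxel_by_streamline[lesion_mask, :].sum(axis=0)).ravel() > 0

#ChaCo per voxel = (disrupted streamlines through voxel) / (total streamlines through voxel)
total_per_voxel = np.asarray(voxel_by_streamline.sum(axis=1)).ravel()
disrupted_per_voxel = np.asarray(voxel_by_streamline[:, disrupted].sum(axis=1)).ravel()
chaco = np.divide(disrupted_per_voxel, total_per_voxel,
                  out=np.zeros(n_voxels), where=total_per_voxel > 0)
print(chaco.min(), chaco.max())  #0=no disconnection, 1=complete disconnection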

We have created a user-friendly web interface to run this tool in the cloud (AWS):

  • Main web GUI code in uploader.js
  • Uploading via this website triggers an AWS Lambda event, which executes s3-lambda.py to launch an AWS EC2 instance
  • On the EC2 instance, nemo_startup.sh manages the entire input/output workflow, and uploads the results to S3, triggering a second AWS Lambda event in s3-lambda.py that emails the user a link to the results (a simplified sketch of this trigger pattern follows below)
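
The sketch below illustrates only the generic "S3 upload triggers a Lambda that launches EC2" pattern; it is not the actual s3-lambda.py, and the AMI ID, instance type, and startup-script path are placeholders.

#Generic sketch of an S3-triggered Lambda that launches an EC2 worker (illustrative only)
import boto3

def lambda_handler(event, context):
    #standard S3 event structure: bucket name and object key of the uploaded lesion mask
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    #pass the upload location to the instance via user data, which runs at boot
    user_data = "#!/bin/bash\n/path/to/nemo_startup.sh s3://%s/%s\n" % (bucket, key)

    ec2 = boto3.client("ec2")
    ec2.run_instances(
        ImageId="ami-00000000000000000",  #placeholder AMI
        InstanceType="r5.large",          #placeholder instance type
        MinCount=1,
        MaxCount=1,
        UserData=user_data,
        InstanceInitiatedShutdownBehavior="terminate",
    )
    return {"statusCode": 200}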

Inputs

  • NOTE: All input volumes must already be transformed into 1mm MNI152 v6 space (eg: using FSL's FNIRT or ANTs)
    • 182x218x182 voxels (best) or 181x217x181 (this can result from upsampling SPM 2mm output to 1mm)
    • Volumes matching either of these 2 sizes are then nearest-neighbor interpolated to the MNI template to avoid any voxel ordering issues.
  • Lesion mask = NIfTI volume (*.nii.gz or *.nii)
  • Parcellation (optional) = NIfTI volume with labeled voxels (*.nii.gz or *.nii)
    • Note: The pipeline treats ROI labels as a sequential list. For example, a parcellation containing only label values [10,20,30,40] will produce a 4x1 output, or a 4x4 output in pairwise mode
  • Resolution = mm resolution for outputs. Default=1mm, but this leads to large output files for the pairwise chacoconn and 420-subject *_allref.pkl
    • e.g., For a single very extensive lesion mask, chacovol_allref can be as large as 700MB, and chacoconn_allref can be 10s of GB
  • Currently, this package treats the lesion volume as a binary mask (0 = healthy tissue, any nonzero value = tissue damage). See the pre-upload check sketched below.
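
Below is a minimal pre-upload check of a lesion mask (example filenames; the template path assumes FSL is installed). It does not replace spatial registration with FNIRT/ANTs; it only confirms the grid of an already MNI-registered mask and binarizes it.

#Sketch: verify a lesion mask is on the 1mm MNI152 grid and binarize it before upload
import os
import numpy as np
import nibabel as nib
from nilearn.image import resample_to_img

lesion = nib.load("mylesion.nii.gz")  #example filename, already registered to MNI152
print(lesion.shape)  #expect (182,218,182), or (181,217,181) from upsampled SPM output

#if the mask is MNI-registered but on a slightly different grid,
#nearest-neighbor resampling onto the template keeps it binary
template = nib.load(os.path.join(os.environ["FSLDIR"], "data/standard/MNI152_T1_1mm.nii.gz"))
lesion_resampled = resample_to_img(lesion, template, interpolation="nearest")

#binarize: any nonzero value is treated as damaged tissue
mask = (lesion_resampled.get_fdata() != 0).astype(np.uint8)
nib.save(nib.Nifti1Image(mask, template.affine), "mylesion_binarized.nii.gz")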

Outputs

  • chacovol = voxelwise or regionwise ChaCo ratio

  • chacoconn = pairwise ChaCo ratio of loss of connections between pairs of voxels and/or ROIs

    • Note: for parcellations, these will be upper triangular. For voxelwise (including downsampled), this is not guaranteed
  • _chacovol_(mean|stdev) = mean and stdev of all 420 HCP-subject ChaCo ratio maps (for voxelwise outputs, these are .nii.gz files)

  • _chacoconn_(mean|stdev) = mean and stdev of all 420 HCP-subject pairwise disconnectivity maps

  • _nemoSC_(mean|stdev) = mean and stdev of predicted pairwise structural connectivity after removing disconnected streamlines, across all 420 HCP subjects

  • chacovol_allref.pkl = ChaCo ratio map for each of the 420 HCP reference subjects

    • 420x(voxels or ROIs) sparse matrix format
  • chacoconn_allref.pkl = pairwise ChaCo (disconnectivity) map for each of the 420 HCP reference subjects

    • 420-element list of (voxels x voxels) or (ROIs x ROIs) sparse matrices
  • *_allref_denom.pkl = denominator of the ChaCo ratio for each subject (useful when recomputing ChaCo numerator and denominator for reparcellating ratios)

  • *.pkl files are in Python pickle format and can be read using: import pickle; data = pickle.load(open("filename.pkl","rb"))

  • *.npz files are SciPy sparse matrices and can be read using: import numpy as np; from scipy import sparse; data = sparse.load_npz("filename.npz") (see the final example below)

  • To convert outputs to another format, you will need to use Python. See examples below.

Code examples for parsing outputs

Load chacovol_mean.pkl file and save as text/tsv/csv

import pickle
import numpy as np
data = pickle.load(open("mylesion_chacovol_mean.pkl","rb"))
#data contains a single matrix/vector with one ChaCo value per ROI
np.savetxt("mylesion_chacovol_mean.txt",data,delimiter="\t")

#or save as .mat
from scipy.io import savemat
savemat("mylesion_chacovol_mean.mat",{"data":data})
# (load in matlab with M=load("mylesion_chacovol_mean.mat"); data=M.data;)

Load chacoconn_mean.pkl file and save as text/tsv/csv

#NOTE: chacoconn files are *SPARSE* format and must be converted to *DENSE* before saving to text using data.toarray()
import pickle
import numpy as np
data = pickle.load(open("mylesion_chacoconn_mean.pkl","rb"))
np.savetxt("mylesion_chacoconn_mean.txt",data.toarray(),delimiter="\t")

#or save dense/full matrix to matlab .mat:
from scipy.io import savemat
savemat("mylesion_chacoconn_mean.mat",{"data":data.toarray()})
# (load in matlab with M=load("mylesion_chacoconn_mean.mat"); data=M.data;)

#sparse chacoconn outputs *CAN* be saved as sparse format in .mat files *ONLY IF* you convert them to np.double:
from scipy.io import savemat
savemat("mylesion_chacoconn_mean_sparse.mat",{"data":data.astype(np.double)})
# (load in matlab with M=load("mylesion_chacoconn_mean_sparse.mat"); data=M.data; or data=full(M.data); ... )

Load chacovol_*_allref.pkl or chacovol_*_allref_denom.pkl files and save as matlab .mat

#These files contain SPARSE matrices, so we need to either convert them to dense:
import pickle
import numpy as np
from scipy.io import savemat
data = pickle.load(open("mylesion_chacovol_fs86subj_allref.pkl","rb"))
savemat("mylesion_chacovol_fs86subj_allref.mat",{"allref":data.toarray()})
# (load in matlab with M=load("mylesion_chacovol_fs86subj_allref.mat"); allref=M.allref;)

#or save as sparse after converting to np.double:
savemat("mylesion_chacovol_fs86subj_allref_sparse.mat",{"allref":data.astype(np.double)})
# (load in matlab with M=load("mylesion_chacovol_fs86subj_allref.mat"); allref=M.allref; or allref=full(M.allref); ...)

Load chacoconn_*_allref.pkl or chacoconn_*_allref_denom.pkl files and save as matlab .mat

#These files contain LISTS of SPARSE matrices, so we need to either
#a) convert them all to dense and stack them into 3D:
import pickle
import numpy as np
from scipy.io import savemat
data = pickle.load(open("mylesion_chacoconn_fs86subj_allref.pkl","rb"))
allref=np.stack([x.toarray() for x in data], axis=2) #creates an 86x86x420 3D matrix
savemat("mylesion_chacoconn_fs86subj_allref.mat",{"allref":allref})
# (load in matlab with M=load("mylesion_chacoconn_fs86subj_allref.mat"); allref=M.allref;)

#or b) save as a cell array of sparse matrices to save space, but you MUST convert to np.double:
allref=[x.astype(np.double) for x in data]
savemat("mylesion_chacoconn_fs86subj_allref_sparse.mat",{"allref":allref})
# (load in matlab with M=load("mylesion_chacoconn_fs86subj_allref.mat"); allref=M.allref; allref_subj1=M.allref{1}; allref_subj1_full=full(M.allref{1}); ...)

Load chacoconn_resXmm_mean.pkl file for VOXEL-PAIRWISE output

#chacoconn_mean is a VOXELSxVOXELS pairwise matrix, where chacoconn_mean[i,:] represents
# a VOLUME of disconnectivity from voxel i to all other voxels
import pickle
import numpy as np
import nibabel as nib
from nilearn import plotting

#read the corresponding chacovol_resXmm_mean.nii.gz output which defines the output dimensions
Vref = nib.load("mylesion_chacovol_res5mm_mean.nii.gz")

data = pickle.load(open("mylesion_chacoconn_res5mm_mean.pkl","rb"))

#for this demonstration, find the voxel with the largest total disconnectivity, convert that row
#of the chacoconn output to a volume, and visualize that volume
data_sum=np.sum(data,axis=0)+np.sum(data,axis=1).T #data is upper triangular so we need to sum across both dimensions
voxel_index=np.argmax(data_sum)
newdata=data[voxel_index,:]+data[:,voxel_index].T
Vnew=nib.Nifti1Image(np.reshape(newdata.toarray(),Vref.shape),affine=Vref.affine, header=Vref.header)

#convert the voxel index to coordinates so we can include the position in the output filename
voxel_ijk=np.unravel_index(voxel_index,Vref.shape) #convert to (i,j,k) coords
voxel_xyz=(Vref.affine @ np.array(voxel_ijk+(1,)).T)[:3] #can also convert to x,y,z mm using Vref affine
voxel_ijk_string="%d_%d_%d" % (voxel_ijk)

#save glassbrain image of this volume using nilearn plotting tools
plotting.plot_glass_brain(Vnew,output_file="mylesion_chacoconn_res5mm_voxel_%s.png" % (voxel_ijk_string),colorbar=True)

#save volume as nii.gz so you can view it in nifti-viewing software (eg: fsleyes)
nib.save(Vnew,"mylesion_chacoconn_res5mm_voxel_%s.nii.gz" % (voxel_ijk_string))

#just for fun, save a second volume with JUST that seed voxel highlighted so we can overlay it later
cursordata=np.zeros(data.shape[0])
cursordata[voxel_index]=1
Vcursor=nib.Nifti1Image(np.reshape(cursordata,Vref.shape),affine=Vref.affine, header=Vref.header)
nib.save(Vcursor,"mylesion_chacoconn_res5mm_voxel_%s_cursor.nii.gz" % (voxel_ijk_string))
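
Load a *.npz sparse matrix output and save as text/tsv/csv

The filename below is only an example; substitute the actual *.npz file from your results.

#*.npz files are SciPy sparse matrices and must be converted to dense before saving as text
import numpy as np
from scipy import sparse
from scipy.io import savemat

data = sparse.load_npz("mylesion_chacoconn_example.npz")
np.savetxt("mylesion_chacoconn_example.txt", data.toarray(), delimiter="\t")

#or save to matlab .mat (sparse matrices must be converted to np.double for matlab)
savemat("mylesion_chacoconn_example.mat", {"data": data.astype(np.double)})
# (load in matlab with M=load("mylesion_chacoconn_example.mat"); data=full(M.data);)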

Website usage

  • Coming soon

Requirements

  • No software is required to submit lesion masks through our web interface. To process outputs or to run portions of this pipeline on your own, you will need the following (a quick import check is sketched after this list):
    • Python 3.6+
    • NumPy
    • SciPy
    • NiBabel
    • Nilearn >= 0.6.0 (needed mainly for visualization)
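
A quick way to confirm these packages are available (a minimal check, not part of the pipeline):

#check that the required packages import and print their versions
import numpy, scipy, nibabel, nilearn
print("numpy", numpy.__version__)
print("scipy", scipy.__version__)
print("nibabel", nibabel.__version__)
print("nilearn", nilearn.__version__)  #should be >= 0.6.0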

Details of tractography database

  • Anatomical and diffusion data preprocessed by the HCP using the Minimal Preprocessing Pipelines (Glasser 2013)
    • 3T diffusion MRI collected at 1.25mm resolution, 3 shells (b=1000,2000,3000), 90 directions per shell (See: HCP 3T Protocol)
    • Gradient nonlinearity correction, EPI distortion correction, eddy current correction, motion correction, rigid-body registration to subject anatomy
  • MRtrix3 was used to estimate a voxelwise multi-shell, multi-tissue constrained spherical deconvolution (CSD) model and then compute whole brain tractography for each HCP subject
  • 3D streamline coordinates were then transformed into discretized 1mm MNI space
    • Uses the warp file from HCP $SUBJID/MNINonLinear/xfms/standard2acpc_dc.nii.gz (Note: this is the warp used to resample images from MNI->T1 space; transforming streamline point coordinates from T1->MNI requires the warp defined in this opposite direction)
    • Round (x,y,z) coordinates to the nearest integer, and for each subject create a (7M voxels)x(5M streamlines) binary sparse matrix describing which of the 7M voxels (182*218*182=7,221,032) each of the 5M streamlines passes through
  • This set of 420 7Mx5M sparse matrices can be used to compute ChaCo scores, but it would require downloading the entire 700GB database every time the tool is run. Instead, we divide the sparse matrices into 10x10x10 voxel "chunks", where each chunk file contains the [420*1000 x 5M] sparse matrix of streamlines for all 420 subjects through that cube of MNI space. Thus, we only download the "chunks" that overlap the input mask to determine which streamlines intersect our lesion (a simplified sketch of this lookup follows below).
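
A simplified sketch of this chunk lookup is shown below. The filename is an example and the chunk indexing scheme is illustrative only, not the actual databaseprep format.

#Sketch: find which 10x10x10-voxel chunks of 1mm MNI space a lesion mask touches
import numpy as np
import nibabel as nib

lesion = nib.load("mylesion.nii.gz")  #example filename, already in 182x218x182 MNI space
mask = lesion.get_fdata() != 0

#integer chunk index along each axis = voxel index // 10
ijk = np.argwhere(mask)                  #(n,3) array of lesioned voxel indices
chunk_ijk = np.unique(ijk // 10, axis=0) #unique chunks containing lesioned voxels

n_possible = np.prod(np.ceil(np.array(mask.shape) / 10).astype(int))
print("lesion touches %d of %d possible chunks" % (len(chunk_ijk), n_possible))
#only the streamline sparse matrices for these chunks would need to be downloaded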

Parcellations

  • FreeSurfer86-subj: 86-region FreeSurfer Desikan-Killiany (DKT) cortical atlas with "aseg" subcortical regions (i.e., aparc+aseg.nii.gz) Desikan 2006, Fischl 2002
    • This atlas includes the 68 cortical DKT regions + 18 subcortical (excluding brain-stem)
    • For this atlas, each of the 420 HCP reference subjects has their own subject-specific parcellation that we use when assigning streamlines to ROIs
  • FreeSurferSUIT111-subj: 111-region atlas with 68 DKT cortical + 16 aseg subcortical + 27 cerebellar subregions from the SUIT atlas Diedrichsen 2009
    • Like FreeSurfer86, this is a subject-specific parcellation.
    • SUIT ROIs are dilated and masked to fill all FreeSurfer cerebellum voxels
  • FreeSurferSUIT191-subj: 191-region atlas with 148 Destrieux (aparc.a2009s) cortical + 16 aseg subcortical + 27 cerebellar subregions from the SUIT atlas Destrieux 2010
    • Like FreeSurfer86, this is a subject-specific parcellation.
    • SUIT ROIs are dilated and masked to fill all FreeSurfer cerebellum voxels
  • CocoMMP438-subj: 438-region atlas combining parts of several atlases:
    • 358 cortical ROIs from the HCP multi-modal parcellation (Glasser 2016)
    • 12 subcortical ROIs from aseg, adjusted by FSL's FIRST tool (Patenaude 2011)
    • 30 thalamic nuclei from FreeSurfer7 Iglesias 2018 (50 nuclei merged down to 30 to remove the smallest nuclei, as with AAL3v1)
    • 12 subcortical nuclei from AAL3v1 Rolls 2020 (VTA L/R, SN_pc L/R, SN_pr L/R, Red_N L/R, LC L/R, Raphe D/M)
    • 26 cerebellar lobes from AAL3v1 (9 left, 9 right, 8 vermis)
    • Like FreeSurfer86-subj, this is a subject-specific parcellation
  • CocoMMPsuit439-subj: 439-region atlas from CocoMMP438, but with 27 SUIT cerebellar subregions instead of 26 from AAL3v1
    • Like FreeSurfer86, this is a subject-specific parcellation
  • CocoYeo143-subj: 143-region atlas combining Schaefer100 cortical ROIs + 16 aseg subcortical + 27 cerebellar subregions from SUIT Schaefer 2018
  • CocoYeo243-subj: 243-region atlas combining Schaefer200 cortical ROIs + 16 aseg subcortical + 27 cerebellar subregions from SUIT Schaefer 2018
  • CocoYeo443-subj: 443-region atlas combining Schaefer400 cortical ROIs + 16 aseg subcortical + 27 cerebellar subregions from SUIT Schaefer 2018
  • CocoLaus157-subj: 157-region atlas combining Lausanne116 (114 cortical ROIs) + 16 aseg subcortical + 27 cerebellar subregions from SUIT Daducci 2012
    • Corpus callosum is excluded from the Lausanne116
    • Lausanne ROIs are re-ordered to group subparcels of the original Desikan-Killiany gyral parcels
  • CocoLaus262-subj: 262-region atlas combining Lausanne221 (219 cortical ROIs) + 16 aseg subcortical + 27 cerebellar subregions from SUIT Daducci 2012
    • Corpus callosum is excluded from the Lausanne221
    • Lausanne221 ROIs are re-ordered to group subparcels of the original Desikan-Killiany gyral parcels
  • CocoLaus491-subj: 491-region atlas combining Lausanne450 (448 cortical ROIs) + 16 aseg subcortical + 27 cerebellar subregions from SUIT Daducci 2012
    • Corpus callosum is excluded from the Lausanne450
    • Lausanne450 ROIs are re-ordered to group subparcels of the original Desikan-Killiany gyral parcels
  • FreeSurfer86-avg, FreeSurferSUIT111-avg, CocoMMP438-avg, CocoMMPsuit439-avg: Same regions as -subj but defined as a single group-level MNI volume
    • Each subject parcellation was mode-dilated by 1mm, then we computed the mode across all subjects
    • These averaged atlases are deprecated, replaced by subject-specific versions
  • AAL: 116-region Automated Anatomical Labeling atlas from Tzourio-Mazoyer 2002
  • AAL3: 166-region AAL3v1 atlas from Rolls 2020.
    • Updated from AAL to include 30 high-resolution thalamic nuclei and 12 subcortical nuclei
  • CC200: 200-region whole-brain cortical+subcortical parcellation from Craddock 2012
  • CC400: 400-region (actually 392) cortical+subcortical parcellation from Craddock 2012
  • Shen268: 268-region cortical+subcortical atlas from Shen 2013
  • Yeo7: 7-network CORTICAL-ONLY parcellation from Yeo 2011
  • Yeo17: 17-network CORTICAL-ONLY parcellation from Yeo 2011
  • Custom: Any 1mm MNI (182x218x182) parcellation volume
    • Note: The pipeline treats ROI labels as a sequential list. For example, a parcellation containing only label values [10,20,30,40] will produce a 4x1 output, or a 4x4 output in pairwise mode (see the sketch after this list)
  • See files in website/atlases/
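
For a custom parcellation with non-sequential label values, the sketch below shows one way to recover the mapping between output row/column indices and the original label values. It assumes outputs follow the sorted nonzero labels, as described in the note above; the filename is an example.

#Sketch: map output indices back to parcellation label values
import numpy as np
import nibabel as nib

parc = nib.load("myparcellation.nii.gz")  #example custom 182x218x182 parcellation
labels = np.unique(parc.get_fdata().astype(int))
labels = labels[labels != 0]  #drop background

#e.g., label values [10,20,30,40] -> 4 output rows/columns
print(len(labels), "ROIs")
for row, label_value in enumerate(labels):
    print("output index %d <- label %d" % (row, label_value))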

[Figure: available parcellations: FS86, FS111, FS191, CocoMMP438, CocoMMPSUIT439, CocoYeo143, CocoYeo243, CocoYeo443, CocoLaus157, CocoLaus262, CocoLaus491, AAL, AAL3, CC200, CC400, Shen268, Yeo7, Yeo17]