Three-level analysis with FSL and ANTs in Nipype. Part 1.
In a series of posts, I plan to talk about how to run a three-level analysis with FSL and ANTs. We will use ANTs for registration, FSL for the analysis itself, and nipype to put everything together. I will be heavily utilizing code from the nipype examples, changing it when necessary. Again, this is not original work; it is rather putting everything together and modifying it where appropriate.
To illustrate the analysis, I will use our study (Paniukov & Davis, NeuroImage, 2018) on category learning (the code is available here). It had two tasks in counterbalanced order (task 1 and task 2). Each task was scanned over four runs. In the code, I will put comments that explain what is going on and why I apply exceptions for some of my subjects or changes to the original nipype code.
Registration with ANTs
In this post, we will be doing the registration with ANTs. The overall idea is to run the registration once and then run as many analyses as you want, applying this registration.
Preparation and preprocessing
First, let’s import some interfaces and libraries we will be using below.
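A minimal set of imports for the steps below could look like this:

```python
import os
import sys

import nipype.pipeline.engine as pe              # Workflow, Node, MapNode
import nipype.interfaces.fsl as fsl               # FSL interfaces
import nipype.interfaces.ants as ants             # ANTs interfaces
import nipype.interfaces.utility as util          # IdentityInterface, Function, Merge
import nipype.interfaces.io as nio                # DataGrabber, DataSink
from nipype.interfaces.c3 import C3dAffineTool    # FSL affine -> ITK conversion
```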
Assign output type for FSL, so it will create files .nii.gz:
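One line takes care of it:

```python
fsl.FSLCommand.set_default_output_type('NIFTI_GZ')
```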
Now, let's define some variables specific to our project. First, we will need the root project directory, where all our data live.
Then, we will also need a working directory, where we can put all the temporary files created by nipype. This directory should be specific to each analysis (alternatively, you can change the name of the workflow for each analysis) and may be deleted as soon as the analysis is finished. Also, if your analyses use the same working directory and the node names overlap, they will most likely not run properly.
I prefer to pass the subjects to run from the command line, because it makes it easier to parallelize the jobs.
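Putting these three pieces together (the paths here are placeholders for your own layout):

```python
# Root project directory with all the data (placeholder path)
proj_dir = '/path/to/my_project'

# Working directory for nipype's temporary files; can be deleted when done
work_dir = os.path.join(proj_dir, 'workdir', 'registration')

# Subject to run, taken from the command line,
# e.g. `python registration.py Subject001`
subj_list = sys.argv[1]
```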
Now, let’s define the workflow.
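For example (the workflow name is arbitrary, just keep it unique across analyses that share a working directory):

```python
reg_wf = pe.Workflow(name='registration', base_dir=work_dir)
```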
… and get a subject’s id:
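An IdentityInterface node with an iterable over the subject id does the job:

```python
infosource = pe.Node(util.IdentityInterface(fields=['subject_id']),
                     name='infosource')
infosource.iterables = ('subject_id', [subj_list])
```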
Again, I am using just one subject per running job, so the jobs are already in parallel. But in case you want to put all of them into this script and run them in parallel (may the power of CPU be with you!), this is the place to do it (e.g., put ["Subject001", "Subject002"] instead of [subj_list], and remove the subj_list variable above).
Here I use a function to get information for a specific subject. It may be different kinds of information, such as run ids, condition information, etc. In the code below, some subjects had no run 1 (it was removed because of a visual spike), and some had no last runs (they did not finish the experiment).
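A sketch of such a function wrapped in a Function node; the subject ids and run counts below are made up for illustration:

```python
def get_run_ids(subject_id):
    """Return the run numbers available for a subject.

    The exceptions are illustrative: one subject lost run 1 to a visual
    spike, another did not finish the last runs.
    """
    runs = list(range(1, 9))          # 2 tasks x 4 runs
    if subject_id == 'Subject005':    # hypothetical id: run 1 was removed
        runs = runs[1:]
    elif subject_id == 'Subject017':  # hypothetical id: did not finish
        runs = runs[:6]
    return runs

subjectinfo = pe.Node(util.Function(input_names=['subject_id'],
                                    output_names=['runs'],
                                    function=get_run_ids),
                      name='subjectinfo')
reg_wf.connect(infosource, 'subject_id', subjectinfo, 'subject_id')
```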
Now, it is time to get all the files we need from the hard drive.
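A DataGrabber along these lines would do it; the file templates are placeholders and need to match your own directory structure:

```python
datasource = pe.Node(nio.DataGrabber(infields=['subject_id', 'runs'],
                                     outfields=['func', 'anat'],
                                     base_directory=proj_dir,
                                     template='*',
                                     sort_filelist=True),
                     name='datasource')
datasource.inputs.field_template = {
    'func': '%s/func/run%02d.nii.gz',        # placeholder layout
    'anat': '%s/anat/T1w_brain.nii.gz'}      # skull-stripped anatomical
datasource.inputs.template_args = {
    'func': [['subject_id', 'runs']],
    'anat': [['subject_id']]}
reg_wf.connect(infosource, 'subject_id', datasource, 'subject_id')
reg_wf.connect(subjectinfo, 'runs', datasource, 'runs')
```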
OK, we have grabbed the files; now we can get the middle volume from each run for the functional-to-anatomical registration.
Let's define the interfaces. First, convert the functional images to float representation. Since there can be more than one functional run, we use a MapNode to convert each run.
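Roughly as in the nipype examples:

```python
img2float = pe.MapNode(fsl.ImageMaths(out_data_type='float',
                                      op_string='',
                                      suffix='_dtype'),
                       iterfield=['in_file'],
                       name='img2float')
reg_wf.connect(datasource, 'func', img2float, 'in_file')
```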
Then, extract the middle volume of each run as the reference, with a small helper function that returns the index of the middle volume.
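In the spirit of the nipype examples, an ExtractROI MapNode plus a helper used as a connection function:

```python
def get_middle_indices(func_files):
    """Return the index of the middle volume for each run."""
    from nibabel import load
    if not isinstance(func_files, list):
        func_files = [func_files]
    return [load(f).shape[3] // 2 for f in func_files]

extract_ref = pe.MapNode(fsl.ExtractROI(t_size=1),
                         iterfield=['in_file', 't_min'],
                         name='extract_ref')
reg_wf.connect(img2float, 'out_file', extract_ref, 'in_file')
reg_wf.connect(img2float, ('out_file', get_middle_indices),
               extract_ref, 't_min')
```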
Register functionals to anatomical space
Estimate the tissue classes from the anatomical image.
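FSL's FAST takes care of the segmentation:

```python
fast = pe.Node(fsl.FAST(), name='fast')
reg_wf.connect(datasource, 'anat', fast, 'in_files')
```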
Binarize the segmentation.
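We only need the white matter map, thresholded and binarized, for the BBR step below:

```python
def pick_wm(files):
    """The last partial volume file from FAST is the white matter map."""
    return files[-1]

binarize = pe.Node(fsl.ImageMaths(op_string='-nan -thr 0.5 -bin',
                                  suffix='_bin'),
                   name='binarize')
reg_wf.connect(fast, ('partial_volume_files', pick_wm), binarize, 'in_file')
```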
Calculate a rigid transform from the example_func image to the anatomical image.
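A plain 6-dof FLIRT, one per run:

```python
mean2anat = pe.MapNode(fsl.FLIRT(dof=6),
                       iterfield=['in_file'],
                       name='mean2anat')
reg_wf.connect(extract_ref, 'roi_file', mean2anat, 'in_file')
reg_wf.connect(datasource, 'anat', mean2anat, 'reference')
```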
Now use the BBR cost function to improve the transform.
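The same FLIRT call, now with the BBR cost, the white matter mask, and the rigid matrix from the previous step as the starting point:

```python
mean2anatbbr = pe.MapNode(fsl.FLIRT(dof=6, cost='bbr'),
                          iterfield=['in_file', 'in_matrix_file'],
                          name='mean2anatbbr')
mean2anatbbr.inputs.schedule = os.path.join(os.getenv('FSLDIR'),
                                            'etc/flirtsch/bbr.sch')
reg_wf.connect(extract_ref, 'roi_file', mean2anatbbr, 'in_file')
reg_wf.connect(binarize, 'out_file', mean2anatbbr, 'wm_seg')
reg_wf.connect(datasource, 'anat', mean2anatbbr, 'reference')
reg_wf.connect(mean2anat, 'out_matrix_file', mean2anatbbr, 'in_matrix_file')
```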
Convert the BBR transformation matrix to the ANTs ITK format for further reuse.
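C3dAffineTool converts the FSL matrix into something ANTs understands (you will need the Convert3D tools installed):

```python
convert2itk = pe.MapNode(C3dAffineTool(fsl2ras=True, itk_transform=True),
                         iterfield=['transform_file', 'source_file'],
                         name='convert2itk')
reg_wf.connect(mean2anatbbr, 'out_matrix_file', convert2itk, 'transform_file')
reg_wf.connect(extract_ref, 'roi_file', convert2itk, 'source_file')
reg_wf.connect(datasource, 'anat', convert2itk, 'reference_file')
```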
We are done with the registration of functional files for now. We will use the affine matrix later to transform the example_func to standard space.
Register anatomical to standard space
You can find a nice crash course on ANTs registration here. Oh, and keep in mind that this example uses 12 CPU threads for a single-subject registration. If you want to run 25 subjects in this script in parallel, adjust reg.inputs.num_threads to a number your computer can handle.
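A sketch of the registration node; the parameters below follow the nipype fmri_ants_openfmri example, and the target template is assumed to be FSL's MNI152 2mm brain, so swap in whatever standard space you actually use:

```python
# Assumed target template (an assumption, not necessarily what you use)
target_brain = os.path.join(os.getenv('FSLDIR'),
                            'data/standard/MNI152_T1_2mm_brain.nii.gz')

reg = pe.Node(ants.Registration(), name='antsRegister')
reg.inputs.fixed_image = target_brain
reg.inputs.output_transform_prefix = 'anat2standard_'
reg.inputs.dimension = 3
reg.inputs.transforms = ['Rigid', 'Affine', 'SyN']
reg.inputs.transform_parameters = [(0.1,), (0.1,), (0.2, 3.0, 0.0)]
reg.inputs.number_of_iterations = [[10000, 11110, 11110]] * 2 + [[100, 30, 20]]
reg.inputs.write_composite_transform = True
reg.inputs.collapse_output_transforms = True
reg.inputs.initial_moving_transform_com = True
reg.inputs.metric = ['Mattes'] * 2 + [['Mattes', 'CC']]
reg.inputs.metric_weight = [1] * 2 + [[0.5, 0.5]]
reg.inputs.radius_or_number_of_bins = [32] * 2 + [[32, 4]]
reg.inputs.sampling_strategy = ['Regular'] * 2 + [[None, None]]
reg.inputs.sampling_percentage = [0.3] * 2 + [[None, None]]
reg.inputs.convergence_threshold = [1.e-8] * 2 + [-0.01]
reg.inputs.convergence_window_size = [20] * 2 + [5]
reg.inputs.smoothing_sigmas = [[4, 2, 1]] * 2 + [[1, 0.5, 0]]
reg.inputs.sigma_units = ['vox'] * 3
reg.inputs.shrink_factors = [[3, 2, 1]] * 2 + [[4, 2, 1]]
reg.inputs.use_histogram_matching = [False] * 2 + [True]
reg.inputs.winsorize_lower_quantile = 0.005
reg.inputs.winsorize_upper_quantile = 0.995
reg.inputs.output_warped_image = True
reg.inputs.output_inverse_warped_image = True
reg.inputs.num_threads = 12
reg_wf.connect(datasource, 'anat', reg, 'moving_image')
```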
Warp functionals to standard space
Strictly speaking, we will be warping only the middle volume of each run (example_func) to standard space, for quality assessment.
Concatenate the affine and ants transforms into a list.
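One Merge node per run, with the non-linear anatomical-to-standard transform first and the run's affine second (antsApplyTransforms applies the list in reverse order, so the functional-to-anatomical affine is actually applied first):

```python
mergexfm = pe.MapNode(util.Merge(2), iterfield=['in2'], name='mergexfm')
reg_wf.connect(reg, 'composite_transform', mergexfm, 'in1')
reg_wf.connect(convert2itk, 'itk_transform', mergexfm, 'in2')
```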
Transform the example_func image, first to anatomical space and then to the target (standard) space.
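ApplyTransforms then warps each example_func with its pair of transforms:

```python
warp_example_func = pe.MapNode(
    ants.ApplyTransforms(interpolation='Linear',
                         invert_transform_flags=[False, False]),
    iterfield=['input_image', 'transforms'],
    name='warp_example_func')
warp_example_func.inputs.reference_image = target_brain
reg_wf.connect(extract_ref, 'roi_file', warp_example_func, 'input_image')
reg_wf.connect(mergexfm, 'out', warp_example_func, 'transforms')
```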
Save the data and run
We need to save all our data, don't we?! Here we will save the warped anatomical image from the ANTs registration and its inverse, the forward and inverse non-linear ANTs transforms, the example_func images transformed to standard space, the functional-to-anatomical matrices, the functional-to-standard transforms, and the example_func itself, just in case we need it in the future (for computing beta series in an MVPA analysis, for example).
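A DataSink collects everything; the output folder name and the connection labels are just my choice here:

```python
datasink = pe.Node(nio.DataSink(base_directory=os.path.join(proj_dir,
                                                            'derivatives')),
                   name='datasink')
reg_wf.connect(infosource, 'subject_id', datasink, 'container')
reg_wf.connect(reg, 'warped_image', datasink, 'reg.@anat2standard')
reg_wf.connect(reg, 'inverse_warped_image', datasink, 'reg.@standard2anat')
reg_wf.connect(reg, 'composite_transform', datasink, 'reg.@anat2standard_xfm')
reg_wf.connect(reg, 'inverse_composite_transform', datasink,
               'reg.@standard2anat_xfm')
reg_wf.connect(warp_example_func, 'output_image', datasink,
               'reg.@func2standard')
reg_wf.connect(convert2itk, 'itk_transform', datasink, 'reg.@func2anat_xfm')
reg_wf.connect(mergexfm, 'out', datasink, 'reg.@func2standard_xfm')
reg_wf.connect(extract_ref, 'roi_file', datasink, 'reg.@example_func')
```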
Shall we run it? For a single-CPU computer, remove plugin='MultiProc', set reg.inputs.num_threads = 1, and do not try to run subjects in parallel!
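Finally:

```python
reg_wf.write_graph(graph2use='colored')
reg_wf.run(plugin='MultiProc', plugin_args={'n_procs': 12})
```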
Quality assessment
For the quality assessment, review all the warped anatomical images and the functional images transformed to standard space. For the functional images, you can also use FSL's slices in a terminal (in bash, not in python!) as:
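For example (the file names here are illustrative):

```bash
slices MNI152_T1_2mm_brain.nii.gz Subject001_example_func2standard.nii.gz
```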
Reference: Paniukov, D., & Davis, T. (2018). The evaluative role of rostrolateral prefrontal cortex in rule-based category learning. NeuroImage, 166, 19–31.