%pylab inline
Welcome to pylab, a matplotlib-based Python environment [backend: module://IPython.kernel.zmq.pylab.backend_inline]. For more information, type 'help(pylab)'.
import numpy as np
import matplotlib.pyplot as plt
Make sure you have downloaded the MNI templates archive from http://www.bic.mni.mcgill.ca/~vfonov/icbm/2009/mni_icbm152_nlin_asym_09a_nifti.zip
Unzip it into the directory containing this notebook, so that you have a directory like this:
import os
MNI_PATH = 'mni_icbm152_nlin_asym_09a'
os.listdir(MNI_PATH)
['mni_icbm152_wm_tal_nlin_asym_09a.nii', 'mni_icbm152_t1_tal_nlin_asym_09a.nii', 'mni_icbm152_t2_tal_nlin_asym_09a.nii', 'mni_icbm152_csf_tal_nlin_asym_09a.nii', 'mni_icbm152_pd_tal_nlin_asym_09a.nii', 'mni_icbm152_t1_tal_nlin_asym_09a_face_mask.nii', 'mni_icbm152_t1_tal_nlin_asym_09a_mask.nii', 'mni_icbm152_t1_tal_nlin_asym_09a_eye_mask.nii', 'mni_icbm152_gm_tal_nlin_asym_09a.nii']
import nibabel as nib
t2_fname = os.path.join(MNI_PATH, 'mni_icbm152_t2_tal_nlin_asym_09a.nii')
t2_img = nib.load(t2_fname)
plt.imshow(t2_img.get_data()[:, :, 90], cmap="gray")
<matplotlib.image.AxesImage at 0x3e85290>
We get an example structural image from the Haxby dataset:
DATA_PATH = os.path.join(os.path.expanduser('~'), 'data', 'ds105')
os.listdir(DATA_PATH)
['study_key.txt', 'release_history.txt', 'task_key.txt', 'models', 'references.txt', 'sub005', 'license.txt', 'sub004', 'sub003', 'scan_key.txt', 'sub001', 'sub006', 'README', 'sub002']
anat_fname = os.path.join(DATA_PATH, 'sub001', 'anatomy', 'highres001.nii.gz')
anat_img = nib.load(anat_fname)
plt.imshow(anat_img.get_data()[:, :, 128], cmap="gray")
<matplotlib.image.AxesImage at 0x50b74d0>
The image affines give an approximate match between the images. We can use them with the nipy resample
function:
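Resampling with the affines alone works because each affine maps voxel indices to a shared world (scanner/MNI) space. A minimal numpy sketch of the coordinate mapping, using made-up 4x4 affines (not the real MNI or Haxby headers): for each voxel in the target grid, go voxel → world via the target affine, then world → voxel via the inverse of the source affine.

```python
import numpy as np

# Hypothetical 4x4 affines mapping voxel indices (i, j, k, 1) to
# world coordinates (x, y, z, 1); the real headers differ.
t2_affine = np.array([[1., 0., 0., -98.],
                      [0., 1., 0., -134.],
                      [0., 0., 1., -72.],
                      [0., 0., 0., 1.]])
anat_affine = np.array([[1.2, 0., 0., -90.],
                        [0., 1.2, 0., -120.],
                        [0., 0., 1.2, -60.],
                        [0., 0., 0., 1.]])

# Voxel-to-voxel mapping: target (t2) voxel -> world -> source (anat) voxel.
vox2vox = np.linalg.inv(anat_affine).dot(t2_affine)

ijk = np.array([10., 20., 90., 1.])  # a voxel in the t2 grid
source_vox = vox2vox.dot(ijk)        # where to sample in the anat grid
```

A resampler does this for every voxel in the target grid and interpolates the source image at the resulting (generally non-integer) coordinates.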
import nipy.algorithms.registration as nar
# Make a transform that does nothing, so that only the image affines register the datasets
empty_transform = nar.Affine()
empty_transform.param
array([ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])
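The 12 zeros are the parameters of an identity affine: 3 translations, 3 rotations, 3 scalings, and 3 pre-rotations (shears expressed as a rotation applied before scaling), as the optimizer printout below shows. A rough numpy sketch of composing such a transform (nipy's exact parameterization and conventions may differ; this just illustrates the structure):

```python
import numpy as np

def rotation(rx, ry, rz):
    """Rotation about x, then y, then z (one common Euler convention)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz.dot(Ry).dot(Rx)

def affine_from_params(trans, rot, scale, pre_rot):
    """Compose a 4x4 affine: translation * rotation * scaling * pre-rotation."""
    A = np.eye(4)
    A[:3, :3] = rotation(*rot).dot(np.diag(scale)).dot(rotation(*pre_rot))
    A[:3, 3] = trans
    return A

# Zero translations / rotations and unit scalings give the identity,
# matching the "transform that does nothing" above.
identity = affine_from_params([0, 0, 0], [0, 0, 0], [1, 1, 1], [0, 0, 0])
```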
anat_resampled = nar.resample(anat_img, empty_transform, t2_img)
fig, axes = plt.subplots(1, 2)
fig.set_size_inches(10.5, 18.5)
axes[0].imshow(anat_resampled.get_data()[:, :, 90], cmap="gray")
axes[1].imshow(t2_img.get_data()[:, :, 90], cmap="gray")
<matplotlib.image.AxesImage at 0x682fd10>
Let's try to match the images by optimizing the affine parameters:
register_obj = nar.HistogramRegistration(from_img=t2_img, to_img=anat_img, similarity='cc')
transform = register_obj.optimize('affine')
Initial guess...
translation : [ 0.  0.  0.]
rotation    : [ 0.  0.  0.]
scaling     : [ 1.  1.  1.]
pre-rotation: [ 0.  0.  0.]
Optimizing using fmin_powell
translation : [  0.381966     2.618034   -13.99340389]
rotation    : [-0.00618034  0.00381966  0.00051036]
scaling     : [ 0.80490673  0.85790459  0.98394986]
pre-rotation: [ 0.01       -0.00618034 -0.00082465]
cc = 0.485196324981
translation : [  0.29907506   3.618034   -12.99340389]
rotation    : [-0.02236068  0.00763932  0.00015709]
scaling     : [ 0.84486806  0.8526188   0.97788747]
pre-rotation: [ 0.00697268  0.00381966 -0.01700499]
cc = 0.498781644116
translation : [  0.01803112   4.618034   -10.37536989]
rotation    : [-0.02854102  0.00145898  0.00024264]
scaling     : [ 0.84810133  0.85032054  0.93026321]
pre-rotation: [-0.00679614  0.0027342  -0.01318533]
cc = 0.510940778988
translation : [-0.8758703   5.58352867 -1.78751664]
rotation    : [ 0.09428999  0.03936903 -0.0057167 ]
scaling     : [ 0.85978984  0.84764309  0.81768701]
pre-rotation: [-0.0585486  -0.00625045 -0.00281184]
cc = 0.559584561113
translation : [-0.57145243  5.24142355  1.38644404]
rotation    : [ 0.22286652  0.08653025 -0.00565998]
scaling     : [ 0.86141546  0.85552463  0.80649954]
pre-rotation: [-0.03616748 -0.0177068  -0.00450105]
cc = 0.587131403322
translation : [-0.15885378  4.34415743  7.08318193]
rotation    : [ 0.3481862   0.11610085 -0.01178017]
scaling     : [ 0.85494902  0.8718448   0.79352719]
pre-rotation: [-0.06945918 -0.02514347 -0.01212177]
cc = 0.605477323115
translation : [ 0.20393626  5.42209146  7.10093287]
rotation    : [ 0.34936034  0.11768747 -0.02332207]
scaling     : [ 0.85428722  0.88051781  0.79012628]
pre-rotation: [-0.08053052 -0.03156327 -0.00800943]
cc = 0.611674423577
translation : [ 0.50143358  6.59281281  6.26745388]
rotation    : [ 0.32875452  0.12152869 -0.0367063 ]
scaling     : [ 0.85167579  0.88830315  0.79000533]
pre-rotation: [-0.08565537 -0.02641654 -0.00431845]
cc = 0.614238316669
Optimization terminated successfully.
         Current function value: -0.614238
         Iterations: 8
         Function evaluations: 643
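The optimizer is maximizing the 'cc' (correlation coefficient) similarity between intensities in the overlapping voxels. nipy computes this from a joint intensity histogram, but a direct numpy stand-in conveys the idea (this is a sketch of the measure, not nipy's implementation):

```python
import numpy as np

def correlation_similarity(a, b):
    """Pearson correlation between the voxel intensities of two
    same-shaped arrays -- the quantity the 'cc' similarity tracks."""
    a = np.asarray(a, float).ravel()
    b = np.asarray(b, float).ravel()
    a = a - a.mean()
    b = b - b.mean()
    return a.dot(b) / np.sqrt(a.dot(a) * b.dot(b))

# A perfect linear intensity relationship scores (approximately) 1.0.
x = np.linspace(0, 1, 100)
score = correlation_similarity(x, 2 * x + 3)
```

Higher scores mean a better linear intensity relationship between the images, which is why the printout above climbs from cc = 0.485 to cc = 0.614 as the transform improves.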
anat_better_resampled = nar.resample(anat_img, transform, t2_img)
fig, axes = plt.subplots(1, 2)
fig.set_size_inches(10.5, 18.5)
axes[0].imshow(anat_better_resampled.get_data()[:, :, 90], cmap="gray")
axes[1].imshow(t2_img.get_data()[:, :, 90], cmap="gray")
<matplotlib.image.AxesImage at 0x6842110>
Can we do better than this?