# Dual-Echo Denoising with nilearn
Dual-echo fMRI leverages one of the same principles that motivates multi-echo fMRI: BOLD contrast increases with echo time, so earlier echoes are proportionally more dominated by non-BOLD fluctuations than later ones. At a short enough echo time (< 5 ms at 3T), the signal is almost entirely non-BOLD. For denoising, this means that if you acquire data at both an early echo time and a more typical one (~30 ms at 3T), you can regress the early echo’s time series out of the later echo’s time series, voxel by voxel, and remove a large portion of the non-BOLD noise.
Additionally, dual-echo fMRI comes at essentially no cost in temporal or spatial resolution, unlike multi-echo fMRI. For multi-echo denoising to work, you need at least one echo time later than the typical echo time, which means decreasing your temporal resolution, all else being equal. Dual-echo acquisition, by contrast, only adds a shorter echo, which falls within what is essentially “dead time” in the pulse sequence.
Dual-echo denoising was originally proposed in Bright & Murphy (2013).
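To make the regression step concrete, here is a minimal NumPy sketch for a single simulated voxel. The variable names and the noise model are illustrative assumptions, not part of the dataset or code used in the rest of this chapter.

import numpy as np

rng = np.random.default_rng(0)
n_vols = 200

# One voxel's simulated time series: shared non-BOLD fluctuations plus
# a BOLD-like oscillation that only appears in the later echo.
non_bold = rng.standard_normal(n_vols)
bold = np.sin(np.linspace(0, 8 * np.pi, n_vols))
early_echo = 2.0 * non_bold + 0.1 * rng.standard_normal(n_vols)
late_echo = 1.5 * non_bold + bold + 0.1 * rng.standard_normal(n_vols)

# Regress the early echo (plus an intercept) out of the late echo and
# keep the residuals as the denoised time series.
design = np.column_stack([np.ones(n_vols), early_echo])
betas, *_ = np.linalg.lstsq(design, late_echo, rcond=None)
denoised = late_echo - design @ betas

# The residuals should correlate strongly with the simulated BOLD signal.
print(np.corrcoef(denoised, bold)[0, 1])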
import os
import matplotlib.pyplot as plt
from book_utils import regress_one_image_out_of_another
from myst_nb import glue
from nilearn import plotting
from repo2data.repo2data import Repo2Data
# Install the data if running locally, or point to cached data if running on neurolibre
DATA_REQ_FILE = os.path.join("../binder/data_requirement.json")
# Download data
repo2data = Repo2Data(DATA_REQ_FILE)
data_path = repo2data.install()
data_path = os.path.abspath(data_path[0])
# Paths to the preprocessed first- and second-echo data and the brain mask
te1_img = os.path.join(
    data_path,
    "sub-04570/func/sub-04570_task-rest_echo-1_space-scanner_desc-partialPreproc_bold.nii.gz",
)
te2_img = os.path.join(
    data_path,
    "sub-04570/func/sub-04570_task-rest_echo-2_space-scanner_desc-partialPreproc_bold.nii.gz",
)
mask_img = os.path.join(
    data_path, "sub-04570/func/sub-04570_task-rest_space-scanner_desc-brain_mask.nii.gz"
)

# Regress the first echo's time series out of the second echo's, voxel by voxel
denoised_img = regress_one_image_out_of_another(te2_img, te1_img, mask_img)
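The regress_one_image_out_of_another helper lives in this book's book_utils module, and its implementation is not reproduced here. As a rough sketch, assuming it performs a simple per-voxel ordinary least squares fit, an equivalent function could be written with nilearn's NiftiMasker along the following lines (the name regress_out_voxelwise and the fitting details are assumptions, not the book's actual code).

import numpy as np
from nilearn.maskers import NiftiMasker

def regress_out_voxelwise(target_img, nuisance_img, mask_img):
    """Regress each voxel of nuisance_img out of the matching voxel of target_img."""
    masker = NiftiMasker(mask_img=mask_img)
    target = masker.fit_transform(target_img)  # (n_volumes, n_voxels)
    nuisance = masker.transform(nuisance_img)  # same shape

    # Mean-center both series, fit a per-voxel slope, and keep the residuals
    # (plus the original mean) as the denoised data. Assumes no nuisance
    # voxel inside the mask has a constant time series.
    target_mean = target.mean(axis=0)
    target_c = target - target_mean
    nuisance_c = nuisance - nuisance.mean(axis=0)
    betas = (nuisance_c * target_c).sum(axis=0) / (nuisance_c**2).sum(axis=0)
    denoised = target_c - nuisance_c * betas + target_mean
    return masker.inverse_transform(denoised)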
# Carpet plots of the first echo, the second echo, and the denoised data
fig, axes = plt.subplots(figsize=(16, 16), nrows=3)
plotting.plot_carpet(te1_img, axes=axes[0], figure=fig)
axes[0].set_title("First Echo (BAD)", fontsize=20)
plotting.plot_carpet(te2_img, axes=axes[1], figure=fig)
axes[1].set_title("Second Echo (GOOD)", fontsize=20)
plotting.plot_carpet(denoised_img, axes=axes[2], figure=fig)
axes[2].set_title("Denoised Data (GREAT)", fontsize=20)

# Hide the x-axes of the top two panels; only the bottom panel needs time labels
axes[0].xaxis.set_visible(False)
axes[1].xaxis.set_visible(False)
axes[0].spines["bottom"].set_visible(False)
axes[1].spines["bottom"].set_visible(False)
fig.tight_layout()
glue("figure_dual_echo_results", fig, display=False)