How to use the fMRI data

What is available, where to download it, and how to use it with the analysis code in this repository.

What’s in the Figshare collection (fMRI)

The THINGS-data Figshare collection groups all modalities. For fMRI, the main items are:

  • Single-trial response estimates — ResponseData.h5, StimulusMetadata.csv, VoxelMetadata.csv per subject (trials × voxels, trial/image info, ROI labels, noise ceilings, pRF parameters).
  • Brain masks — Subject masks in T1w space for mapping results back to volumetric space.
  • Preprocessed BOLD, cortical surface maps, and other pipeline outputs (for re-running GLMs or advanced analyses).
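The voxel metadata is handy for subsetting: for example, restricting analyses to one ROI or to voxels with a high noise ceiling. A minimal sketch with a synthetic table standing in for a VoxelMetadata.csv (the column names "roi" and "nc" here are assumptions; check the real CSV header for the actual names):

```python
import pandas as pd

# Synthetic stand-in for sub-XX_VoxelMetadata.csv; real column names may differ.
voxdata = pd.DataFrame({
    "voxel_id": range(6),
    "roi": ["V1", "V1", "FFA", "FFA", "PPA", "PPA"],  # ROI label (assumed column name)
    "nc": [0.1, 0.4, 0.6, 0.2, 0.5, 0.05],            # noise ceiling (assumed column name)
})

# Keep FFA voxels whose noise ceiling exceeds a threshold
ffa = voxdata[(voxdata["roi"] == "FFA") & (voxdata["nc"] > 0.3)]
print(ffa["voxel_id"].tolist())  # [2]
```

The same pattern works for any boolean combination of metadata columns, e.g. pRF eccentricity limits.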

Which download for which goal

  • Encoding models, ROI analyses, single-trial analyses — single-trial response estimates (required); optionally the brain masks for 3D/volumetric plots.
  • Volumetric maps, plotting on the brain — single-trial estimates plus brain masks.
  • Re-running the GLM or the full preprocessing pipeline — raw fMRI data (OpenNeuro ds004192), plus preprocessed BOLD and other derivatives from the Figshare collection as needed.

Directory layout for ThingsmriLoader

For encoding, ROIs, and single-trial analyses, put the downloaded files under one root directory with this structure:

your_derivatives_root/
├── betas_csv/
│   ├── sub-01_ResponseData.h5
│   ├── sub-01_StimulusMetadata.csv
│   ├── sub-01_VoxelMetadata.csv
│   ├── sub-02_ResponseData.h5
│   └── ...
└── brainmasks/          (optional)
    ├── sub-01_space-T1w_brainmask.nii.gz
    ├── sub-02_space-T1w_brainmask.nii.gz
    └── ...
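Before loading, it can help to verify that the expected files are in place. A small sketch using only the filename patterns shown above (the helper name check_layout is hypothetical, not part of the repository):

```python
from pathlib import Path

def check_layout(root, subject="01", with_masks=False):
    """Map each expected file path to whether it exists on disk."""
    root = Path(root)
    expected = [
        root / "betas_csv" / f"sub-{subject}_ResponseData.h5",
        root / "betas_csv" / f"sub-{subject}_StimulusMetadata.csv",
        root / "betas_csv" / f"sub-{subject}_VoxelMetadata.csv",
    ]
    if with_masks:
        expected.append(root / "brainmasks" / f"sub-{subject}_space-T1w_brainmask.nii.gz")
    return {str(p): p.exists() for p in expected}

print(check_layout("your_derivatives_root", "01"))
```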

Unzip the single-trial package so that betas_csv/ contains the per-subject files. If you download brain masks, place them in brainmasks/. Then in Python:

from thingsmri.dataset import ThingsmriLoader

loader = ThingsmriLoader(thingsmri_dir="path/to/your_derivatives_root")
responses, stimdata, voxdata = loader.load_responses("01")
# Optional: loader.get_brainmask("01") for volumetric plotting
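Each image was presented multiple times, so a common first step is to average the single-trial betas across repetitions of the same image. A sketch with synthetic data shaped like the loader's output (trials × voxels responses plus per-trial stimulus info; the column name "stimulus" is an assumption, check StimulusMetadata's header for the real name):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Synthetic stand-in: 6 trials x 4 voxels, 3 images shown twice each.
responses = pd.DataFrame(rng.standard_normal((6, 4)))
stimdata = pd.DataFrame({"stimulus": ["cat", "dog", "cat", "car", "dog", "car"]})

# Average trials of the same image (rows: images, columns: voxels).
avg = responses.groupby(stimdata["stimulus"]).mean()
print(avg.shape)  # (3, 4)
```

With the real data, substitute the objects returned by loader.load_responses and the actual stimulus column.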

Analysis code and examples

See MRI/README.md for installation instructions and an overview of the analysis modules. Example notebooks live in MRI/notebooks/: