THINGS-data

A multimodal, large-scale collection for studying natural object representations in human brain and behavior.

All data are sampled from responses to the same stimulus set: the THINGS database of 1,854 object concepts and 26,107 natural images. THINGS-data links fMRI, MEG, and behavioral similarity judgments measured with these stimuli.

1,854
Object concepts
26k+
Naturalistic object images
4.7M
Behavioral similarity judgments
fMRI · MEG
Neuroimaging responses resolved in space and time

All data are openly available via Figshare, OpenNeuro, and OSF, with ready-to-use derivatives and analysis code.

Data components

Every component uses the same THINGS stimulus set (object concepts and images). Data are openly available: preprocessed fMRI and MEG derivatives via the Figshare collection, raw fMRI and MEG in BIDS format on OpenNeuro, and behavioral similarities and embeddings on OSF.

Behavior

Triplet odd-one-out similarity judgments (~4.7M) over THINGS images, plus a semantic embedding model capturing interpretable dimensions underlying perceived object similarity.
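In the triplet odd-one-out task, picking the odd object implicitly marks the remaining pair as most similar, so pairwise similarity can be estimated as the probability that a pair was chosen together when it appeared in a triplet. The sketch below illustrates that logic on toy data; the function name and triplet encoding are illustrative, not the OSF data format.

```python
import numpy as np

def triplet_similarity(triplets, n_objects):
    """Estimate pairwise similarity as the fraction of triplets in which
    a pair was implicitly chosen as most similar, out of all triplets in
    which that pair appeared together.

    Each triplet is (a, b, odd): 'odd' is the odd-one-out, so (a, b) is
    the chosen most-similar pair. (Illustrative encoding only.)
    """
    chosen = np.zeros((n_objects, n_objects))
    appeared = np.zeros((n_objects, n_objects))
    for a, b, odd in triplets:
        # all three pairs in the triplet appeared together
        for i, j in [(a, b), (a, odd), (b, odd)]:
            appeared[i, j] += 1
            appeared[j, i] += 1
        # only the non-odd pair was chosen as most similar
        chosen[a, b] += 1
        chosen[b, a] += 1
    sim = np.full((n_objects, n_objects), np.nan)
    mask = appeared > 0
    sim[mask] = chosen[mask] / appeared[mask]
    return sim

# Toy example with 4 hypothetical objects
triplets = [(0, 1, 2), (0, 1, 3), (0, 2, 1)]
sim = triplet_similarity(triplets, 4)
```

Pair (0, 1) appears in all three triplets and is chosen in two of them, so its estimated similarity is 2/3.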

How THINGS-data has been used

A few concrete example use cases from Teichmann, Contier, and colleagues.

1. Replicating seminal findings and integrating modalities across brain and behavior.

The main THINGS-data paper (Hebart, Contier, Teichmann et al., 2023) shows how to (i) replicate classic findings on the representation of object animacy and size in the human brain, (ii) link the behavioral embedding to cortical responses, and (iii) integrate object responses resolved in time and space.
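Linking a behavioral embedding to brain responses is commonly done with representational similarity analysis: build a model dissimilarity matrix from the embedding, a neural one from response patterns, and correlate their upper triangles. A minimal sketch on synthetic stand-in data (the embedding and neural patterns here are randomly generated, not THINGS-data):

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_objects, n_dims, n_voxels = 20, 10, 50

# Synthetic stand-ins: a behavioral embedding (objects x dimensions) and
# neural patterns (objects x voxels) that partly reflect its geometry
embedding = rng.normal(size=(n_objects, n_dims))
neural = embedding @ rng.normal(size=(n_dims, n_voxels)) \
         + rng.normal(size=(n_objects, n_voxels))

# Condensed representational dissimilarity matrices (upper triangles)
model_rdm = pdist(embedding, metric="correlation")
neural_rdm = pdist(neural, metric="correlation")

# Rank correlation between model and neural geometry
rho, _ = spearmanr(model_rdm, neural_rdm)
```

Spearman correlation is the usual choice here because it only assumes a monotonic relation between model and neural dissimilarities.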

How to cite THINGS-data

If you use THINGS-data in your work, please cite the main paper:

Hebart MN, Contier O, Teichmann L, Rockter AH, Zheng CY, Kidder A, Corriveau A, Vaziri-Pashkam M, Baker CI (2023). THINGS-data, a multimodal collection of large-scale datasets for investigating object representations in human brain and behavior. eLife 12:e82580. https://doi.org/10.7554/eLife.82580.

BibTeX:
@article{THINGSdata,
  title   = {THINGS-data, a multimodal collection of large-scale datasets for investigating object representations in human brain and behavior},
  author  = {Hebart, Martin N and Contier, Oliver and Teichmann, Lina and Rockter, Adam H and Zheng, Charles Y and Kidder, Alexis and Corriveau, Anna and Vaziri-Pashkam, Maryam and Baker, Chris I},
  journal = {eLife},
  volume  = {12},
  pages   = {e82580},
  year    = {2023},
  doi     = {10.7554/eLife.82580},
  url     = {https://doi.org/10.7554/eLife.82580}
}

Analysis code in this repository

This GitHub repository contains the analysis code used for the neuroimaging results in the THINGS-data paper. It focuses on fMRI and MEG; raw data live on OpenNeuro, and derivatives on Figshare. Source code: github.com/vicco-group/THINGS-data

MRI

The MRI/ folder contains Python modules, scripts, and notebooks for working with THINGS-fMRI.

See MRI/README.md and example notebooks in MRI/notebooks/ for usage.
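A typical first step with single-trial fMRI responses is aggregating them per object concept. The sketch below shows that pattern on synthetic data; the variable names and trial table are illustrative assumptions, not the repository's actual file layout (see MRI/README.md for that).

```python
import numpy as np
import pandas as pd

# Synthetic stand-ins: single-trial response estimates (trials x voxels)
# and a table mapping each trial to its THINGS concept (illustrative only)
rng = np.random.default_rng(0)
betas = rng.normal(size=(6, 4))  # 6 trials, 4 voxels
trials = pd.DataFrame({"concept": ["dog", "dog", "cat", "cat", "dog", "cat"]})

# Average single-trial responses within each concept
mean_responses = pd.DataFrame(betas).groupby(trials["concept"]).mean()
```

Concept-averaged patterns like `mean_responses` are what downstream analyses (e.g. RDMs over the 1,854 concepts) typically operate on.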

MEG & eyetracking

The MEG/ folder provides a stepwise pipeline for preprocessing, quality control, validation analyses, and decoding/RSA using MEG and eyetracking data.

  • Preprocessing of BIDS MEG data and head position checks
  • Validation analyses for animacy/size, MEG–fMRI combination, and noise ceilings
  • Pairwise decoding and representational similarity workflows
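The pairwise decoding step above amounts to cross-validated classification of two object conditions from sensor patterns at each timepoint. A minimal sketch for a single timepoint, using scikit-learn on synthetic MEG-like data (the data and effect size here are invented for illustration):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels = 40, 30

# Synthetic sensor patterns for two object conditions at one timepoint;
# condition B carries a mean shift so the classes are separable
cond_a = rng.normal(size=(n_trials, n_channels))
cond_b = rng.normal(size=(n_trials, n_channels)) + 0.8

X = np.vstack([cond_a, cond_b])
y = np.array([0] * n_trials + [1] * n_trials)

# 5-fold cross-validated pairwise decoding accuracy
acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
```

Repeating this for every pair of conditions and every timepoint yields a time-resolved decoding matrix, whose pairwise accuracies can also serve as dissimilarities for RSA.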

See MEG/README_MEG.txt for a description of the pipeline and dependencies.