THINGS-data, a multimodal collection of large-scale datasets for investigating object representations in human brain and behavior

Detailed Bibliography
Published in: eLife, Volume 12
Main Authors: Hebart, Martin N, Contier, Oliver, Teichmann, Lina, Rockter, Adam H, Zheng, Charles Y, Kidder, Alexis, Corriveau, Anna, Vaziri-Pashkam, Maryam, Baker, Chris I
Format: Journal Article
Language: English
Published: England: eLife Sciences Publications, Ltd, 27.02.2023
ISSN: 2050-084X
Online Access: Get full text
Description
Summary: Understanding object representations requires a broad, comprehensive sampling of the objects in our visual world with dense measurements of brain activity and behavior. Here, we present THINGS-data, a multimodal collection of large-scale neuroimaging and behavioral datasets in humans, comprising densely sampled functional MRI and magnetoencephalographic recordings, as well as 4.70 million similarity judgments in response to thousands of photographic images for up to 1,854 object concepts. THINGS-data is unique in its breadth of richly annotated objects, allowing for testing countless hypotheses at scale while assessing the reproducibility of previous findings. Beyond the unique insights promised by each individual dataset, the multimodality of THINGS-data allows combining datasets for a much broader view into object processing than previously possible. Our analyses demonstrate the high quality of the datasets and provide five examples of hypothesis-driven and data-driven applications. THINGS-data constitutes the core public release of the THINGS initiative ( https://things-initiative.org ) for bridging the gap between disciplines and the advancement of cognitive neuroscience.
Note: These authors contributed equally to this work.
DOI: 10.7554/eLife.82580