# Nothing compares to you - music genre perception similarities across modalities in humans and machines

This is the repository for the multimodal project "Nothing compares to you - music genre perception similarities across modalities in humans and machines", which focused on the perception of music genres and their similarities in both biological and artificial networks. In more detail, participants were presented with a diverse set of music excerpts from 20 different genres while undergoing fMRI or EEG, and additionally arranged the same excerpts in a multi-arrangement task. In addition to acoustic features extracted from the music excerpts, their cochleagrams were processed by a pre-trained DNN tasked with predicting the respective genre. Furthermore, a broad set of general, auditory, and music-related information was collected from the participants.
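As a rough, non-authoritative illustration of the stimulus processing described above, the sketch below extracts a few acoustic features and a cochleagram-like time-frequency representation from a single excerpt using librosa. The file name `excerpt.wav` is a placeholder, the mel spectrogram only approximates a proper (gammatone-based) cochleagram, and the project's actual feature set and DNN code live in the repositories linked below.

```python
# Hedged sketch, not the project's actual pipeline: extract basic acoustic
# features and a cochleagram-like representation from one music excerpt.
# "excerpt.wav" is a placeholder file name.
import numpy as np
import librosa

# Load a music excerpt (mono, resampled to librosa's default 22.05 kHz).
y, sr = librosa.load("excerpt.wav", duration=15.0)

# A few common acoustic descriptors, summarized over time.
features = {
    "mfcc_mean": librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1),
    "chroma_mean": librosa.feature.chroma_stft(y=y, sr=sr).mean(axis=1),
    "spectral_centroid_mean": float(librosa.feature.spectral_centroid(y=y, sr=sr).mean()),
    "zero_crossing_rate_mean": float(librosa.feature.zero_crossing_rate(y).mean()),
}

# Mel spectrogram in dB as a rough stand-in for a cochleagram
# (an actual cochleagram would use a gammatone filterbank).
cochleagram_like = librosa.power_to_db(
    librosa.feature.melspectrogram(y=y, sr=sr, n_mels=128)
)

# In the project, a representation like this is fed to a pre-trained DNN
# that predicts the excerpt's genre; the model itself is not shown here.
print({k: np.round(v, 3) for k, v in features.items()}, cochleagram_like.shape)
```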

The project comprised 5 distinct parts:

## The dataset

Following open and reproducible (neuro-)science practices, the entire dataset was made publicly available here and is described in detail here. The dataset is additionally version controlled through DataLad, and the corresponding analysis steps outlined in the dataset paper can be reproduced through the respective DataLad functionality. The steps are broadly summarized on the project's protocols.io page and the code is available here.
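Because the dataset and code locations are given via the links above rather than spelled out here, the following is only a hedged sketch of the typical DataLad workflow for such a dataset: clone it, fetch the file content you need, and re-execute recorded analysis steps. The URL and path below are placeholders, not the project's real locations.

```python
# Hedged sketch of a typical DataLad workflow; the dataset URL and the
# subject path are placeholders rather than the project's actual links.
import datalad.api as dl

# Clone the DataLad dataset without downloading large file content yet.
ds = dl.clone(
    source="https://example.org/path/to/music-genre-dataset",  # placeholder URL
    path="music-genre-dataset",
)

# Fetch the actual content of selected files or subdirectories on demand.
ds.get("sub-01")  # placeholder path

# Re-execute analysis steps that were recorded with `datalad run`,
# which is the mechanism that makes the dataset paper's analyses reproducible.
ds.rerun()
```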

## Spatial aspects using fMRI

## Temporal aspects using EEG

## Behavioral aspects using IMDS

## Computational aspects using DNN

Credit: Mirjam Schneider, Klara Brinkmann, Adina Wagner, JB Poline & Peer Herholz