Data-driven approaches to cognitive and computational auditory neuroscience


If the auditory system is one thing, it's wild, especially when compared to its visual and motor counterparts. In addition to tremendous inter-individual variability in structure and function, the auditory system displays a prominent stimulus and task dependency. And that is before even considering the various non-linear transformations that seemingly occur along the auditory pathway. Going down this rabbit hole, this line of our research is concerned with how large-scale, data-driven approaches can unravel the processes that drive and guide the transformation from sound waves to abstract representations. To this end, we investigate three main questions using a versatile set of complementary methods:

  1. Can machine learning and connectivity analyses shed light on functional principles of the auditory cortex and pathway? Is the processing of all perceived sounds bound to the same processing steps, and if not, when and why do they diverge?

  2. To what extent can we utilize multimodal data integration methods to test the structure-function relationship of brain regions involved in auditory processing? Is there a correlation between structural and functional markers, and is this relationship comparable across primate species?

  3. Is it possible to leverage naturalistic experiments and artificial neural networks to assess the computations underlying sound processing and test their generalizability? Can meta-analyses help us define commonalities and sources of variation?

Across all of the above, we make use of large-scale public datasets such as HCP and the UK Biobank to enable predictive models and assess their stability, and we collaborate with the Neuroscout and Neuromod teams.
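
To give a flavor of what "predictive models and assess their stability" can look like in practice, here is a minimal, hypothetical sketch using scikit-learn: repeated cross-validation of a ridge model, where the spread of scores across folds serves as a simple stability read-out. The data are synthetic stand-ins for features one might derive from HCP or UK Biobank; none of the names or settings reflect an actual analysis pipeline.

```python
# Minimal sketch: cross-validated predictive modeling with a stability read-out.
# The feature matrix is synthetic and merely stands in for structural/functional
# measures derived from large-scale datasets; all names are placeholders.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import RepeatedKFold, cross_val_score

rng = np.random.default_rng(0)
n_subjects, n_features = 200, 50
X = rng.normal(size=(n_subjects, n_features))       # stand-in brain features
y = 0.5 * X[:, 0] + rng.normal(size=n_subjects)     # stand-in behavioural target

model = make_pipeline(StandardScaler(), RidgeCV(alphas=np.logspace(-3, 3, 13)))

# Repeated k-fold cross-validation: the mean score estimates predictive
# performance, the standard deviation across folds/repeats its stability.
cv = RepeatedKFold(n_splits=5, n_repeats=10, random_state=0)
scores = cross_val_score(model, X, y, cv=cv, scoring="r2")
print(f"R^2 = {scores.mean():.2f} ± {scores.std():.2f}")
```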

People

Peer Herholz
Postdoctoral researcher