Synergistic information supports modality integration and flexible learning in neural networks solving multiple tasks

PLoS Comput Biol. 2024 Jun 3;20(6):e1012178. doi: 10.1371/journal.pcbi.1012178. eCollection 2024 Jun.

Abstract

Striking progress has been made in understanding cognition by analyzing how the brain is engaged in different modes of information processing. For instance, so-called synergistic information (information encoded by a set of neurons but not by any subset) plays a key role in areas of the human brain linked with complex cognition. However, two questions remain unanswered: (a) how and why a cognitive system can become highly synergistic; and (b) how informational states map onto artificial neural networks in various learning modes. Here we employ an information-decomposition framework to investigate neural networks performing cognitive tasks. Our results show that synergy increases as networks learn multiple diverse tasks, and that in tasks requiring integration of multiple sources, performance critically relies on synergistic neurons. Overall, our results suggest that synergy is used to combine information from multiple modalities, and more generally for flexible and efficient learning. These findings reveal new ways of investigating how and why learning systems employ specific information-processing strategies, and support the principle that the capacity for general-purpose learning critically relies on the system's information dynamics.
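The notion of synergy used in the abstract (information carried by a set of neurons jointly but not by any subset) can be illustrated with the standard XOR example from information decomposition. The Python sketch below is not drawn from the paper; the helper mutual_information and the variable names are illustrative assumptions. It uses a plain plug-in estimator to show that neither source alone tells us anything about the XOR target, while the pair of sources carries one full bit, so all of that bit is synergistic.

    import numpy as np
    from itertools import product
    from collections import Counter

    def mutual_information(xs, ys):
        """Plug-in estimate of I(X;Y) in bits from paired samples."""
        n = len(xs)
        px, py = Counter(xs), Counter(ys)
        pxy = Counter(zip(xs, ys))
        mi = 0.0
        for (x, y), c in pxy.items():
            p_joint = c / n
            mi += p_joint * np.log2(p_joint / ((px[x] / n) * (py[y] / n)))
        return mi

    # Classic XOR system: the target is determined only by the pair of
    # sources, not by either source alone -- a purely synergistic relation.
    samples = list(product([0, 1], repeat=2))   # uniform over (x1, x2)
    x1 = [a for a, _ in samples]
    x2 = [b for _, b in samples]
    y  = [a ^ b for a, b in samples]

    print("I(X1;Y)    =", mutual_information(x1, y))                   # 0 bits
    print("I(X2;Y)    =", mutual_information(x2, y))                   # 0 bits
    print("I(X1,X2;Y) =", mutual_information(list(zip(x1, x2)), y))    # 1 bit
    # The joint information exceeds the sum of the individual informations:
    # the single bit about Y is available only to the pair, i.e. it is synergistic.

In this toy case the whole bit is synergy; in the paper's analyses, information-decomposition methods are used to quantify how much of a network's task-relevant information falls into this synergistic component as it learns multiple tasks.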

MeSH terms

  • Brain* / physiology
  • Cognition* / physiology
  • Computational Biology
  • Humans
  • Learning* / physiology
  • Models, Neurological*
  • Neural Networks, Computer*
  • Neurons / physiology