Exploratory action

BrainGPT

Transforming Transformers into Cognitive Language Models

With the emergence of large-scale language models such as ChatGPT, the BrainGPT project stands at the forefront of research in Artificial Intelligence and Computational Neuroscience. While these models are remarkably capable, they do not reflect how our brain processes and learns language. BrainGPT takes up this challenge by developing models that are more faithful to human cognitive functioning, informed by recordings of brain activity during listening or reading. The ambition is to build more efficient models that rely less on intensive computation and massive volumes of data. BrainGPT will open new perspectives on our understanding of language and cognition.

Inria teams involved
MNEMOSYNE

Contacts

Xavier Hinaut

Scientific leader