Babies as young as two months old can categorise objects in their brains, scientists at Trinity College Dublin, Queen's University Belfast and Stanford University have discovered.
This is much earlier than previously thought. The finding comes from research assisted by the Coombe and Rotunda Hospitals in Dublin.
Brain imaging and artificial intelligence models were used in the study of 130 two-month-old infants.
Lying on a beanbag and wearing sound-cancelling headphones, the babies were shown bright, colourful images for 15-20 minutes.
The scientists then used functional MRI to measure how their brains responded to pictures from 12 common visual categories, such as a cat, bird, rubber duck, shopping trolley and tree.
Artificial intelligence models were then used to characterise how the babies’ brains represented the different visual categories.
The study has just been published in the journal Nature Neuroscience.
It was led by a team from Trinity College Institute of Neuroscience and the School of Psychology.
Dr Anna Truzzi, who is now based at Queen’s University Belfast, is a co-author on the paper.
'Study provides new foundational knowledge', researchers say
Her daughter Maeve took part in the study when she was just two months old.
"As a mother and a researcher, it is fascinating to find out more about what a very young baby can see and make sense of!" Dr Truzzi said.
"Until recently, we could not reliably measure how specific areas of the infant brain interpreted visual information.
"By combining AI and neuroimaging, our study offers a very unique insight, which helps us to understand much more about how babies learn in their first year of life.
"The first year is a period of rapid and intricate brain development.
"This study provides new foundational knowledge which will help guide early-years education, inform clinical support for neurodevelopmental conditions and inspire more biologically-grounded approaches in artificial intelligence."
Dr Cliona O’Doherty is the lead author on the study and conducted her research while in Trinity’s Cusack Lab. She is now based at Stanford University.
"Parents and scientists have long wondered what goes on in a baby’s mind and what they actually see when they view the world around them," she said
"This research highlights the richness of brain function in the first year of life."
"Although at two months, infants’ communication is limited by a lack of language and fine motor control, their minds were already not only representing how things look, but figuring out to which category they belonged.
"This shows that the foundations of visual cognition are already in place from very early on and much earlier than expected."
'Babies learn much more quickly than AI'
Rhodri Cusack, the Thomas Mitchell Professor of Cognitive Neuroscience at Trinity’s School of Psychology and Trinity College Institute of Neuroscience, was team leader on the project.
"This study represents the largest longitudinal study with functional magnetic resonance imaging of awake infants," she said.
"The rich dataset capturing brain activity opens up a whole new way to measure what babies are thinking at a very early age.
"It also highlights the potential for neuroimaging and computational models to be used as a diagnostic tool in very young infants.
"Babies learn much more quickly than today’s AI models and by studying how they do this, we hope to inspire a new generation of AI models that learn more efficiently, so reducing their economic and environmental costs."
Prof Eleanor Molloy, a neonatologist from Children’s Health Ireland and co-author, emphasised the potential opened up by the study’s high success rates for awake neuroimaging.
"There is a pressing need for greater understanding of how neurodevelopmental disorders change early brain development, and awake fMRI has considerable potential to address this."
The research was funded by the European Research Council and Research Ireland.