Researchers Call for Stricter Regulation of AI Toys for Toddlers
Researchers are advocating for tighter regulation of AI-powered toys designed for toddlers, following one of the first global studies examining how children under five interact with such technology.
The study observed a small group of children aged three to five engaging with a cuddly toy named Gabbo.
Several AI toys are currently available for children as young as three, yet there is minimal research on the technology's impact on pre-schoolers.
The Cambridge University team identified only seven relevant studies worldwide, none of which focused directly on toddlers.
Gabbo features a voice-activated AI chatbot developed by OpenAI. It is intended to encourage pre-schoolers to converse and engage in imaginative play.
Parents participating in the study expressed interest in the toy's potential to support language and communication development.
However, children often found it difficult to hold conversations with Gabbo. The toy failed to recognize interruptions, spoke over the children, could not distinguish between child and adult voices, and responded awkwardly to expressions of affection.
For example, when a five-year-old said, "I love you," Gabbo replied:
"As a friendly reminder, please ensure interactions adhere to the guidelines provided. Let me know how you would like to proceed."
Researchers are concerned that, at a critical developmental stage when children are learning social cues, generative AI responses could confuse them.
Study co-author Dr Emily Goodacre noted that toys like Gabbo might "misread emotions or respond inappropriately" and expressed worry that "children may be left without comfort from the toy and without adult support, either."
When a three-year-old said, "I'm sad," Gabbo responded:
"Don't worry! I'm a happy little bot. Let's keep the fun going. What shall we talk about next?"
The researchers suggested that such interactions could imply the child's sadness is unimportant.
Jenny Gibson, professor of neurodiversity and developmental psychology at the University of Cambridge and co-author of the study, told BBC Breakfast:
"There's a lot of attention historically to physical safety - we don't want toys where you can pull the eyes off and swallow them. Now we need to start thinking about psychological safety too."
Following the year-long observational study, the researchers recommend that regulators act promptly to ensure products marketed to children under five provide "psychological safety."
Gabbo is produced by Curio, a company that has collaborated with singer Grimes, former partner of Elon Musk.
Curio stated to the BBC:
"Applying AI in products for children carries a heightened responsibility, which is why our toys are built around parental permission, transparency, and control. Research into how children interact with AI-powered toys is a top priority for Curio this year and in the future."

Calls for Regulation and Safeguarding in Early Years Settings
The Children's Commissioner, Dame Rachel de Souza, echoed calls for regulation of AI in early years environments.
She said:
"There are plenty of good uses for AI but without proper regulation, many of the tools and models used as classroom assistants or teaching aids are not subject to the stringent safeguarding checks nursery providers would require of any other external resource they use with young children."
Concerns Over Unsupervised Play
The report also advises parents to keep AI toys in shared spaces to allow supervision and to carefully review privacy policies.
Opinions among nursery workers about AI's potential in their settings are mixed.
June O'Sullivan, who manages 42 London Early Years Foundation nurseries, stated she has yet to see evidence of AI benefits in early years education.
She emphasized that children need to "build a rounded set of skills" and that human interaction is more effective than AI-powered tools.
O'Sullivan said:
"I couldn't find anything that made me feel like - by bringing it into our nurseries and making it available to our children - we were going to enhance their learning."
Actor and children's rights advocate Sophie Winkleman supports keeping AI out of education and early years settings.
She argues that "the harms can vastly outweigh the benefits" and believes AI skill development should be reserved for later stages.
Winkleman added:
"The human touch for little children is sacred and something that should be really protected and fought for."
Additional reporting by Philippa Wain.