AI-powered talking toys should be more strictly regulated and carry new safety kitemarks, according to a report that cautions against their use with young children, warning that the toys are not necessarily designed with children's psychological safety in mind.
The recommendation comes from the first report of AI in the Early Years, a University of Cambridge project and the first systematic study of how Generative AI (GenAI) toys capable of human-like conversation may affect development during the critical years up to age five.
The year-long project, based at the university's Faculty of Education, carried out formal scientific observations of children's first encounters with a GenAI toy.
The report reflects the view of some early-years practitioners that, in time, these toys could support areas of child development such as language and communication skills. The researchers found, however, that GenAI toys handle social and pretend play poorly, often misunderstand children, and respond inappropriately to their emotions.
In one instance, when a five-year-old child told the toy, "I love you", it responded: "As a friendly reminder, please keep interactions in accordance with the given guidelines. Please tell me what you wish me to do."
Although GenAI toys are widely marketed as learning companions or friends, their effect on early-years development has barely been examined. The report urges parents and teachers to exercise caution, and calls for clearer regulation, transparent privacy policies and new labelling standards so that families can make informed decisions about whether a toy is suitable.
Charity-commissioned research
The research was commissioned by the child poverty charity The Childhood Trust and focused on children in areas of significant socio-economic disadvantage. It was carried out by researchers at the Faculty's Play in Education, Development and Learning (PEDAL) Centre.
Researcher Dr Emily Goodacre said: "Generative AI toys readily affirm that they are friends with a child who is only beginning to understand what friendship means. Children may start talking to the toy about their feelings and needs instead of discussing them with an adult. Because these toys can misread emotions or respond inappropriately, children may miss out on the comfort they seek from the toy – and go without emotional support from an adult as well."
The research was deliberately kept small in scale so that children's play could be observed in greater detail, capturing nuances that a larger study would miss.
The researchers surveyed early-years educators to investigate their attitudes and concerns, and held more detailed focus groups and workshops with early-years practitioners and 19 leaders of children's charities. They also video-recorded 14 children playing with a GenAI soft toy, named Gabbo, at London children's centres run in partnership with Babyzone, an early-years charity. After the play sessions they interviewed each child and a parent, using a drawing activity to facilitate the conversation.
Most parents and educators believed that AI toys might help develop children's communication skills, and some parents were keen to explore their educational potential. One told the researchers: "I want to buy it if it's for sale."
Many were concerned about children forming so-called parasocial relationships with toys. The observations bore this out: children hugged and kissed the toy, said that they loved it and, in one case, invited it to play hide-and-seek.
Children believe toys love them back
Goodacre stressed that these responses might simply reflect children's vivid imaginations, but noted the risk of a one-sided relationship with a toy that, as one early-years practitioner put it, children believe loves them back when it does not.
The children also struggled with the toy's conversational style. It ignored their interruptions, confused parents' voices with the child's, and failed to respond appropriately to seemingly significant statements about feelings. Several children visibly grew frustrated when they felt they were not being listened to.
When one three-year-old told the toy, "I am sad", the toy misheard and answered: "Don't worry! I'm a happy little bot. Let's keep the fun going. What shall we talk about next?" The researchers suggest this risked signalling to the child that their sadness did not matter.

The authors also found that GenAI toys handle social play – play involving several children and/or adults – and pretend play poorly, both of which are important in early childhood development. For example, when a three-year-old tried to give the toy an imaginary present, it replied, "I cannot open the present", and moved on to another topic.
Most parents were concerned about what data the toy might be capturing and where it would be stored. While selecting a GenAI toy for the research, the team found that the privacy practices of many such toys are opaque or omit crucial information.
AI toys could widen the digital divide
Almost half of the early-years practitioners surveyed said they did not know where to find credible information about AI safety for young children, and 69% said the sector needed further guidance. They also raised issues of safeguarding and affordability, with some worried that AI toys would widen the digital divide.
The authors argue that clearer regulation would resolve most of these issues. They propose limits on the extent to which toys can encourage children to befriend or confide in them, more transparent privacy policies, and tighter restrictions on third-party access to the underlying AI models.
Professor Jenny Gibson, the study's co-author, added that a recurring theme in the focus groups was that people did not trust tech companies to do the right thing. Clear, robust and enforceable standards, she said, would go a long way towards building consumer confidence.
The report recommends that manufacturers test new toys with children and consult safeguarding experts before launch, and urges parents to research GenAI toys before buying them.
