It is being embedded not only in social media but also in platforms serving education, health and other sectors.
There have been concerns about how these technologies affect children, and those concerns are justified.
In response, the United Nations Children’s Fund (UNICEF) has released updated guidance on AI and children, calling on governments and the private sector to put children’s rights at the centre of AI policies, design and deployment.
This guidance, which aims to raise awareness of how AI systems can affect children’s rights (either positively or negatively), is rooted in the United Nations Convention on the Rights of the Child (UNCRC). Children are not passive users of technology.
As such, their safety, dignity, development and participation must be protected in this digital age.
While UNICEF addresses governments and the private sector, parents, schools, and caregivers are the first line of protection and guidance for children in AI-shaped environments.
Children encounter AI long before they understand it, through tablets, phones, learning apps, social media feeds, search engines and even toys.
Adults, therefore, act as key mediators, advocates and educators.
Parents and caregivers are responsible for monitoring the safety of AI-powered tools children use at home and school.
This includes understanding age limits, content filters, recommendation systems, and potential risks such as exposure to harmful content, manipulation, or excessive screen engagement.
Schools share this responsibility by vetting digital learning platforms and ensuring they meet child-safety standards.
Every time a child signs up for an app, platform, or online classroom, their data is at stake.
Parents and schools must ask critical questions: What data is being collected? Who can access it? How long is it stored?
Caregivers also play a vital role in teaching children basic digital privacy habits, such as not oversharing personal information.
To ensure fairness in the use of AI tools in schools, parents and educators must encourage institutions to adopt inclusive technologies that serve all students without disadvantaging any.
These adults also have a duty to ensure transparency and understanding of AI tools.
Children should not be exposed to AI tools that will confuse or mislead them.
Parents and teachers should explain, in age-appropriate language, how to use AI without fear or blind acceptance of automated decisions.
Many parents have yet to understand that not every “smart” tool is good for a child.
This is why parents, caregivers and schools must evaluate whether the AI tools they employ truly support learning, well-being and healthy development, rather than simply increasing convenience or engagement.
The child’s best interest should always outweigh trends or commercial pressure.
In the face of these realities, digital literacy is now a life skill, not a luxury.
We must, therefore, prepare our children for an AI-driven world by teaching them critical thinking, encouraging questions about technology, and helping them understand the limits and risks of AI.
UNICEF’s final recommendation, creating a supportive environment, can be carried out through open conversations with children, clear boundaries, shared values, and effective collaboration among parents, teachers and caregivers.
When adults are informed and engaged, children are less vulnerable and more empowered.
Parents do not need to be highly tech-literate to protect their children.
They only need support, simple tools, and shared responsibility. Parents must be emotionally available and ready to demand safe systems.
The Writer is a child development expert and a Fellow of the Zero-To-Three Academy, USA. E-mail: nanaesi.gaisie@wellchildhaven.com