Should parents draft an AI policy for their homes? South African parents weigh in

Most toddlers know how to swipe before they can spell – reason enough for families to consider an AI policy for the home. Your seven-year-old has already watched their favourite content creator use AI. And your matric child? They are one generative prompt away from an essay they did not write.

Whenever I ask my five-year-old how she knows the concept she’s explaining to me, she tells me that she asked her Google. “The one with the sparkle thingy.” She’s referring to Gemini.

In early April 2026, a UK family reported being locked out of their Google accounts after a 14-year-old had an explicit conversation with Gemini Live AI.

Artificial intelligence has arrived in the family home just as we were blindly navigating a world ruled by social media, and parents don’t have any time to catch up. We’re long past questioning whether children in connected households across South Africa are using AI. They are. The question is whether families are being intentional about how.

‘I would never have called it an AI policy – but we have one’

Tinashe Venge, an award-winning writer, voice artist and father of a seven-year-old, is among the South African parents who have been building frameworks around AI use at home, even if they have never named them as such.

“Do I believe that a household should have an AI policy? Yes, I do,” Venge told The Citizen.

“Just because of the rapid rate at which AI has advanced and continues to advance, as well as the irresponsible use of AI that we have seen already in times gone by. There have been times where people have been duped or fooled by AI videos or AI images, and they believe that a certain person said or did something based on that.”

For Venge, the priority is media literacy – giving his daughter the tools to distinguish between AI-generated and real content.

“I would never have defined it as [an AI policy]. I would never have defined it so well, but my daughter knows what AI is. We’ve used AI together. We’ve used AI constructively, and I’ve shown her that you can use AI constructively.”

He notes, however, that peer influence is already shaping how children engage with the technology.

“She’s also seen some of the content creators that she follows using AI as well, which is an interesting development, because now she’s learning about AI from other sources as well. That has not yielded any negative results for now, but I am cautious as to what will happen in the future.”

His approach to AI policy, for now, is developmental rather than restrictive. “It’s been less of a policy and more of building an understanding of what this technology is, since it’s already around us.” But he is clear-eyed about the fact that the conversation will need to evolve. “If I had a teenager, I would probably be telling that teenager about the importance of not using AI to bully people or alter people’s body image.”

‘The space is changing so quickly’

Amanda ‘Mandy’ Ndlangisa, a copywriter, social media manager, content creator and mother of two, echoes this measured approach. Her household does not yet have a formal AI policy, but the thinking behind one is already in place.

“We don’t have a formal AI policy in our home just yet. The space is changing so quickly, and the girls are still quite young, so I’ve been leaning more into guiding them with simple principles rather than setting strict rules. Things like staying curious, asking questions, and understanding that AI is a tool, not something you just trust blindly,” said Ndlangisa.

However, with one daughter in matric, Ndlangisa has had to become more deliberate.

“It’s important to me that AI supports her learning, but doesn’t replace it. I really want her to still think for herself, do the work, and build that discipline, especially in such a big academic year.”

Her informal AI policy coalesces around four pillars: age-appropriate guidance, critical thinking, privacy protection, and a healthy balance between screen time and real life.

“At the end of the day, I think it’s about raising kids who are comfortable with technology, but still grounded in who they are, confident in their own thinking and creativity.”

What the experts say: Think structure, not panic

Education experts are aligned with both parents’ instincts. Lientjie Pelser, head of school at Teneo Online School (where students already integrate digital tools into daily schooling), says the goal is neither prohibition nor unchecked access.

“Parents worry that AI will either do the work for children or expose them to things they’re not ready for. The answer isn’t fear, it’s structure. If children know what AI is allowed for, what they should never share, and that they must still show their own thinking, it becomes a learning support tool rather than a shortcut,” said Pelser.

She believes in AI’s potential to address exam anxiety in under-resourced communities, an urgent and very South African problem.

New research from the school found that residents of Nelspruit search for exam stress-related terms at 2.9 times the per capita rate of Johannesburg, a pattern that maps onto provinces with fewer mental health services and lower internet penetration.

For most families, drafting an AI policy for the home needs to start now – before the technology outpaces the conversation entirely. Picture: iStock

“Exam stress rarely comes from the work itself. It comes from a learner sitting with a concept they don’t understand, at 9pm, with no one to ask. That window of isolation is where anxiety compounds. The good news is that parents can now close that window in ways that weren’t possible even two years ago,” she said.

She points to tools such as Google’s NotebookLM, which allows students to upload study material and interact with it conversationally.

“A parent in Nelspruit doesn’t need to understand the Grade 11 Chemistry syllabus to support their child through it. They need to know which tools exist and how to help their child use them.”

The global picture is moving fast

South Africa is not navigating this in isolation. In the United Kingdom, the Department for Education has announced plans to co-create “safe AI tutoring tools” with teachers and technology partners, with rollout expected by the end of 2027 and the potential to support up to 450 000 disadvantaged pupils annually.

Closer to home, South Africa’s Department of Science, Technology and Innovation has publicised the launch of IRIS, an AI-powered robot tutor designed to support teaching and learning in South African languages.

The message from both governments is the same: AI in education is moving into the mainstream, and families need practical frameworks, not panic.

10 rules for safer AI use at home

Teneo Online School’s academic team has developed a parent checklist as a practical starting point for households looking to formalise their approach. Their AI policy recommendations include:

  1. Start with one purpose. Decide what AI is permitted for first – explaining concepts, practising questions, summarising notes – rather than giving open-ended access.
  2. Agree on what your child should never share. No full name, address, school name, phone number, photos, or any identifying personal information.
  3. Teach the critical skill: AI can be wrong. Build a verification habit. This includes checking class notes, textbooks, or trusted sources.
  4. Make “show your understanding” non-negotiable. If a child cannot explain an AI-assisted answer in their own words, the tool has replaced learning rather than supported it.
  5. Separate learning help from answer writing. AI can assist with understanding and planning, but should not produce final work submitted as the child’s own.
  6. Keep AI use visible early on. Use it in shared spaces or ask children to show how they used it, not as surveillance, but to build healthy habits before they become automatic.
  7. Watch for the confidence trap. If a child reaches for AI the moment they feel uncertain, they may be losing resilience. Encourage a pause first.
  8. Set clear rules for sensitive topics. Children should know exactly what to do if content becomes inappropriate or distressing: stop and show an adult.
  9. Keep screen-time boundaries intact. AI can quietly expand screen time because it feels productive. Protect offline time deliberately.
  10. Review your rules after two weeks. Ask: Is this helping with learning? Is it increasing shortcuts? The best household policies are living documents.
