  • Is it a real friend or an AI Chatbot?

    Online Safety Information for Parents from Internet Matters

    Is your child chatting to a real friend, or to an AI chatbot?

    AI chatbots are built into platforms children use every day on their phones and other devices. Research from Internet Matters shows that two-thirds of children are using AI chatbots such as ChatGPT, Snapchat's My AI, Character.AI and others, and that they're turning to them for schoolwork, advice and even companionship*.

    Chatbots interact in a human-like way: they are always available, friendly and non-judgemental, and they use empathetic language, which can make children feel acknowledged and understood. However, this makes it harder for children to recognise that they're interacting with a tool rather than a real person. Chatbots are not real, and their responses cannot always be trusted.

    They're also often not designed with children in mind: most lack safety settings or parental controls. So if children ask for advice on sensitive topics, the lack of age checks and inconsistent filtering mean they may be presented with responses that are inappropriate for their age.

    To help you get to grips with what AI chatbots are, and for advice and tips on how to help your children use them safely, check out Internet Matters' new AI information hub: https://www.internetmatters.org/advice/by-activity/using-artificial-intelligence/

    Age checks are coming

    By 25th July, some online platforms will be required to have age checks in place to help give children a safer experience. These include Roblox, Discord, Fortnite and YouTube, amongst others.

    Internet Matters have partnered with VerifyMy to help you understand exactly what “age checks” mean, how they help, and to bust some common myths. Check out their new resource to learn more.