PLC Sydney

Little Talks, Big Impact: AI and Well-Being

We are pleased to present the second in our new parent education series - Little Talks, Big Impact - designed to support meaningful connections at home. Each edition provides gentle conversation prompts you can explore with your daughter: at the dinner table, on the way to school, or during a walk together. These prompts are grounded in the core themes of our College's well-being framework: being, becoming and belonging.

It is in the interest of the makers of AI that we keep clicking.

Recently, I caught up with two young people in their early twenties. Both had degrees, excellent jobs and a good network of friends. Both admitted to me that, during a period of recent emotional distress, they had turned to AI for insight into their own states of mind and for advice on how to proceed with a difficult relationship. My conversation with these two young people was mirrored by our Pamela Nutt Address speaker, Associate Professor Sarah Irving-Stonebraker, who told us of young people who increasingly report that they feel "disconnected". The advantage of reaching out to ChatGPT or the like is that the bot is constantly available, night or day, and provides instant, positive, affirming feedback at a moment of psychological or emotional distress. Dr Burgis recently met with NSW Police, whose statistics tell us that 72% of young people have used some sort of AI companion. In an uncanny way, the 2013 film Her - in which a man falls in love with a chatbot - predicted the strange situation in which we find ourselves in 2025.

And yet, if the smartphone revolution has taught us anything, it is this: what is supposed to connect us can, in the end, disconnect us. The famous line from All the President's Men is a useful one here: "follow the money." The imperative for smartphone and AI companies is an economic one; it is in the interest of the makers of AI that we keep clicking. A person feeling disconnected from other humans will be more likely to keep clicking, and an AI bot will be more likely to provide answers that increase dependency rather than decrease it. That 24/7 well-being advice, then, may in the end be a ruse. As reported recently in The Guardian, psychologist Sahra O'Doherty says that AI chatbots are not designed to bring therapeutic benefit, but to "mirror" our wants and needs: "What it is going to do is take you further down the rabbit hole, and that becomes incredibly dangerous when the person is already at risk and then seeking support from an AI."

Social media and girls

In a recent interview, Jonathan Haidt, author of The Anxious Generation, discussed the impact of social media on young girls, referencing insights from Sarah Wynn-Williams' book Careless People. A former executive at Meta, Wynn-Williams now shares her concerns about the deep, and often invisible, influence of social media companies on the self-worth of young users.

One alarming insight: if a girl deletes a number of selfies from Instagram, the algorithm detects a pattern of self-consciousness or self-loathing. Instead of offering supportive or affirming content, the algorithm responds by promoting beauty products. The implicit message becomes: "You're not beautiful. Let us help you become beautiful."

This troubling cycle, as Naomi Wolf observed decades ago, commercialises a girl's pain rather than addressing it as a social concern. Her vulnerability becomes a source of profit.

We are already seeing this play out locally. Mrs Watters (Head of Junior School) recently wrote about children as young as 10 asking for beauty products. The founder of skincare company Go-To has also commented on this phenomenon, highlighting a booming market driven by fear rather than self-expression.

What does this mean for the young people of today?

Because a teenager's brain is far more plastic than an adult's and also has an immature prefrontal cortex (the part of the brain that processes risk and consequence), our teenagers are at risk of replacing real human connection with artificial connection, thereby amplifying the disconnection that caused pain in the first place. AI add-ons have been part of social media for some time now. In Snapchat, it is called "MyAI". TikTok has "Tako" or "Genie". Instagram has an AI chatbot built into its DMs. ChatGPT's terms require users to be at least 13, which means many of our teenagers may well have the app installed on a phone. As adults come to terms with what AI might mean for our world, teenagers and young people are taking up its use in real time.

It is tempting to think that one way around this might be to limit distress in general, so that our girls have no need to reach for a bot for friendship and support. This will prove impossible. While we cannot (and should not) construct worlds for our children free of psychological distress or difficulty, we can teach them what to do when these big emotions inevitably surface. We can show them that all humans experience distress and difficulty, and then demonstrate to them wise ways of responding. In teaching and mentoring, we will establish the human connection whose absence drives "friendship" with AI in the first place.

Here are some ideas for navigating this new territory of AI and growing daughters:

  • Keep conversations with your daughter open. Use the car ride home - or what I have called the "teenage pram" - to chat with your daughter about her day. Teenagers often take the shared gaze straight ahead and the movement of the car as a sign that it is safe to talk. Let her lead, and offer gentle questions to prompt her to keep talking.
  • Explain to your daughter that she should always come to you or to our Well-Being staff at school if she is having trouble handling those "big emotions". We will work together to give her language and frameworks for those feelings.
  • Use light-hearted humour to show your daughter what to do with big emotions. Avoid sarcasm, but show her how humour helps to get a handle on emotions with no name. The films Inside Out and Inside Out 2 are fabulous on this point.
  • Check your daughter's devices for AI bots. If you find access to them, talk to her about AI - what it is, what it isn't - and the limitations of non-human connection.
  • Remind her of resources like Kids Helpline (1800 551 800) and Beyond Blue (1300 224 636), and explain how they differ from AI bots. Please refer to page 23 of the Senior Student Handbook and page 11 of the Junior Student Handbook for more suggestions on Who Can Help?
  • If necessary, arrange for your daughter to see an external psychologist to ensure she has human connection at this crucial time in her life.
  • Educate yourself on some of the more alarming "connections" with AI companions, so that you are aware of the invitations made to young people. The eSafety Commissioner has put together a 45-minute webinar, "Understanding AI Companions: What parents and carers need to know", on Thursday, 28 August. You can register on the eSafety Commissioner's website.
  • If you see anything alarming, please reach out to us or directly to the eSafety Commissioner.

For further support or information, please contact the Senior School Well-being Team via Ms Liz D'Arbon: edarbon@plc.nsw.edu.au

  • eSafety Commissioner website.
  • Docter, Pete, Director. Inside Out. Pixar Animation Studios, 2015.
  • Mann, Kelsey, Director. Inside Out 2. Pixar Animation Studios, 2024.
  • Jonze, Spike, Director. Her. Annapurna Pictures, 2013.
  • Taylor, Josh. The Guardian. Accessed 6 August 2025.

Dr Sarah Golsby-Smith

Head of Learning and Teaching at PLC Sydney

Sarah has taught in both government and independent schools, as well as in co-educational and single-sex schools - both girls' and boys'.