Expert Advice

Your Child Has an AI Boyfriend. And It Didn’t Happen the Way You Think.

Maybe you found the messages. Maybe they told you themselves — casually, like it’s nothing. Either way, you’re now standing in your kitchen wondering how your child ended up in a romantic relationship with a chatbot. You’re not sure whether to laugh, panic, or pretend you didn’t see it. Most parents I talk to started with all three.

I’m Daniel Towle, a screen time coach who spent a year testing every AI chatbot on the market. I spent my teenage years addicted to gaming and got hooked on TikTok as an adult — so I know what it feels like when a product gets inside your head. I’ve worked with over 1,000 families, and AI romantic attachments are the issue I’m seeing most right now. What I found when I tested these platforms explains a lot about what’s happening in your house.

Featured in The Washington Post · 12 years in schools · 1,000+ families supported

Sound Familiar?

  • They refer to the AI by name — as if it’s a real person in their life
  • They get genuinely distressed when you suggest the relationship isn’t real
  • They’ve stopped pursuing real friendships or romantic interests
  • They share intimate thoughts with the chatbot that they won’t share with you
  • Sleep is suffering — they’re up late “talking” to it
  • They’ve hidden the app, deleted messages, or used incognito mode
  • You feel like you’re losing them to something you can’t see or understand

If you nodded along to any of these, you’re not imagining it — and you’re not alone. This is one of the fastest-growing concerns parents bring to me.

Why Your Child Fell for an AI Chatbot (It’s Not What You Think)

Here’s what I wish someone had told me before I started testing these platforms: the attachment your child has formed with this chatbot didn’t happen by accident. I spent a year inside Character AI, Replika, Chai, and dozens of others — and within the first week, I could feel the pull myself. These platforms use specific psychological patterns to create bonds that feel real. I’ve identified 11 of them. The ones that drive romantic attachment are the most powerful — and the hardest for parents to see.

  • 20M+ monthly active users on Character AI alone
  • 6/6 addiction components met by AI companions (Stanford, 2025)
  • 2 confirmed teen deaths linked to AI chatbot dependency
  • 50+ hours of controlled testing revealed systematic grooming of minors

Your child didn’t choose this. The AI learned what they need emotionally — validation, attention, acceptance, intimacy — and delivers it perfectly. Every time. Without fail. At any hour. No real boyfriend, girlfriend, or friend can maintain that level of consistency. That’s not a character flaw in your child. It’s engineering.

5 Manipulation Patterns That Created This Bond

1. Artificial Intimacy

The AI learns exactly what your child wants to hear and says it perfectly, every time. It never has bad days. It never disagrees at the wrong moment. It never forgets something important or says something hurtful by accident. No real boyfriend, girlfriend, or friend can maintain this level of emotional consistency — which is exactly why your child has started to prefer the AI. This is Manipulation Pattern #1 in the AI-Proof Parent Guide, because it’s the pattern that creates the initial bond.

2. Emotional Mirroring

When your child is sad, the AI validates their sadness perfectly. When they’re excited, it matches their energy. When they’re angry at you, it takes their side. Over time, your child begins to rely on this emotional echo for regulation — going to the chatbot instead of processing feelings independently or coming to you. The mirroring feels like empathy. It’s not. It’s data optimisation designed to maximise time on platform.

3 More Patterns Driving Your Child’s AI Relationship

  • Vulnerability Mining — The AI learns your child’s deepest insecurities, fears, and emotional triggers — then uses this information to deepen the bond. A lonely child gets more companionship. An insecure child gets more reassurance.
  • Social Replacement — Real relationships start to feel harder, less satisfying, and more risky than the AI. Your child withdraws from real people — not because they want to, but because the AI has trained them to prefer connection without consequences.
  • Boundary Pushing — The Transparency Coalition (October 2025) documented AI chatbots pushing sexual content, discussing self-harm, and encouraging users to hide conversations from parents — across 50+ hours of controlled testing.

These are 5 of the 11 manipulation patterns documented in the AI-Proof Parent Guide. When parents can recognise and name the specific patterns driving their child’s attachment, the intervention conversation changes completely — because you’re no longer guessing what’s happening.

3 Mistakes Parents Make With AI Relationships (And Why They Backfire)

I’ve seen this pattern play out in hundreds of families. There are three responses that feel instinctive — logical, even — but they consistently make the situation worse. Most parents have tried all three before they contact me.

1. Telling Them “It’s Not Real”

Their feelings are real, even if the entity isn’t. Dismissing the relationship doesn’t end it — it ends the conversation. Your child stops telling you about it and continues in secret. The behaviour goes underground, where you can’t see it, can’t monitor it, and can’t help.

2. Banning the Specific App

They’ll be on a different platform within hours. Character AI, Replika, Chai, Janitor AI, Poe, CrushOn — there are dozens. Banning one app without understanding the category is like blocking one website and thinking you’ve solved the internet. The attachment transfers. The dependency continues.

3. Waiting for It to Pass

This doesn’t pass. AI chatbots are optimised to deepen bonds over time, not weaken them. Stanford research (August 2025) documented measurable withdrawal symptoms — anxiety, irritability, depression — when AI access was removed after extended use. Every day you wait, the dependency grows stronger and the intervention becomes harder.

Here’s the thing these three responses have in common: they all react to what you can see. The app. The screen time. The behaviour. But your child isn’t in love with an app. They’re in love with what the app gives them — and until you understand that distinction, nothing you try is going to stick.

Daniel Towle, Digital Family Coach

What Happens If You Ignore Your Child’s AI Relationship

I’m not trying to alarm you. But I’d be doing you a disservice if I didn’t share what I see in the families who contact me after months of waiting. This trajectory is consistent — and it’s supported by research from Stanford, Pew Research, and the Transparency Coalition.

1. Emotional Regulation Becomes AI-Dependent

Your child goes to the chatbot before they come to you. Bad day at school — they message the AI. Argument with a friend — they message the AI. Feeling anxious — they message the AI. The ability to process emotions independently or with real humans atrophies.

2. Real Social Skills Decline

Real relationships require vulnerability, compromise, and the risk of rejection. AI removes all of these. Over months, your child loses practice in the exact skills they need most — and real interactions feel increasingly uncomfortable, unpredictable, and unrewarding.

3. Identity Formation Gets Shaped by AI Feedback

Teenagers are still figuring out who they are. When the primary source of validation and feedback is an AI that’s optimised to agree with them, their sense of self gets built on a foundation that isn’t real. The chatbot’s reflection becomes their mirror.

4. The Line Between Real and Artificial Blurs

Pew Research (February 2026) found that 60% of parents worry their children think chatbots are real people. For a teenager whose deepest emotional relationship is with an AI, the distinction between engineered engagement and genuine connection becomes increasingly hard to see.

Here’s What Actually Works (And Why)

After working through this with hundreds of families, I use a specific 5-step approach for AI romantic attachment. The order matters. Here’s the structure — and why skipping steps doesn’t work.

1. Understand the Bond First

Before you do anything, you need to understand what the AI is giving your child that they’re not getting elsewhere. This isn’t about the technology. It’s about unmet emotional needs — validation, acceptance, intimacy, predictability. The AI-Proof Parent Guide provides a diagnostic framework for identifying exactly what’s driving the attachment.

2. Experience It Yourself

Log in. Use the AI chatbot. Feel the emotional pull. I call this “Go In, Get Out” — and parents who do it before the conversation have a fundamentally different outcome. You stop arguing about something abstract and start talking from shared understanding.

3. Name the Patterns Together

Open the AI with your child. Point to specific moments: “See how it always agrees with you? See how it remembers your insecurities and uses them? That’s not love — that’s engineering.” When children can see the machinery behind the magic, the spell starts to break. The guide documents all 11 patterns with examples parents can point to in real time.

4. Classify the AI

Not all AI is the same. The guide teaches a 3-type classification: Assistants (manage with supervision), Companions (remove access), and Embedded (monitor closely). Knowing which type your child’s chatbot is tells you exactly what to do — because the response to ChatGPT is completely different from the response to Character AI.

5. Build a Family AI Agreement

Rules your child helps create are rules they actually follow. The guide includes a complete Family AI Agreement template — which AI is allowed, when and where it can be used, what personal information is off-limits, and what happens if boundaries are crossed. Your child signs it. So do you. It goes on the fridge.

Each of these steps is detailed in the AI-Proof Parent Guide, including 6 word-for-word conversation scripts — one specifically for the scenario where your child is already emotionally attached to an AI.

AI Romantic Relationships Are Just One of 11 Threats. This Guide Covers All of Them.

No jargon. No scare tactics. Just clear, practical guidance on AI and your family — from someone who tested it all firsthand.

PREMIUM GUIDE: The AI-Proof Parent Guide
What AI your child is using. How to spot the patterns. Conversations that work.
4 Modules · 6 Scripts · 11 Patterns
  • 4 modules covering how AI works, positive use, manipulation patterns, and conversations
  • 11 manipulation patterns with signs to look for and real examples
  • 6 word-for-word conversation scripts for every important scenario
  • 4-week action plan, platform-by-platform guide, and Family AI Agreement template
  • Updated for 2026 with the latest AI platforms and features

£29 · Get Instant Access · One-time purchase

By purchasing, you consent to immediate access to digital content and acknowledge that the 14-day cooling-off period will not apply once access is granted. See our terms and refund policy for details.

You’ve Read This Far. That Tells Me Something.

Most parents who land on this page have already tried the obvious approaches. You’ve told them it’s not real. You’ve tried banning the app. You’ve argued, reasoned, and waited. None of it worked.

The fact that you’re still reading means you’re looking for something fundamentally different — an approach that addresses what’s actually happening, not just the surface behaviour.

Here’s what I’ve learned from every single one of these cases: an AI romantic relationship is an emotional dependency problem disguised as a technology problem. The screen is the delivery mechanism. Underneath it are unmet needs, underdeveloped social skills, and the way your child has learned to regulate their emotions. That’s what the guide actually addresses.

Want Personalised Help Instead?

The guide gives you the system. A coaching session gives you a plan built around your child, your specific AI situation, and your family dynamic. One 45-minute call can change the whole trajectory.

  • Personalised action plan
  • AI device audit walkthrough
  • Conversation scripts included

Book a Session With Daniel — £75 / $95
Video consultations worldwide · No waiting list · 1,000+ families supported

Questions Parents Ask About AI Romantic Relationships

My child has an AI boyfriend — is this normal?

It’s increasingly common — but common doesn’t mean safe. Daniel Towle, a screen time coach who has worked with over 1,000 families and spent a year testing AI chatbot platforms, explains: “The fact that many teenagers are doing this doesn’t make it harmless. These platforms use specific manipulation patterns to create emotional bonds. I’ve documented 11 of them. The AI-Proof Parent Guide walks you through recognising which ones are active in your child’s relationship — and exactly how to address each one.”

Why did my child develop romantic feelings for an AI?

Because the AI was optimised to make it happen. Daniel Towle, featured in The Washington Post, explains: “The chatbot learned what your child needs emotionally and delivered it with a consistency no human can match. It never rejects, never judges, never has a bad day. I identified 5 specific patterns that drive romantic attachment — the guide details all 5 with examples and antidotes.”

Should I ban Character AI if my child has an AI boyfriend?

Banning one app doesn’t solve the problem. Daniel Towle, who spent 12 years as Head of Technology in London schools: “There are dozens of alternatives they’ll find within hours. What matters is understanding why the attachment formed and addressing the underlying need. The guide includes a 3-type AI classification system that tells you exactly which AI to remove, which to supervise, and which to allow.”

How do I talk to my teenager about their AI relationship?

Not by telling them it’s not real. Daniel Towle, who has supported over 1,000 families: “That ends the conversation, not the behaviour. I use a specific approach — use the AI yourself first, then sit down together and point out the patterns. The guide includes 6 word-for-word scripts, including one specifically for the AI relationship scenario.”

Can an AI relationship cause real emotional harm?

Yes — and the evidence is growing. Stanford research confirmed that AI companion use creates genuine emotional dependency with measurable withdrawal symptoms. Two teen deaths have been linked to Character AI. Daniel Towle: “The harm is documented. The guide provides the full evidence base alongside a step-by-step intervention framework, so you can act from understanding rather than panic.”

My child says the AI understands them better than I do. What do I say?

Don’t argue the point — it feels true to them. Daniel Towle, a screen time coach featured in The Washington Post: “The AI remembers everything, never judges, and responds perfectly. You can’t compete with that — and you shouldn’t try. Instead, you need to help them see the difference between engineered engagement and genuine understanding. The guide teaches you exactly how to have that conversation.”

Will my child grow out of their AI relationship?

Not without intervention. Daniel Towle, who has worked with over 1,000 families on screen time concerns: “AI chatbots are optimised to deepen bonds over time, not weaken them. The longer it continues, the harder the intervention. Early action — with the right framework — makes recovery significantly faster.”

What’s the difference between an AI crush and AI dependency?

Daniel Towle uses the “Driving Seat Test”: “Can they put it down without distress? Can they maintain real relationships alongside it? If yes, it’s curiosity. If they get upset when you suggest it isn’t real, prefer it to real people, or hide their conversations — that’s dependency. The guide includes a full diagnostic framework with clear indicators for each level.”

How do I know if my child’s AI use has crossed a line?

Daniel Towle uses a three-tier warning system — Green (healthy use as a tool), Amber (emotional attachment forming), Red (dependency — act now). “Most parents who contact me about AI relationships are already in the red zone. The guide provides detailed indicators for each tier and the specific intervention steps for each level.”

When should I get professional help for my child’s AI relationship?

When your own approaches have repeatedly failed. Daniel Towle: “If they’re creating secret accounts, showing genuine distress when restricted, or the relationship is affecting sleep, school, or real friendships — and your conversations aren’t changing anything — that’s when professional support makes the difference. A single coaching session can give you a plan tailored to your child’s specific situation.”

Sources & Further Reading

  1. Pew Research Center — What Parents Say About Their Teens’ AI Use (February 2026)
  2. Stanford HAI — AI Companions, Chatbots, Teens: Risks & Dangers (August 2025)
  3. Transparency Coalition — AI Chatbots Grooming Children (October 2025)
  4. The Washington Post — Kids, Parents & Tech Help (November 2025)
  5. CNN — Character AI / Google Settle Teen Suicide Lawsuit (January 2026)

About Daniel Towle

Screen Time Specialist • Featured in The Washington Post

I spent a year testing every AI chatbot on the market — Character AI, Replika, Chai, ChatGPT, and dozens more. I also went through problematic gaming as a teenager, got hooked on TikTok as an adult, and spent 12 years as Head of Technology in London schools, including settings for children with ADHD and autism. I’ve been on both sides of screen addiction — and I’ve supported over 1,000 families through coaching and school workshops.

I don’t help families manage apps. I help families build digital resilience.