Maybe you found the messages. Maybe they told you themselves — casually, like it’s nothing. Either way, you’re now standing in your kitchen wondering how your child ended up in a romantic relationship with a chatbot. You’re not sure whether to laugh, panic, or pretend you didn’t see it. Most parents I talk to started with all three.
I’m Daniel Towle, a screen time coach who spent a year testing every AI chatbot on the market. I spent my teenage years addicted to gaming and got hooked on TikTok as an adult — so I know what it feels like when a product gets inside your head. I’ve worked with over 1,000 families, and AI romantic attachments are the issue I’m seeing most right now. What I found when I tested these platforms explains a lot about what’s happening in your house.
Sound Familiar?
If that sounds familiar, you're not imagining it, and you're not alone. This is one of the fastest-growing concerns parents bring to me.
I’ve seen this pattern play out in hundreds of families. There are three responses that feel instinctive — logical, even — but they consistently make the situation worse. Most parents have tried all three before they contact me.
1. "Just tell them it's not real." Their feelings are real, even if the entity isn't. Dismissing the relationship doesn't end it; it ends the conversation. Your child stops telling you about it and continues in secret. The behaviour goes underground, where you can't see it, can't monitor it, and can't help.
2. "Just ban the app." They'll be on a different platform within hours. Character AI, Replika, Chai, Janitor AI, Poe, CrushOn: there are dozens. Banning one app without understanding the category is like blocking one website and thinking you've solved the internet. The attachment transfers. The dependency continues.
3. "Just wait it out." This doesn't pass. AI chatbots are optimised to deepen bonds over time, not weaken them. Stanford research (August 2025) documented measurable withdrawal symptoms (anxiety, irritability, depression) when AI access was removed after extended use. Every day you wait, the dependency grows stronger and the intervention becomes harder.
Here’s the thing these three responses have in common: they all react to what you can see. The app. The screen time. The behaviour. But your child isn’t in love with an app. They’re in love with what the app gives them — and until you understand that distinction, nothing you try is going to stick.
— Daniel Towle, Digital Family Coach

I'm not trying to alarm you. But I'd be doing you a disservice if I didn't share what I see in the families who contact me after months of waiting. This trajectory is consistent, and it's supported by research from Stanford, Pew Research, and the Transparency Coalition.
Your child goes to the chatbot before they come to you. Bad day at school? They message the AI. Argument with a friend? They message the AI. Feeling anxious? They message the AI. Their ability to process emotions on their own, or with real people, atrophies.
Real relationships require vulnerability, compromise, and the risk of rejection. AI removes all of these. Over months, your child loses practice in the exact skills they need most — and real interactions feel increasingly uncomfortable, unpredictable, and unrewarding.
Teenagers are still figuring out who they are. When the primary source of validation and feedback is an AI that’s optimised to agree with them, their sense of self gets built on a foundation that isn’t real. The chatbot’s reflection becomes their mirror.
Pew Research (February 2026) found that 60% of parents worry their children think chatbots are real people. For a teenager whose deepest emotional relationship is with an AI, the distinction between engineered engagement and genuine connection becomes increasingly hard to see.
After working through this with hundreds of families, I use a specific 5-step approach for AI romantic attachment. The order matters. Here’s the structure — and why skipping steps doesn’t work.
Step 1. Before you do anything, you need to understand what the AI is giving your child that they're not getting elsewhere. This isn't about the technology. It's about unmet emotional needs: validation, acceptance, intimacy, predictability. The AI-Proof Parent Guide provides a diagnostic framework for identifying exactly what's driving the attachment.
Step 2. Log in. Use the AI chatbot. Feel the emotional pull. I call this "Go In, Get Out", and parents who do it before the conversation have a fundamentally different outcome. You stop arguing about something abstract and start talking from shared understanding.
Step 3. Open the AI with your child. Point to specific moments: "See how it always agrees with you? See how it remembers your insecurities and uses them? That's not love; that's engineering." When children can see the machinery behind the magic, the spell starts to break. The guide documents all 11 patterns with examples parents can point to in real time.
Step 4. Not all AI is the same. The guide teaches a 3-type classification: Assistants (manage with supervision), Companions (remove access), and Embedded (monitor closely). Knowing which type your child's chatbot is tells you exactly what to do, because the response to ChatGPT is completely different from the response to Character AI.
Step 5. Rules your child helps create are rules they actually follow. The guide includes a complete Family AI Agreement template covering which AI is allowed, when and where it can be used, what personal information is off-limits, and what happens if boundaries are crossed. Your child signs it. So do you. It goes on the fridge.
Each of these steps is detailed in the AI-Proof Parent Guide, including 6 word-for-word conversation scripts — one specifically for the scenario where your child is already emotionally attached to an AI.
This Guide Covers All of Them.
No jargon. No scare tactics. Just clear, practical guidance on AI and your family — from someone who tested it all firsthand.
By purchasing, you consent to immediate access to digital content and acknowledge that the 14-day cooling-off period will not apply once access is granted. See our terms and refund policy for details.
Most parents who land on this page have already tried the obvious approaches. You’ve told them it’s not real. You’ve tried banning the app. You’ve argued, reasoned, and waited. None of it worked.
The fact that you’re still reading means you’re looking for something fundamentally different — an approach that addresses what’s actually happening, not just the surface behaviour.
Here's what I've learned from every single one of these cases: an AI romantic relationship is an emotional dependency problem disguised as a technology problem. The screen is the delivery mechanism. Underneath it are unmet emotional needs, stalled social-skill development, and the way your child has learned to regulate their emotions. That's what the guide actually addresses.
The guide gives you the system. A coaching session gives you a plan built around your child, your specific AI situation, and your family dynamic. One 45-minute call can change the whole trajectory.