Digital Parenting

Meta and YouTube Found Liable: What Parents Should Do Now

You have been saying this for years. That the apps are designed to keep them hooked. That the controls do not work. That something about this is not right.

Yesterday, a jury agreed with you.

Diagnosed AuDHD · 12 Years in SEN Schools · Washington Post Featured
Quick Answer

On 25 March 2026, a California jury found Meta (Instagram) and YouTube liable for designing addictive products that harmed young people. Meta was also ordered to pay $375M in a separate child exploitation case. This is the first social media addiction trial to reach a verdict. It confirms what parents have been saying for years: these platforms are engineered for engagement, not safety.

Sound Familiar?

  • You have felt like your child's phone knows them better than you do
  • You have watched a 10-minute YouTube break turn into two hours
  • You have said "just put it down" and been met with rage, tears, or complete shutdown
  • You have wondered whether this is normal or whether something is genuinely wrong
  • You have Googled "is Instagram safe for kids" at midnight

    You are not imagining it. And as of yesterday, a jury agrees with you.

    What the Verdict Actually Says

    A jury in Los Angeles found that Meta (which owns Instagram) and Google (which owns YouTube) designed products that are defective and unreasonably dangerous for young users. The jury awarded over $6 million in damages to the plaintiff, a young woman who became addicted to the platforms as a teenager.

    In a separate case decided the same week, Meta was ordered to pay $375 million after a jury found it liable for child exploitation on its platforms.

    This is not a warning. It is not a recommendation. A jury of 12 people looked at the evidence and concluded: these products harmed children, and the companies knew.

    Over $381M in total damages awarded against Meta and YouTube in one week.

    Why This Matters for Your Family

    This verdict does not change what is on your child's phone today. Instagram will still have infinite scroll tomorrow morning. YouTube Shorts will still autoplay. Meta AI will still be embedded in WhatsApp.

    What it changes is the conversation. For years, parents have been told they just need to "set better boundaries" or "have a conversation about screen time." That framing puts the responsibility on you. This verdict puts it where it belongs: on the companies that designed these products.

    But here is the uncomfortable truth: knowing who is responsible does not fix the problem in your house tonight. The apps are still there. Your child is still using them. The question is the same as it was yesterday: what do you actually do?

    That is what the rest of this page is for.

    What This Means for Instagram

    Instagram was specifically named in the verdict. The jury found that its design features, including infinite scroll, push notifications, and algorithmic content recommendations, were defective when used by minors.

    If your child uses Instagram, this does not mean you need to delete it tonight. But it does mean the safety features Instagram offers are not enough. They were not designed to protect your child. They were designed to look like they protect your child.

    I have written a full breakdown of every Instagram risk and what to actually change in settings: Is Instagram Safe for Kids? What Parents Actually Need to Change.

    What This Means for YouTube and Shorts

    YouTube was also found liable. Specifically, the autoplay and recommendation systems that keep children watching were found to be defective product designs.

    Here is what most parents miss: your child might not have TikTok, but if they are watching YouTube Shorts, they are using the same infinite-scroll algorithm. Shorts was YouTube's response to TikTok. Same design. Same dopamine loop. Same problem.

    The built-in parental controls do not block Shorts effectively. I have tested every available setting and written up what actually works: YouTube Shorts: The Hidden Risk Inside YouTube.

    One click: that is all it takes to go from a supervised YouTube video to an unfiltered Shorts feed.

    What This Means for WhatsApp and Meta AI

    Meta AI is already inside your child's Instagram, WhatsApp, and Messenger. Your child did not install it. You cannot fully remove it. And this week's verdict makes clear that Meta's track record on child safety is not one that should inspire confidence.

    If your child uses any Meta platform, they are interacting with Meta AI whether they chose to or not. I have written a full guide on the 5 specific risks and what you can realistically do about them: Is Meta AI Safe? 5 Risks Already on Your Child's Phone.

    And if your child is using ChatGPT or other AI chatbots, the safety picture is different but equally important: Is ChatGPT Safe for Kids? What It Won't Tell You.

    5 Things to Do Today

    You do not need to confiscate every device in the house. But today is a good day to take one or two concrete steps. Here is where to start:

    1. Check what your child actually uses. Not what you think they use. Ask to see their screen time report (Settings > Screen Time on iPhone, Digital Wellbeing on Android). Look at which apps are getting the most hours. The answer often surprises parents.
    2. Turn off autoplay on YouTube. This is the single highest-impact change you can make in 30 seconds. YouTube Settings > Autoplay > Off. This alone breaks the infinite loop the jury found to be defective.
    3. Review Instagram notification settings. Every push notification is a hook designed to pull your child back in. Turn off everything except direct messages from real friends.
    4. Have the conversation, but lead with curiosity. Do not open with "I read that Instagram is bad for you." Open with "I saw this news story. What do you think about it?" Let them talk first. You will learn more.
    5. Check one safety guide relevant to your child. I have written detailed, practical guides for every major platform:
      • Instagram Safety: settings, risks, what to change
      • YouTube Shorts Safety: the hidden risk inside YouTube
      • Meta AI Safety: 5 risks already on their phone
      • ChatGPT Safety: what it won't tell you
      • TikTok Safety: controls, risks, alternatives
      • Fortnite Safety: why they can't stop playing

    Need Help With Screen Time?

    If today's verdict has you thinking about your family's relationship with screens, I can help. One conversation is often all it takes to build a plan that actually works.

    Video consultations worldwide · No waiting list · Personalised action plan included · 1,000+ families supported
    Book a Session With Daniel — £75 / $95

    Your Questions Answered

    Does this verdict mean Instagram and YouTube will change?

    Not immediately. Both companies will likely appeal, and regulatory change takes time. The product your child is using today will be the same product tomorrow. That is why parental action matters now, not later.

    Should I delete Instagram from my child's phone?

    That depends on your child's age and how they use it. For most families, adjusting settings and having an honest conversation works better than a sudden ban, which often leads to workarounds. Start with my Instagram safety guide for specific steps.

    Is YouTube Kids affected by this verdict?

    The verdict specifically relates to YouTube's main platform, not YouTube Kids. However, YouTube Kids has its own limitations, and many children aged 8+ migrate to the main app. The autoplay and Shorts features on the main app are the primary concern.

    What about TikTok? Was it included in this lawsuit?

    TikTok was not a defendant in this specific trial. However, separate lawsuits against TikTok are ongoing and use similar arguments about addictive design. My TikTok safety guide covers the current risks and settings.

    I am worried about my child's screen time. Where do I start?

    Start with a screen time check: look at your child's actual usage data, not what you assume. Then pick the one platform they use most and read the relevant safety guide on this site. If you want personalised help, you can book a session with me.

    Daniel Towle, Screen Time Coach

    About Daniel Towle

    Screen Time Specialist • Diagnosed AuDHD • Washington Post Featured

    Daniel Towle is a screen time specialist and diagnosed AuDHD adult with 12 years of experience in education. He personally understands the pull of gaming and social media — he felt it as a teenager and built his own system to manage it.

    He now helps families across the UK and beyond build practical, shame-free approaches to screen time that actually last.

    Daniel writes about every platform your child uses, with practical safety guides trusted by thousands of UK parents.