
A Jury Just Said What Many Parents Already Feel
Earlier this week, something happened in a Los Angeles courtroom that I think every family at Brookwood should know about.
A jury found that Meta (the company behind Instagram and Facebook) and YouTube harmed a young user through design features that were addictive and contributed to her mental health distress. The plaintiff, identified as K.G.M., began using social media at age 6. She described spending hours a day on Instagram starting at age 9, posting hundreds of photos using beauty filters to mask her insecurities, which she says led to body dysmorphia, anxiety, and depression.
The jury ordered Meta and YouTube to pay $3 million in compensatory damages, with Meta responsible for 70 percent, and is now deliberating whether to award additional punitive damages for malice or fraud. TikTok and Snapchat both settled before the trial began.
This is more than another tech headline. For the first time, a jury has looked at the internal workings of these platforms (the design choices, the internal documents, the testimony of executives) and concluded that the product itself caused harm. The product, not the content users posted.
Why This Matters for Our Families
For those of you who've been reading this newsletter over the past year, some of this will feel familiar. We've talked about the design choices behind digital distraction, how infinite scroll, notification systems, and algorithmic feeds are engineered to capture and hold attention. We've talked about what happens when kids develop their information habits in an environment where the signals can't be trusted. And we've talked about the difference between technology that serves our intentions and technology that shapes them.
What this verdict does is move that conversation from a concern to a legal finding. A jury of twelve reviewed the evidence and concluded that these features aren't neutral. They cause harm.
That doesn't mean we should panic, and it doesn't mean social media is going away. But it does sharpen the question every family is already wrestling with: How do we help our kids navigate platforms that are, by design, working against their ability to self-regulate?
A few things I'd encourage families to think about:
Look past the content to the product itself. It's natural to focus on what kids are seeing online, and that matters. But this case underscores something we've discussed before: the design features themselves (the scroll, the recommendations, the auto-play) are engineered to keep users engaged regardless of what's being shown. When we talk to kids about their phone habits, it's worth naming that. The pull they feel is a design feature, not a character flaw.
Kids are forming habits now that will shape how they interact with information for years. The habits kids develop at 8 or 11 or 13 are getting baked in during a uniquely challenging moment: whether they pause before sharing, whether they question what they see, whether they can put the phone down. Those patterns are being set right now. The earlier we have honest conversations about how these platforms work, the more agency kids have over their own attention.
This is new territory for all of us. I don't say that to let anyone off the hook, but because it's true. The courts are still figuring out the legal frameworks. Schools are still developing curricula. Parents are navigating decisions that didn't exist a generation ago. There's no playbook for this, and anyone claiming to have all the answers is oversimplifying.
This verdict reinforces why that work matters. If the courts are now saying these products are designed to be compulsive, then helping kids understand that design is foundational to their education.
Technology works best when it serves our intentions. That's been the thread running through everything we've discussed this year, and this week a jury agreed.
David Saunders
Director of Changemaking, Leadership, & Technology




















