- Natalie Boll
The new Netflix drama Adolescence struck a deep chord. Raw, emotional, and at times unsettling, the series peels back the curtain on what life is really like for children growing up online. For many adults, it was the first time they truly saw the digital world through the eyes of a child. And what they saw was alarming.
In just a few episodes, Adolescence reveals what researchers and young people have been saying for years. Social media platforms are not neutral spaces. They are environments engineered to influence behavior, shaped by systems that decide what each person sees based on invisible criteria. At the center of this experience is the recommendation algorithm.
Most platforms describe their algorithms in friendly, simplified terms. They say things like, “We show you more of what you love.” It sounds helpful, even thoughtful. But behind this idea is a system driven by data, engagement patterns, and predictive modeling. The goal is not to serve the user, but to keep them engaged for as long as possible.
Platforms like TikTok, Instagram, Facebook, and YouTube collect an enormous amount of data from the moment someone creates an account. This includes age, gender, device data, location, scroll behavior, pause time, and the activity of connected accounts. They also track off-platform behavior through cookies, pixels, and third-party integrations. In some cases, biometric and facial recognition data have also been used to refine user profiles (The New York Times, 2021).
Before a user has even liked a post or watched a video, the platform already begins shaping their feed. This is especially troubling when it comes to young users. Studies have shown that teen test accounts are quickly funneled into harmful content. According to research by the Center for Countering Digital Hate, girls were served self-harm and eating disorder-related content within minutes, while boys were shown violent and extremist videos in a similarly short time frame (CCDH, 2022). A report by the BBC also confirmed this trend, calling attention to how algorithmic recommendations target children with age-inappropriate and disturbing material (BBC, 2023).
The reason is simple. These algorithms are not built to support mental health. They are built to capture attention. And content that is shocking, emotional, or controversial often performs best.
Imagine a real-world analogy. You are driving and pass a car accident. Most people slow down to look. An algorithm observing that behavior would interpret the slowdown as interest and conclude that people want to see more car accidents. This logic is flawed enough in the physical world; online it is far more dangerous, because the consequence is a feed that promotes violence, self-harm, and hyper-stimulating content, especially to those most vulnerable.
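The slow-down-to-look logic can be made concrete. The sketch below is a deliberately simplified illustration, not any platform's actual code: the post fields, signal names, and weights are all hypothetical. It shows how ranking purely by predicted engagement pushes shocking material to the top simply because people linger on it.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    watch_time_s: float  # average seconds viewers spent on the post
    rewatches: float     # average replays per viewer
    shares: float        # average shares per viewer

def engagement_score(post: Post) -> float:
    # The objective is attention, not well-being: every signal that
    # predicts longer viewing raises the score, regardless of content.
    # The weights here are invented for illustration.
    return post.watch_time_s + 30 * post.rewatches + 60 * post.shares

feed = [
    Post("Homework study tips", watch_time_s=8, rewatches=0.1, shares=0.05),
    Post("Graphic car crash footage", watch_time_s=25, rewatches=0.9, shares=0.4),
]

# Sorting by predicted engagement ranks the shocking clip first,
# even though nobody ever asked to see it.
ranked = sorted(feed, key=engagement_score, reverse=True)
print([p.title for p in ranked])
```

Nothing in the score distinguishes "lingered because it was useful" from "lingered because it was disturbing"; both look identical to the system, which is exactly the failure the analogy describes.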
This isn’t just theoretical; it’s personal. My stepson once came across a video of someone taking their own life. It autoplayed on TikTok without him ever searching for it, and the clip directed viewers to watch the full version on X (formerly Twitter). He followed the link. He hadn’t asked for the video in the first place, but once he saw it, he couldn’t unsee it.
Despite platform policies that restrict underage users, internal documents leaked in the Facebook Files investigation revealed that children as young as nine were regularly using Meta platforms, often without meaningful safeguards in place (The Wall Street Journal, 2021). These platforms knew it, and little was done.
This is why we built Tribela.
After speaking with hundreds of teens, parents, and mental health professionals, we understood that the problem is not just the content, but the system that delivers it. At Tribela, we are building a different kind of recommendation system. One where users, not algorithms, control what they see. Our goal is to offer personalization without manipulation, exploration without harm, and connection without compromising safety.
There is no violent or explicit content. No autoplay loops designed to hook. No algorithm that serves you what everyone else in your demographic is watching. Instead, Tribela empowers users to shape their experience intentionally. We believe young people deserve a platform that respects their attention, their identity, and their well-being.
This summer, we will launch our closed beta. If you have a teen who wants to try a safer, more empowering platform, we invite you to join our waitlist. If you are part of a school, group, or organization interested in helping shape this new digital space, we welcome you to be part of it.
We believe that social media can be better. But only if we build it differently.
References
BBC (2023). ‘It stains your brain’: How social media algorithms show violence to boys. [online] Available at: https://www.bbc.com/news/articles/c4gdqzxypdzo [Accessed 23 Mar. 2025].
Center for Countering Digital Hate (2022). Deadly by Design: TikTok’s algorithm delivers harmful content to teens. [online] Available at: https://counterhate.com/research/deadly-by-design/ [Accessed 23 Mar. 2025].
The New York Times (2021). Meta to Shut Down Facial Recognition System. [online] Available at: https://www.nytimes.com/2021/11/02/technology/facebook-facial-recognition.html [Accessed 23 Mar. 2025].
The Wall Street Journal (2021). The Facebook Files. [online] Available at: https://www.wsj.com/articles/the-facebook-files-11631713039 [Accessed 23 Mar. 2025].