Who cleans the Internet? It’s a question rarely asked, and even less frequently answered. But every time a violent video disappears from YouTube or disturbing content never reaches our eyes, it’s because someone has already seen it for us. A real person. And that’s where a largely invisible human story begins: the story of content moderators.
A Necessary Yet Invisible Job
Content moderators are workers—often subcontracted by third-party firms rather than employed directly by Google—whose job is to review thousands of videos daily to ensure they comply with platform rules. Pornography, explicit violence, animal abuse, hate speech, fatal accidents, self-harm... It’s not unusual for a single shift to include several of these categories.
Even though machines and algorithms have improved at automatically filtering out inappropriate content, human judgment remains irreplaceable for the most sensitive cases. Is this clip documentary footage or a real execution? Was this fight consensual or an assault? Is this speech satire or incitement to hatred? Deciding that requires context, empathy, and, above all, watching the content directly and unfiltered.
Psychological Impact: Secondary Trauma Through Exposure
Recent studies and personal testimonies gathered by international media show that many YouTube moderators exhibit symptoms similar to post-traumatic stress disorder (PTSD). Even though they haven't lived through the events themselves, their brains react as if they had.
Clinical psychologist Jennifer Beckett, an expert in vicarious trauma, explains that the nervous system doesn’t fully distinguish between what is experienced firsthand and what is repeatedly witnessed in videos—especially without time to process the material or when working under performance pressure. The body reacts with tension, hyperarousal, insomnia, anxiety, and in some cases, emotional numbing or dissociation.
Moreover, the repetitive and unpredictable nature of the job creates a kind of “digital hypervigilance.” Moderators live in a constant state of alert, bracing for the next disturbing clip, never knowing which image might leave a lasting emotional scar.
The Wear and Tear of Visual Stress
This experience also leads to what’s known as chronic visual stress—a specific type of sensory overload caused by daily exposure to chaotic, disturbing, or emotionally intense images. Vision, our dominant sense, becomes a gateway for experiences that overwhelm the brain’s ability to process emotional input.
Images of mutilated bodies, screams of despair, faces distorted by fear or pain… These aren’t just data. They’re emotional triggers that affect the limbic system, the part of the brain that processes emotions. The result can be physical responses: rapid heartbeat, sweating, nausea, a sense of unreality. Without space to process these impressions, mental health begins to suffer.
The Algorithm’s Silence
The paradox is clear: the more effective the moderators are, the less aware we are of the kind of content that circulates online. We don't see the blood, the abuse, the insults, because someone else already did. This creates a form of unseen violence, in which those who protect us from the dark side of the internet are themselves rendered invisible.
Some platforms, under public pressure and legal challenges, have started to implement mental health support programs. But many workers report that these measures come too late—or that there’s no real culture of care. Often, they face stigma: “How can you be stressed if you’re just watching videos?” The lack of understanding about secondary trauma and visual stress adds to the isolation moderators feel.
We Need a Different Lens
This raises an ethical and technological dilemma: how do we keep the internet clean without harming the people who do the cleaning? Could we imagine a more humane approach to moderation, with better working conditions, content rotation, mental health support, and clear limits on exposure?
This is also a chance for users to reflect: to develop critical awareness about what we consume. Behind every deleted video is someone who had to see it so we wouldn’t have to. Acknowledging that reality is not just an act of empathy—it’s a step toward a more just digital ecosystem.
Caring for the Caregivers
In the end, content moderators don’t just filter videos—they filter human suffering so it doesn’t hit us all at once. They are the invisible guardians of our experience on platforms like YouTube, the first to witness horror and the last to be acknowledged.
Maybe now is the time to look at them, too. To make their work visible. To create emotional decompression spaces, ethical protocols, and work environments where no one has to choose between their mental health and their job.
Because caring for digital health also means caring for those who protect it—with their own eyes.