Filipinos are the secret workforce that filter vaginas and violence from your newsfeed
Hate speech, pornography, and political incorrectness are common on the Internet. Sometimes you read comments or see pictures that make you feel hopeless about humanity.
Amid all the hate and anger, there are also people whose job it is to scrub the internet clean of such filth. Some social networking sites don’t do the dirty work of keeping our news feeds SFW themselves. Instead, they outsource the labor to—where else—the Philippines.
It’s not just customer service: US-based tech companies also depend on teams of Filipinos to moderate content on the internet, from flagging offensive speech to removing graphic media such as videos of beheadings.
Wired reporter Adrian Chen talked to Michael Baybayan, a former employee of the content moderation firm TaskUs. Chen dissects the content moderation industry at length, and it’s a truly engaging, must-read piece.
The Philippine office of TaskUs was first based in Bacoor but has now moved to Taguig. Baybayan and his co-workers screened content from Whisper, a mobile app where people can confess their secrets anonymously. Via Wired:
I was given a look at the Whisper moderation process because Michael Heyward, Whisper’s CEO, sees moderation as an integral feature and a key selling point of his app. Whisper practices “active moderation,” an especially labor-intensive process in which every single post is screened in real time; many other companies moderate content only if it’s been flagged as objectionable by users, which is known as reactive moderating. “The type of space we’re trying to create with anonymity is one where we’re asking users to put themselves out there and feel vulnerable,” he tells me. “Once the toothpaste is out of the tube, it’s tough to put it back in.”
Content moderation jobs are growing in the country because of our close cultural ties with the West: Filipinos can easily tell what Americans may find offensive.
The main job of Baybayan and his co-workers was to make sure that nothing graphic made it online:
If the space does not resemble a typical startup’s office, the image on Baybayan’s screen does not resemble typical startup work: It appears to show a super-close-up photo of a two-pronged dildo wedged in a vagina. I say appears because I can barely begin to make sense of the image, a baseball-card-sized abstraction of flesh and translucent pink plastic, before he disappears it with a casual flick of his mouse.
Content moderators are exposed to hours of pornography, animal crush videos, and other graphic images. It is an obviously physically and emotionally draining job, one that eventually leads to burnout, or to something much worse, such as paranoia and trauma:
In Manila, I meet Denise (not her real name), a psychologist who consults for two content-moderation firms in the Philippines. “It’s like PTSD,” she tells me as we sit in her office above one of the city’s perpetually snarled freeways. “There is a memory trace in their mind.” … But even with the best counseling, staring into the heart of human darkness exacts a toll. Workers quit because they feel desensitized by the hours of pornography they watch each day and no longer want to be intimate with their spouses. Others report a supercharged sex drive. “How would you feel watching pornography for eight hours a day, every day?” Denise says. “How long can you take that?”
Before they are hired, applicants must pass a series of interviews and psychological tests to make sure they can handle the demands of the work. On the job, they are also entitled to counseling, but not everyone has access to it:
(Former YouTube content moderator) Rob began to dwell on the videos outside of work. He became withdrawn and testy. YouTube employs counselors whom moderators can theoretically talk to, but Rob had no idea how to access them. He didn’t know anyone who had. Instead, he self-medicated. He began drinking more and gained weight.
Wired also talked to Maria, a quality-assurance representative whose job is to double-check her team’s work in flagging content:
Maria is especially haunted by one video that came across her queue soon after she started the job. “There’s this lady,” she says, dropping her voice. “Probably in the age of 15 to 18, I don’t know. She looks like a minor. There’s this bald guy putting his head to the lady’s vagina. The lady is blindfolded, handcuffed, screaming and crying.” … “I don’t know if I can forget it,” she says. “I watched that a long time ago, but it’s like I just watched it yesterday.”
In the long run, content moderators become desensitized to graphic content. But seeing so much gruesomeness takes a real toll on their psychological health.
Read the entire piece here.