When you have to see the darkest things on the internet all day so no one else needs to

What does a moderator do? How can you cope with constantly having to confront horrors at work in order to protect other people?

It feels natural that when we browse our favorite social sites, we are not confronted with violent images, hateful videos, or adult content. All of this is thanks to moderators working in the background, who act as the Internet’s police officers, sacrificing their mental health to make the online space safe for us. Until now, however, we have known little about their work.


Who are the “cleaners”?

Within the framework of the Budapest International Documentary Film Festival, the 2018 documentary “The Cleaners” was also shown in Hungary. The film reveals who controls the content uploaded to social media, and how. Part of this work is now done by algorithms, but automated content filtering is not always accurate, so the big companies also need human workers to make users feel safe on social media platforms.

From the film by Hans Block and Moritz Riesewieck, we learn that Facebook employs nearly fifteen thousand moderators, the vast majority of whom live in remote, poor countries.


The filmmakers moved to Manila, Philippines, for six months to gain insight into the work of local moderators, as much of the content filtering is done there.

Hans Block and Moritz Riesewieck revealed in an interview that it was not easy for them to gain the trust of the interviewees, as the moderators are required to maintain strict confidentiality in the course of their work. Their employers, Facebook, Instagram, Google and other companies, keep this activity so secret that code words are used to identify the companies. Moderators accordingly work in unmarked offices in Manila’s business district, and when asked, they are supposed to answer that they are employees of the “Honeybadger Project”, the local alias of Facebook. If they revealed which media company they were actually working for, they could expect harsh sanctions (fines, threats). Some companies also employ security staff to monitor their employees and make sure the confidentiality clauses of the contract are respected.


Not all the moderators in the film reveal their identities, as some were still working in their jobs at the time of filming. Turnover is huge, however, because the mental burden, the responsibility and the burnout are completely disproportionate to the salary. Nevertheless, many people choose this profession for the money, as there is no better alternative.

An average moderator works eight hours a day, during which they have to check 25,000 pictures and videos. According to the film, this gives them about eight seconds to make a decision about an image or a video, with only two choices: delete the content or allow free access to it.


Moderators work alone, so in dilemma situations they cannot seek the advice of their peers or their supervisor. The responsibility is huge, because, as the film says, a bad decision can cost lives. Often it is not just a few human lives at stake: a single post can start a war, trigger harassment, or lead to crime.

Doing the dirty work

At the beginning of the documentary, several moderators summarize the essence of their work. In general, their job is to make the Internet safer for users. They seek to create and maintain a virtual space where people can enjoy online content without feeling threatened, where, for example, they do not have to worry about encountering images of executions, terrorist slogans, or pornographic videos while browsing their feed. As one moderator puts it, without them the Internet would be in chaos. In effect, they are the police of the Internet, protecting its users. Their job is to delete images, text, and videos that violate the platforms’ rules of use. They take action against terrorism, cyberbullying and child abuse. One moderator simply describes their work as “cleaning up the filth.” Much of the content awaiting screening comes from Europe and the United States, so it is not really an exaggeration to say that they are sorting the waste of foreign countries whose cultural, social and political background they sometimes know very little about. They need to be highly focused in their work to make a good decision about the fate of an image or video in the eight seconds available.

It may be surprising how strong a sense of mission they bring to their work despite the unfavorable working conditions and lack of recognition. Almost every moderator who speaks in the film says that they value their work and feel responsible for providing a “healthy” environment for Internet users. The film also features a young woman who recounts how she became a moderator. In the background, we see children and adults rummaging through huge piles of rubbish in the street. She says her parents were scared that if she did not study well, she would end up doing the same. So she was a good student, but wages in her country are very low. Working as a moderator can be a way out of deep poverty.


In fact, the amount of money they receive for their work is extremely low by European or American standards, which is one of the main reasons these companies rely on labor from developing countries. There, they can always find people who embark on this self-sacrificing mission in the hope of a better future.

Psychological harms

The film states that each moderator is affected differently by what they see. Some are unable to erase disturbing sequences of images from their memory, while others manage to divert their attention after their working hours are over and focus on positive things. One moderator says at this point that the work destroys them mentally, because it instills in their brains the idea that violence is a normal, ordinary thing. After a while, they get used to the sight of bomb attacks in which arms and legs fly through the air, as if killing people were a natural part of life. A moderator may choose not to view an image or a video, but this shows up as an error in the system during quality control, and they are allowed only three such errors a month. Team leaders check only three percent of the moderators’ work.

As the film points out, because of their technology profile, the large media companies primarily employ IT professionals, engineers and technicians, whose main activities are content creation, user experience testing, and marketing. If there is a problem with the content on their platforms, their preferred solution is an algorithm that filters content according to the regulations; having to admit that this is not enough already makes them uncomfortable. The film shows excerpts from hearings in which spokespersons of major media companies are asked about their moderation principles. These excerpts show that the companies are reluctant to talk about the cleaning work outsourced to Southeast Asia, and that responsibility is shifted onto the contracted firms. The moderators’ mental health also receives too little attention: they are only able to use the psychological help provided by the company once every three months. As many of them note, they are sacrificing their mental health to make the Internet safer for others, while for them it is cyberspace itself that poses the greatest threat.


In the film, a young woman recounts that when she was confronted on her first day with what she could expect in the course of her work, she wanted to quit immediately. She recalls that a recording of child molestation shook her so much that she told her boss she could not take it anymore. Her superior replied that she could not quit, since the contract had been signed.

A middle-aged man says they must not let the videos affect them. He recounts that one of his colleagues did not come to work one day. He began to worry about him, so he went to his apartment, where he found him dead: he had hanged himself. He sees the case as a harmful effect of their work, adding that his colleague who took his own life had worked as a moderator specializing in self-destructive behavior (e.g., self-harm, suicide). He had watched thousands of such pictures and videos every day. The company he worked for kept the case strictly secret.

The pressure of dilemma situations

Often, moderators can only really rely on their own principles and judgment when deciding whether a piece of content is in line with community standards. The film features the painter Illma Gore, whose nude portrait of U.S. President Donald Trump stirred up a big storm. The painting received three million shares on one social site within a short time, but was soon removed. One moderator explains that the work of art was deleted not primarily because of its sexual content, but because it is disparaging of the president; this is also indicated by the fact that, shortly after the picture appeared, Trump’s physical endowments became the focus of press attention. Illma Gore, however, disagreed with the moderators’ decision. According to the artist, a painting is not equivalent to a photograph, as it does not show reality but the artist’s point of view. In this sense the painting depicts the human body, so in Gore’s view it is neither sexual nor violent in a way that would justify its deletion.

A war photo showing fleeing children also presented the moderators with a dilemma. One moderator says the image may well be one of the iconic photographs of the Vietnam War, but one of the little girls in it is naked, so under the rules they had to delete it. They did the same with a picture of a child victim of a devastating tsunami, which a German artist later reworked and uploaded to the social site to share with the public.


Moderators also face no easy task when they have to decide about ambiguous political situations or images of war. Their work can be crucial for the inhabitants of a distant country unknown to them. Recordings of terrorist actions may not be posted on social media sites, but how do they know whether the fighters rushing in with guns are terrorists or civilians fighting for their freedom? Because of such hard-to-judge situations, moderation would require detailed political and cultural knowledge, but most moderators are young, uninformed about the historical background and current political situation of Western countries, and have only a few seconds to decide whether a war image is news reporting or an act of terrorism. At the beginning of their work, they therefore have to memorize the main terrorist slogans and flags. Recordings of terrorist offenses cannot be posted on social media sites, but they are not automatically deleted either, as they are important documentation of the method of the crime, the victims and the damage caused. These recordings are handed over to the authorities, for whom they are of great help: they give a more accurate picture of what happened and also provide important information about the location.

The current political situation of some countries can also significantly affect moderation principles. Turkey, for example, applies strict censorship, so footage of the burning of the Turkish flag or cartoons depicting President Erdogan are deleted immediately, even though sharing such content does not violate the community standards. According to the Turkish authorities, however, it does, so users are asked to delete the content, and the government has informed the social network operators that this is an illegal act in their country. In such cases, the companies can decide for themselves whether they consider sharing this content legal or illegal, but most of the time they give in to political pressure, since refusing would cost them their users. This is why IP address filtering was recently introduced: it prevents Turkish Internet users from seeing anti-government content, while keeping it visible to users of other nationalities.

Is there a way out?

Overall, we can see that moderators are under enormous pressure. Inexperienced young people who know little about the cultural values and political views of the Western world do the cleaning, all in secret, alone, and under terrible mental strain. They bear great responsibility, yet they cannot count on the supervision and support of their employers, or, because of their duty of confidentiality, on the sympathy of their families, and their workplace rarely provides them with psychological help. Radical changes to their working conditions would be needed, but they are in an extremely vulnerable position, so their options are limited. By drawing attention to them, we hope to sensitize not only the users of social networking sites to the problems of moderators, but also decision-makers.


The article was written by psychologist Ágnes Zsila, whose main area of interest is the intersection of media psychology and pop culture.

Extra Life is SamaGame’s social responsibility column, which, with the help of professionals, aims to help young people and parents talk about issues that may affect them.