The sordid underworld of social media – exposed

By MiNDFOOD

A new documentary, ‘The Cleaners’, explores the dark underworld of social media and how censorship affects what we see and feel.

MiNDFOOD spoke with co-directors Hans Block and Moritz Riesewieck about their shocking discovery of how explicit content is filtered out of your Facebook news feed by ‘cleaners’.

What research went into making the film?

We really wanted to get in contact with the workers doing this kind of job, because for us the work is interesting on several levels. On the one hand, we imagined the job to be extremely stressful; on the other, these young people determine what can and cannot be seen in our digital world. But getting in touch with the workers in Manila was the biggest challenge of the whole filmmaking process. It is a very secretive, hidden industry.

The companies do everything to keep the work secret. They use codewords to disguise which clients they are working for: Facebook, for example, is called the “Honeybadger Project”. Workers are never allowed to say that they work for Facebook; whenever they are asked, they have to say they are working for the “Honeybadger Project”. The workers must keep their job secret, otherwise they are sued. There are private security firms that pressure workers not to talk to strangers, monitor their social media accounts and impose all kinds of reprisals.

It took us quite a while to get in touch with the workers, and we collected all the information we could to build a bigger picture of what is going on there. But when we eventually made contact, in collaboration with a network of locals, we were surprised at how proud many of them were of this job. They told us: without us, social media would be a complete mess. At the same time, many of them silently suffer from the impact the work has on their mental health. After a very long period of research, in which we built up a relationship of trust with the workers, we worked together with the protagonists of the film to figure out how to make a film about this work.

How do these ‘cleaners’ deal with the emotional effects of seeing and processing this harmful content?

The symptoms content moderators often face as a result of their daily work are similar to the post-traumatic stress disorder suffered by soldiers coming back from war. Is it any wonder that content moderators who see rape videos and other kinds of sexual violence for 8-10 hours per day are no longer interested in sex at all when they come home at the end of the day? Is it any wonder that content moderators who have seen thousands of beheadings can no longer trust other humans, lose all their social relationships, and develop sleeping or eating disorders? Is it any wonder that there is a seriously increased suicide rate among content moderators who have to handle all kinds of self-harm videos? On top of that, it’s forbidden for the young workers to talk about what they see and experience at work – even to their friends and families. Psychologists from both Manila and Berlin told us that is the worst part of all: when traumatized humans are not allowed to verbalize their horrible experiences.

Why do you think this is outsourced to developing countries?

Around 90% of people in the Philippines are strictly Catholic, and sacrificing oneself for a good cause is part of the culture. Because of this, some workers feel like Jesus, who also sacrificed himself for the sins of the world. This is a strong narrative that shapes the workers. And then there is an ideological framework for the moderators to appeal to. For a while now, a new president has been in office: Rodrigo Duterte. He stands for cleaning up society: social cleansing. By that, he means that all criminals and drug addicts should be made invisible. In reality, that means tens of thousands of people have been killed. All problems are simply made invisible instead of being resolved.

It’s also about the ideological parallels, for instance between a policy of social cleansing that has become socially acceptable again worldwide and the mandate of the content moderators to keep the platforms “healthy”. How much space is left for grey areas, for otherness and minorities, when many of the content moderators pursue their job with missionary zeal and aim to battle all that is “sinful” in this world? Within a few seconds a post is either accepted (“ignore”) or removed (“delete”). In cases of doubt, their gut feeling often decides. Only a very small percentage of the moderators’ decisions is double-checked by their supervisors. “Don‘t overthink” is one of the first rules every content moderator learns. It becomes obvious why content disappears on a regular basis.

This ideology is gaining consent all around the globe, analog and digital, and it’s our duty to stop it before it’s too late. We can no longer afford the convenience of outsourcing every form of responsibility. The question of democracy and freedom of speech must not be reduced to two options: delete or ignore.

What do you hope to achieve with this documentary?

Almost 15 years after their invention, social networks have come to be both a powerful and a dangerous tool, capable of dividing societies, excluding minorities and promoting genocide. We want to bring into focus where our societies are heading if we leave the responsibility for the digital public sphere to private companies that turn outrage and collective uproar into money and, despite all lip service, make no genuine effort to counter these developments. We want to show that it’s no coincidence that political developments worldwide favour the elimination and exclusion of everything that “disturbs”, rather than dealing with the underlying problems.

We need to be aware that this is not a “regrettable mistake” that can be fixed quickly. The architecture of social media is built that way: every extreme leads to attention in the form of clicks, likes and views. The more extreme and sensational a text, image or video is, the more attention it attracts. And that is exactly what the platforms want, because “likes” and “shares” can be converted into money in the form of advertising. So do not be surprised when fake news, under simple, lurid headlines, is shared and spread thousands of times in a matter of seconds. Do not be surprised when hatred and violence on social media grow. All of this is tied to the companies’ business model. When these companies tell us that the platforms are merely being abused for such content, that is simply wrong, because it is the architecture of the platforms themselves that seeks out the most extreme content.

Nonetheless, we should not forget the utopian potential of these networks. Facebook and others create new opportunities: suddenly, people around the world can network, communicate and organize themselves across borders. On the other hand, they create unprecedented new problems, because whoever or whatever does not appear here simply does not exist for billions of people. So who decides what we see and what we do not, what we think about and what we should not think about? Using multiple examples, our documentary shows how the deletion of posts and the blocking of accounts can have severe consequences. Most of the time it is critical voices that are silenced by the opaque decisions to delete certain content. At the same time, populists and terrorists are misusing those platforms to recruit new members or stir up hatred against minorities.

Zuckerberg and others claim they want to make the world more open and connected. Through our investigations into social media and their policies, we discovered quite the opposite: a completely opaque, secretive industry with a code of silence around any kind of problem or mistake. These companies do everything they can to hide the fact that they undermine democracy. Anybody who lets the public know how things are handled there faces serious reprisals. All of that reminds us more of the way citizens are treated in dictatorships than of an open and connected world.

We often have the feeling that technology is neutral, but that’s not the case.

Get a copy of the November issue of MiNDFOOD magazine for the full article.
