How Facebook’s Big Brother teams monitor you
Security is tight at the brick building on the western edge of Berlin. Inside, a sign warns that ‘everybody without a badge is a potential spy!’ Spread over five floors, hundreds of men and women sit in rows of six, scanning their computer screens. The New York Times reports that they are the agents of Facebook, and they have the power to decide what is free speech and what is hate speech.

The deletion centre, one of Facebook’s largest, has more than 1,200 content moderators. They are cleaning up content – from terrorist propaganda to Nazi symbols to child abuse – that violates the law or the company’s community standards. Germany, home to a tough new online hate speech law, has become a laboratory for one of the most pressing issues facing governments today: how, and whether, to regulate the world’s biggest social network.

Around the world, Facebook and other social networking platforms are facing a backlash over their failure to safeguard privacy, curb disinformation campaigns and limit the digital reach of hate groups. As the world confronts these rising forces, Europe, and Germany in particular, has emerged as the de facto regulator of the industry, exerting influence beyond its own borders.