The revelation that Cambridge Analytica exploited the data of 50mn Facebook profiles to target American voters is indeed frightening. But Cambridge Analytica shouldn’t act as a diversion from the real bad guy in this story: Facebook. It is mystifying that as his company regulates the flow of information to billions of human beings, encouraging certain purchasing habits and opinions, and monitoring people’s interactions, Mark Zuckerberg is invited to give lectures at Harvard without being treated with due scepticism.
We have now reached the point where an unaccountable private corporation is holding detailed data on over a quarter of the world's population. Zuckerberg and his company have been avoiding responsibility for some time. Governments everywhere need to get serious about how they deal with Facebook.
After trolls were sent to jail for sending threatening messages to the activist Caroline Criado-Perez and MP Stella Creasy, a debate ensued over whether the likes of Facebook and Twitter should be classified as platforms or publishers. Facebook is treated as if it is simply a conduit for information, meaning it is not liable for the content its users share – in the same way that BT can’t be sued when people make threatening phone calls.
In 2014 Iain MacKenzie, a spokesperson for Facebook, said, “Every piece of content on Facebook has an associated ‘report’ option that escalates it to our user operations team for review. Additionally, individuals can block anyone who is harassing them, ensuring they will be unable to interact further. Facebook tackles malicious behaviour through a combination of social mechanisms and technological solutions appropriate for a mass-scale online community.”
But the company is evasive about the number of moderators it employs, how they work, and how decisions are made. It has started taking a firmer line on far-right content – recently removing Britain First pages from the site – but it is still resisting many legislative attempts to regulate its content. What content users see is decided by an algorithm that can change without any consultation with governments or with the businesses that rely on Facebook for revenue, meaning that some of those businesses can be quickly wiped off the map. In February 2018 the website Digiday reported on LittleThings, a four-year-old site that shut down overnight after Facebook decided to prioritise user posts over publisher content. A hundred jobs were lost.
Facebook wasn’t the only contributor to LittleThings’ demise, but those working at the website said there was nowhere else to go after the algorithm change. And this isn’t the only example: in 2013 an algorithm change halved the traffic of viral content website Upworthy – something from which the website has never recovered.
The impact of Facebook’s dominance means that publications are constantly scrambling to keep up with the platform’s changing strategy. The editor-in-chief of Wired, Nick Thompson, recently told the Digiday podcast that there was a fear “Facebook has a dial somewhere that can be turned to cut off media that gets too uppity”.
Much has been made of the fact that Facebook creates “filter bubbles”. It has been criticised for prioritising content that users will like – meaning there is less diversity in the news stories people read – and for failing to crack down on propaganda. In fact, Italy’s new far-right star Matteo Salvini explicitly thanked Facebook for contributing to the country’s recent election results.
All this from a company that in 2016 paid just £5.1mn in corporation tax on its UK operations, despite profit and revenues nearly quadrupling on the back of increased advertising sales. In December 2017, Facebook announced it would start booking advertising revenue in countries where it was earned, instead of rerouting it via Ireland, although – as this newspaper reported – the move is “unlikely to result in it paying much more tax”. This is despite Zuckerberg calling for governments to start paying a universal basic income for every citizen as a response to automation, driven in part by Silicon Valley.
Even if we want to avoid the site and keep our data protected, it’s not as easy as one might think. According to Roger McNamee, an early investor in Facebook, the company uses techniques found in propaganda and casino gambling – such as constant notifications and variable rewards – to foster psychological addiction in its users. By keeping us hooked, Facebook is able to hold a huge amount of data on us. What is surprising, and worrying, is the derived data Facebook holds – the profiles it can build of its users from seemingly innocuous information. Wolfie Christl, author of the book Networks of Control, noted that a patent published by Facebook describes working out people’s commute times from mobile app location data, then using this and other data to segregate users into social classes.
Facebook’s massive data cache goes hand in hand with its acquisition of competitors. Nick Srnicek, author of Platform Capitalism, says, “Facebook is acting like a classic monopoly: it’s buying up competitors like Instagram, it’s blatantly copying rivals like Snapchat, and it even has its own app, Onavo, that acts to warn them of potential threats. All of this is combined with an unchecked sweeping up of our data that’s being used to build an impervious moat around its business.”
If Exxon Mobil attempted to insert itself into every element of our lives like this, there might be a concerted grassroots movement to curb its influence. So perhaps it’s time to start treating Facebook as the giant multinational corporation it is – especially because people with Facebook profiles aren’t the company’s customers: they are the product it sells to advertisers. - Guardian News & Media
