Social media

Increasingly manipulated forum

Manipulative messaging on social media is organised top-down and not geared to peers discussing real grievances. Authoritarian populists use it systematically.
“People gaze at their screens the whole day.” (Photo: picture-alliance/Photoshot)

Not even a decade ago, observers spoke of “Facebook revolutions” – for example, in regard to the uprisings of the Arab Spring. Back then, in 2011, social media were new and facilitated communication among equal peers. Governments were not paying much attention, and discontent articulated on Facebook fuelled mass protests in many places.

That was then. In the meantime, Facebook has become one of the most powerful and cheapest propaganda instruments ever to fall into the hands of authoritarian operators. The platform’s infamous algorithms note what kind of content a user likes and serve up more of it, gradually offering ever more radical material. Facebook thus reinforces conspiracy theories and disinformation. This destructive potential is being exploited by the Duterte administration in the Philippines (also see my essay in the Focus section of D+C/E+Z e-Paper 2018/05).

Trolling is very easy and takes only a few keystrokes. A troll can invent stories, lie and attack at will. There are almost no consequences; impunity is near total. Using a fake account adds a layer of protection. A troll network typically involves tens of thousands of accounts, many of them fake. Some operators are probably true believers, but many others are paid to follow instructions.

The Philippines is addicted to smartphones. People gaze at their screens the whole day, and Facebook is extremely popular. During the 2016 election campaign, the corporation offered to teach politicians how to use the platform.

In a clever move, Duterte propagandists mobilised millions of overseas Filipino workers (OFWs), whom politicians had previously mostly ignored. OFWs proved to be perfect amplifiers of disinformation. Their geographic dispersion turned out to be a strength: somewhere, somebody is awake at any time, ready to engage on Facebook.

The OFWs, however, are not doing all of the propaganda work. Observers who study online disinformation say that many troll networks are run by PR and advertising firms, call centres and even government agencies. Facebook itself has at least twice moved to take down pro-Duterte pages, some of them with hundreds of thousands of followers, accusing them of “inauthentic behaviour”. In Facebook’s corporate jargon, that term does not mean lying. It means that a page looks as though it represents an individual person but is actually guided by an undisclosed entity. So far, these occasional purges have hardly hurt the troll networks. More generally, Facebook’s efforts to stop the spread of fake news have mostly proven toothless.

Studies (Bradshaw and Howard, 2017; Ong and Cabanes, 2018) suggest that the troll networks are deliberately organised and paid for. They show that Facebook, Twitter, YouTube, Google and even Wikipedia can be gamed. The companies have proven either unable or unwilling to stem this kind of organised, manipulative behaviour, which is sponsored by powerful special interests and not geared to egalitarian peer-to-peer interaction.


Links

Bradshaw, S., and Howard, P.N., 2017: Troops, trolls and troublemakers.
https://comprop.oii.ox.ac.uk/wp-content/uploads/sites/89/2017/07/Troops-Trolls-and-Troublemakers.pdf

Ong, J.C., and Cabanes, J.V.C., 2018: Architects of networked disinformation.
http://newtontechfordev.com/wp-content/uploads/2018/02/ARCHITECTS-OF-NETWORKED-DISINFORMATION-FULL-REPORT.pdf