How disinformation is a major symptom of a sick democracy
As dirty tactics evolve between election seasons, more is at stake than which candidate wins
Sep 12, 2019
Our long-time helper was active in the last election season.
Frequenting Facebook, cycling between groups and posts, she involved herself in online political discourse. Over dinner she would recount heated debates, some bordering on the nasty side, occurring over Facebook comment sections in the pages of both the incumbent and aspiring mayor of our city.
Perhaps before, Filipinos could be accused of being politically apathetic, but in the digital media era, political discussions have swung mainstream.
Or perhaps, we were always political, and new media is simply taking an amplifier to a megaphone.
And in a country with megaphones in every street, it’s the quieter alleys that stand out.
– – –
In early August this year, a paper titled “Tracking Digital Disinformation in the 2019 Philippine Midterm Election” came out in the online scholarly journal “New Mandala.” Authored by Jonathan Corpus Ong, Ross Tapsell, and Nicole Curato, it examines the shady evolution of digital black operations in the country.
After defining disinformation, it outlines how the key players and tactics, as well as the economics behind it all, have changed since the 2016 national elections. It also cautions that existing efforts to curb disinformation, while laudable and with noticeable impact, have to quickly adapt and evolve.
Pass the message
Twice this year, I was offered work as a troll by an acquaintance from a Metro Manila local government unit. The first time was months before the May 2019 midterm elections. The second time was last week. At first, I thought they were pulling my leg, but when the offer was repeated, I got curious.
Various news reports and academic papers have explored the shadowy world of trolling, generally defined as deliberate online harassment of people, often via anonymous or unverified accounts. In a world shrouded in a mystique similar to that of Cold War spies, what writer would pass up the opportunity to infiltrate and practice a little Gonzo journalism in the supposed troll farms?
However, when I inquired again, I was told that whoever was looking for trolls (for the 2022 national elections) had already hired an entire marketing firm.
– – –
Disinformation, misinformation, fake news. These terms seem synonymous, but Ong, Tapsell, and Curato’s paper defines disinformation by its intent to mislead, while misinformation is incidental, accidental. The former is deliberate lying; the latter is the mishearing that commonly results from playing pass-the-message.
The first 18 months of the Duterte presidency, from mid-2016 to the rest of 2017, were a field day for spectacular disinformation. News reports then alleged that the president’s campaign managers invested heavily in troll armies, all while misleading photos and videos circulated from government channels, promoting martial law or equating lewd dances to federalism.
In contrast, such high-profile public scandals barely arose in 2018 and 2019.
If anything, the country witnessed the rise and fall of Mocha Uson, the “queen of fake news,” starting from the partisanship of her girl dance group in 2016, to her column in a major broadsheet, to her appointment as Assistant Secretary for Presidential Communications, to her subsequent resignation in order to run in the 2019 midterms, and her failure to win a seat.
Undoubtedly, many saw this as a win for truth.
But by then, the theater of sustained disinformation shifted from public figures and institutions to niche communities. Digital media was already leveraged in the 2016 national elections. Now, campaign managers allot a larger part of their funds towards social media machinery, in both legitimate (official pages, etc.) and under-the-table (troll armies, etc.) tactics.
From back-alley internet cafés to legitimate, registered ad agencies and marketing firms, disinformation grows in any climate; the anonymous and fragmented nature of the new tactics makes it hard to pinpoint any single perpetrator, despite the seeming victories against the Mocha Usons of society.
In light of all this, Facebook’s public policy director Katie Harbath described the Philippines as “patient zero” in the worldwide rise of digital disinformation.
This is because disinformation’s strategic center has shifted from who and what’s publicly visible to more insidious and localized settings: ‘micro-influencers’ and their Instagram followers, lifestyle YouTube vlogs impersonating existing legitimate news outlets, and community leaders paid to rouse specific-interest digital spaces like Overseas Filipino Worker (OFW) and Filipino conspiracy theorist Facebook groups.
“Since then, fake news … has aimed at hacking attention and manipulating conversation at the level of small communities and private groups,” the paper states.
It’s all about walking the talk as “precisely because micro- and nano-influencers (personalities with followers ranging from 1,000–100,000) lack mainstream fame, they appear more authentic and trustworthy to their small yet intensely dedicated followers.” This same mindset carries over to the ‘thought leaders’ in special interest groups and those running impersonator vlogs.
Paid influencers, paid followers
The iceberg goes really deep: disinformation runs on a chain of command, with strategists mapping out operations at the top while their lieutenants, the digital influencers, lead their ‘followers’ at the front lines.
It’s been noted how difficult it is nowadays to tell a fake account (trolls have many) from an ‘organic’ if livid supporter—an actual person—of a particular candidate. While candidates readily disclose that they hire professionals to manage their official social media accounts, these disclosures don’t cover the supposedly ‘organic’ groups and ‘supporter-created’ pages endorsing them.
The New Mandala paper puts it this way: “The black ops operation is located in an office sometimes not even in the same city, who reports to a public relations (PR) company, who reports to the candidates’ relative or close friend, who may or may not report to the central social media team who in turn may or may not report to the candidate.”
This murky, convoluted system essentially prevents a candidate from being held liable for using disinformation. The authors point out that compared to other countries, the Philippines lacks an accountability system for “political consultancy disclosures.”
Smaller and smaller circles
Does alternative news really deserve its bad rep? Before, the term applied to publications covering stories the mainstream media shied away from. Often center-left, many of these organs were funded by donations instead of ad revenues.
But in recent years, ‘alternative news’ has more commonly been associated with attempts to impersonate news outlets, not so much for humor but for name recall and the off-chance that someone mistypes while browsing (i.e., “t1me magazine” instead of “Time magazine”). Often, disinformation comes out.
Facebook groups are similarly exploited: admins build off the ‘organic’ emotions of members in order to sustain a political agenda. Where public pages and open groups are easily monitored by Facebook’s security team, the privacy settings of closed groups, such as those for OFWs, have been exploited to amplify echo chambers and foster hyperpartisan politics.
Fears of economic conditions in the Philippines, wariness toward the Philippine elite, nostalgia for a perceived golden age, and aspirations for better days end up exploited.
The money trail
When economic considerations factor heavily, ethical questions may factor less.
The New Mandala paper reveals that many respondents who moonlight as trolls do so out of desperation. With basic salaries unable to keep pace with the cost of living in urban centers, otherwise good people turn to such work to make ends meet.
Nano-influencers, the lowest tier of influencers by follower count, get paid roughly P5,000 to P10,000 per post. Macro-influencers, whose followers number in the millions, are vulnerable to public scandal and regulation like other big celebrities, while nano-influencers often fly under the radar.
Thankfully, many small-influencer posts in the last election were made without much thought. A candidate’s face wedged between thirst-trap half-nudes on one’s grid? Followers were quick to pick up on something contrived.
Trolls and followers-for-hire, meanwhile, who often operate with template messages and comments, get paid an average of around P12,000 a month. Maybe these accounts were the only ones still commenting “nice bod!” on a nano-influencer’s candidate-between-the-beach pics.
And yet this report contradicts others suggesting that the starting pay of a troll is often twice that of a government employee or journalist. Is it possible that respondents to both the New Mandala paper and similar works are deliberately lying about their pay?
Often, trolls work seasonally, and often without a contract—especially since a contract would be evidence that (finally!) puts this open secret on record.
All in all, operations have gone underground to skirt more vigilant above-ground measures, such as content regulation and the increased scrutiny of public figures.
Politicking time bombs
I stick to my college ethics professor’s definition that “politics is about consensus-building.” More than anything else, politics isn’t about the personalities, or who’s right or wrong, but is a process of community-building via addressing common concerns.
Democracy, with its emphasis on accountability and safe public spaces for healthy dissent and discourse—and this now includes digital spaces—is a recent development.
Looking at the bigger picture of political systems across history, where power rested either with families or a small cluster of elites, democratic systems are ideally the answer to entrenched power, even if their application has seen varying degrees of success from society to society.
The more cynical may say that democracy is just a disguised oligarchy, but that’s where systems come into play. And values are what drive systems. The paper identifies “a broader project” by disinformation agents towards “undermining values in society.” Through an oversimplified narrative of us-vs-them, good-vs-evil, “[disinformation] narratives cement divides such that basic principles for deliberative exchange are systematically refused.”
Social media has further democratized the ability of ordinary citizens to call out executive excess, compete with and rectify media framing, and organize mass movements. But this same technology has also somehow enabled a steady erosion of democratic systems.
Perhaps it’s human ingenuity (for better or worse), rather than social media, that’s undermining the best political system we presently have.
The prevalence of fake news and disinformation isn’t so much a cause of a sick democracy as it is a symptom of it. After all, hyperpartisanship feeds off existing socio-economic and political anxieties. Here, legitimate sentiments get co-opted into the disinformation cycle. Complex issues, such as drug abuse and relations with China, are reduced to simplistic terms: pro- or anti-candidate, party, or country.
Battle lines drawn
Yet the fight goes on as the authors argue for “bespoke” solutions all while cautioning against heavy-handed approaches impinging on privacy and free speech.
Ong, Tapsell, and Curato propose the following:
For one, the advertising and PR industry can expand self-regulatory measures beyond corporate projects and into what’s now termed ‘political consultancy.’
One example of such a regulatory law—sans the repression practiced in other countries—is the proposed Fair Elections Act, which would require candidates to disclose all election-related digital communications.
Furthermore, it’s hoped that the Commission on Elections expands existing regulations for traditional media to cover digital media, and not just for candidates’ official social media accounts but on “informal work arrangements” such as employing influencers, targeting small online communities, and other social media accounts tied to their campaign.
So far, Facebook has intensified fact-checking efforts, working with Rappler, Vera Files, and Agence France-Presse as third-party fact-checkers. Content they flag gets demoted by the algorithm, appearing less often on (but not erased from) newsfeeds. Of course, these decision-making processes and the feedback loop between developers and fact-checkers must be made publicly available.
Cheesy as it sounds, it’s about getting together. Concerned sectors should form alliances built around a systems-based democracy and its attendant principles. Labor concerns in the media and related industries must also be addressed to prevent otherwise well-meaning workers from resorting to black ops jobs.
Already, candidates and their kingmakers are preparing for the 2022 national elections. At this point, it doesn’t matter who wins if democracy loses.
Special thanks to Sarah Torres, co-founder of Out of The Box Media Literacy, for her insights.