How Russia Used Social Media Marketing Techniques for Propaganda

Jonathan Albright, a professor of communications at Elon University, has written a massive analysis of how Russian social media accounts are still being undercounted and probably under-analyzed. To summarize his post very quickly: there is an additional ecosystem of Russian propaganda on services like Instagram that has not been included in most discussions of Russia’s activities on the “Big Three” digital media companies (Twitter, Facebook, Google), and that ecosystem has an alarming staying power even after the accounts are deleted.

So it got me thinking: what Russia pulled off in the 2016 election was, essentially, an incredibly successful guerrilla marketing campaign. In other words, the Russian intelligence services coopted the techniques and infrastructure that companies use to sell us stuff, and applied them in a very creative and unexpected way in an effort to influence the outcome of the election (and American politics in general).

Maybe it was not totally unexpected. Russia watchers had known for a while that there was a concerted effort to mobilize social media to sow chaos. In 2015, Adrian Chen documented how “troll accounts,” operated from a dedicated agency in St. Petersburg, Russia, fabricated a chemical plant explosion in Louisiana to map out how and under what circumstances people would share breaking news on Twitter, YouTube, and elsewhere. And in 2014, Peter Pomerantsev explained how Russia had “weaponized absurdity and unreality” in an effort both to justify its illegal annexation of Crimea and to disrupt western efforts to support Ukraine against Russia’s military incursions into the Donbas region. There are other examples, too, but it is important to acknowledge that Russia using social media to disrupt the news was not a brand new phenomenon during the 2016 presidential election.

What made the Russian effort so effective was how they managed to target specific weak points in American society and amplify those weaknesses into fracture points. I discussed some of those in a piece last year analyzing the repeated email hacks against Democrats. “From a policy perspective, it is far from clear how to defend citizens and private organizations against a sophisticated attack on their private correspondence that will be used for a propaganda campaign during an election,” I wrote at the time.

The way Russia weaponized information for the election was multifaceted, and that is probably why it is so hard to really understand. But one thread running through all of these different attack vectors was an almost surgical precision in using marketing techniques to identify and exploit weaknesses in how our society functions.

Take the email hacks, which have faded from memory while everyone fell over themselves to get angry at Facebook. Lots of private correspondence was hacked and leaked, and most of it was meaningless drivel — from John Podesta’s recipe for risotto to Colin Powell calling Hillary Clinton “not transformational.” It was all gossipy nonsense, but the Washington, DC press corps is famously scandal-hungry, and it did not take much thinking to realize that handing them a bunch of insider gossip would prove irresistible. (The email story becomes more ominous when one considers that the Trump campaign had insider knowledge months before it broke, but that is a discussion for another time.)

Back to the marketing: Russia stole thousands of emails and distributed them to a press that it knew would amplify and scandalize them far out of proportion to anything the emails actually contained. The growth of an independent right-wing media ecosystem, established in opposition to more politically neutral traditional media outlets, presented another opportunity for Russian marketing to gain a foothold. A Harvard study noted:

We find that the structure and composition of media on the right and left are quite different. The leading media on the right and left are rooted in different traditions and journalistic practices. On the conservative side, more attention was paid to pro-Trump, highly partisan media outlets. On the liberal side, by contrast, the center of gravity was made up largely of long-standing media organizations steeped in the traditions and practices of objective journalism.

This created a fertile environment for amplifying messages designed to trigger emotional responses among partisans mostly, but not entirely, on the right (a Buzzfeed study of several hundred new outlets created for the election found that “Of the 667 websites in the database, 490 are conservative and 177 are liberal…the partisan conservative universe is bigger and more established than its counterpart on the left”). This amplification system was exploited most worryingly in an astroturfed protest at a Houston-area Islamic Center, where Russian-linked social media accounts encouraged both anti-Muslim and pro-tolerance groups to show up and confront each other.

In addition, there is a branch of the ongoing FBI investigation into Russian election meddling that is focusing on the role right-wing media websites played in amplifying Russian messaging — whether knowingly or unknowingly:

The bots’ end products were largely millions of Twitter and Facebook posts carrying links to stories on conservative internet sites such as Breitbart News and InfoWars, as well as on the Kremlin-backed RT News and Sputnik News, the sources said. Some of the stories were false or mixed fact and fiction, said the sources, who spoke on condition of anonymity because the bot attacks are part of an FBI-led investigation into a multifaceted Russian operation to influence last year’s elections.

There are, seemingly, endless examples of how Russia tricked the polarized media system into amplifying certain messages, but this same ecosystem was also fertile ground for original content that could be amplified over social networks without any input from media organizations.

In communications and public relations, it can sometimes be difficult to measure the return on investment for social media spending. Businesses try to correlate purchasing behavior to various engagement metrics — click-through, re-sharing, viewing time per ad, and so on. For non-profits, it is measured somewhat differently, whether in donation behavior or perhaps in newsletter subscriptions. The point is, there are metrics for understanding a specific outcome you want from a social campaign, and you can analyze the different ways internet users interact with your content to see where and how effective posts generate the expected outcomes — even if, most of the time, those links are correlative and cannot be shown conclusively (this is true of marketing in general).
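The basic metrics described above can be sketched in a few lines of code. This is an illustrative calculation only; all the campaign figures below are hypothetical, not drawn from any real ad buy:

```python
# Illustrative engagement metrics for a hypothetical social media campaign.
# The spend, impression, and click figures are invented for demonstration;
# real campaigns pull these numbers from an ad platform's reporting tools.

def cpm(spend, impressions):
    """Cost per thousand impressions."""
    return spend / impressions * 1000

def cpc(spend, clicks):
    """Cost per click."""
    return spend / clicks

def ctr(clicks, impressions):
    """Click-through rate, as a fraction of impressions."""
    return clicks / impressions

# Hypothetical campaign: $5,000 spent, 2 million impressions, 8,000 clicks.
spend, impressions, clicks = 5_000, 2_000_000, 8_000

print(f"CPM: ${cpm(spend, impressions):.2f}")  # cost per thousand impressions
print(f"CPC: ${cpc(spend, clicks):.2f}")       # cost per click
print(f"CTR: {ctr(clicks, impressions):.2%}")  # click-through rate
```

Whether those numbers are “good” depends entirely on the outcome being purchased — a retailer cares about purchases per click, a non-profit about donations — which is exactly why the same arithmetic becomes murky when the desired outcome is political chaos.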

But with information operations and propaganda, it is much more difficult to tie an outcome to an effort, because the outcomes are not always measurable. When it comes to Russia, one can reasonably conclude that it was at least as interested in “sowing chaos” as in any specific electoral outcome (though there is also evidence that Russia specifically wanted Trump to win). It is not clear how you would measure that, though it is clear that much chaos was sown.

Despite the murkiness of some Russian goals, we have a broad sense of what they paid for ads, and we can work backward to estimate what sort of exposure they got. Let’s assume a generic set of Facebook engagement metrics, based on a normal marketing push: $0.50 per page like, $0.60 per thousand impressions, and $0.50 per click. Facebook has not released click-level engagement data for those ads, but the 10 million people who saw paid Russian content is a reasonable proxy for the impressions metric. By most accounts the “Internet Research Agency” in St. Petersburg spent about $100,000 on Facebook ads, which would mean they could have reasonably expected around 6 million or so impressions. In other words, they got exceptional value from their ad campaign.

But on social media, organic reach (that is, content reaching people through sharing, algorithmic reordering of a timeline, and so on) is often far larger than paid reach. Facebook estimates that around 126 million people viewed Russian content organically, which is a level of reach that is simply unprecedented in political advertising. It, too, represents an exceptional level of value considering the relatively small amount of money paid out for sponsored posts — especially considering how Facebook has been altering its algorithms to reduce organically viewed content.
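The figures cited above make for a simple back-of-envelope calculation. Note the assumption baked in: treating “people reached” as a rough stand-in for impressions, since Facebook has not released actual impression counts:

```python
# Back-of-envelope calculation from the figures cited in the text.
# Assumption: "people reached" is used as a rough proxy for impressions,
# since Facebook has not released per-ad impression counts.

spend = 100_000              # reported IRA ad spend, in USD
paid_reach = 10_000_000      # people who saw paid Russian content
organic_reach = 126_000_000  # Facebook's estimate of organic views

# Effective cost per thousand people reached by paid content.
effective_cpm = spend / paid_reach * 1000

# How many people saw the content organically for each person reached by ads.
organic_multiplier = organic_reach / paid_reach

print(f"Effective CPM: ${effective_cpm:.2f}")
print(f"Organic reach multiplier: {organic_multiplier:.1f}x paid reach")
```

The striking number is the multiplier: for every person reached by a paid post, roughly twelve more saw the content through shares and algorithmic distribution — and that ratio was achieved at a time when Facebook was actively suppressing organic reach.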

We remain frustratingly in the dark about how Facebook is measuring these things for Congress and for the public. The scourge of “fake likes,” whereby shady marketing groups generate fake users to click “like” on a bunch of pages in order to drive up page ranking, remains unsolved in the world of Facebook marketing. And organic reach has been fading on Facebook as they try to push people toward paying for content to spread, rather than waiting for it to be shared and re-upped algorithmically. So for Russia to have acquired such enormous viewer numbers for the amount of money it spent is truly remarkable — it represents a degree of both knowledge and effort that would be highly marketable at a PR firm.

Facebook is, of course, not alone in this. Russia spent far more money on Twitter advertising, which is more freewheeling and much less moderated and filtered than Facebook. Twitter was also more active in soliciting Russian content, and offered the state propaganda channel RT 15% of its total share of US election advertising (it has since rescinded permission for RT and Sputnik to purchase ads). But the effects of engagement on Twitter are even harder to measure than on Facebook, especially given the extremely high number of accounts that are automated bots rather than actual people. It is unclear whether Twitter’s Russian content was as instrumental as Facebook’s, though the activity seems to have continued as Russian accounts expand their support of alt-right and white supremacist groups.

Back to Professor Albright: we are still dramatically undercounting the extent of Russian activity on social media during the election. But we can at least start to unravel a method of operation: Russia did not invent a new form of propaganda for 2016; rather, Russian intelligence services seem to have treated their effort to disrupt the 2016 election as a guerrilla marketing campaign on social media. They employed the tools, techniques, and metrics of the social marketing industry and deployed them against our political system, with unexpected effects that are extremely difficult to unravel.

Looking ahead to the 2018 midterm elections, it remains to be seen whether they will be able to do it again. Facebook was much more proactive in preventing Russian disruption of the French and German elections, and one hopes it will do the same for US elections. But European speech rights are more circumscribed than those in the US — in France, for example, the media cannot cover the election during the weekend when voting takes place. In Germany, the government imposes heavy fines on social media companies that do not quickly remove hate speech. The First Amendment prohibits such laws in the US, which leaves us with vulnerabilities that can be exploited. One can hope that the social media industry will take seriously its pledge to safeguard its services against propaganda and malicious disinformation, but, as the saying goes, hope is not a strategy.

Joshua Foust is a writer and analyst who studies foreign policy.