How Russia Used Social Media Marketing Techniques for Propaganda

Jonathan Albright, a professor of communications at Elon University, has written a massive analysis showing that Russian social media accounts are still being undercounted and probably under-analyzed. To summarize his post very quickly: there is an additional ecosystem of Russian propaganda on services like Instagram that has not been included in most discussions of Russia’s activities on the “Big Three” digital media companies (Twitter, Facebook, Google), and that ecosystem has an alarming staying power even after the accounts are deleted.

So it got me thinking: what Russia pulled off in the 2016 election was, essentially, an incredibly successful guerrilla marketing campaign. In other words, the Russian intelligence services co-opted the techniques and infrastructure that companies use to sell us stuff, and applied them in a very creative and unexpected way in an effort to influence the outcome of the election (and American politics in general).

Perhaps it was not totally unexpected. Russia watchers had known for a while that there was a concerted effort to mobilize social media to sow chaos. In 2015, Adrian Chen documented how “troll accounts,” operated from a dedicated agency in St. Petersburg, Russia, fabricated a chemical plant explosion in Louisiana to map out how and under what circumstances people would share breaking news on Twitter, YouTube, and elsewhere. And in 2014, Peter Pomerantsev explained how Russia had “weaponized absurdity and unreality” in an effort both to justify its illegal annexation of Crimea and to disrupt western efforts to support Ukraine against Russia’s illegal military incursions into the Donbas region. There are other examples, too, but it is important to acknowledge that Russia using social media to disrupt the news was not a brand new phenomenon during the 2016 presidential election.

What made the Russian effort so effective was how it targeted specific weak points in American society and amplified those weaknesses into fracture points. I discussed some of those in a piece last year analyzing the repeated email hacks against Democrats. “From a policy perspective, it is far from clear how to defend citizens and private organizations against a sophisticated attack on their private correspondence that will be used for a propaganda campaign during an election,” I wrote at the time.

The way Russia weaponized information for the election was multifaceted, which is probably why it is so hard to really understand. But one thread running through all of these different attack vectors was an almost surgical precision in using marketing techniques to identify and exploit weaknesses in how our society functions.

Take the email hacks, which have faded from memory while everyone tripped over themselves to get angry at Facebook. Lots of private correspondence was hacked and leaked, and most of it was meaningless drivel, from John Podesta’s recipe for risotto to Colin Powell calling Hillary Clinton “not transformational.” It was all gossipy nonsense, but the Washington, DC press corps is famously scandal-hungry, and it did not take much thinking to realize that handing them a bunch of insider gossip would prove irresistible. (The email story becomes more ominous when one considers that the Trump campaign had insider knowledge months before it broke, but that is a discussion for another time.)

Back to the marketing: Russia stole thousands of emails and distributed them to a press that it knew would amplify and scandalize them far out of proportion to their actual substance. The growth of an independent right-wing media ecosystem, established in opposition to more politically neutral traditional media outlets, presented another opportunity for Russian marketing to gain a foothold. A Harvard study noted:

We find that the structure and composition of media on the right and left are quite different. The leading media on the right and left are rooted in different traditions and journalistic practices. On the conservative side, more attention was paid to pro-Trump, highly partisan media outlets. On the liberal side, by contrast, the center of gravity was made up largely of long-standing media organizations steeped in the traditions and practices of objective journalism.

This created a fertile environment for amplifying messages designed to trigger emotional responses by partisans mostly, but not entirely, on the right (a Buzzfeed study of several hundred news outlets created for the election found that “Of the 667 websites in the database, 490 are conservative and 177 are liberal…the partisan conservative universe is bigger and more established than its counterpart on the left”). This amplification system was exploited most worryingly in an astroturfed protest at a Houston-area Islamic center, where Russian-linked social media accounts encouraged both anti-Muslim and pro-tolerance groups to show up and confront each other.

In addition, a branch of the ongoing FBI investigation into Russian election meddling is focusing on the role right-wing media websites played in amplifying Russian messaging, whether knowingly or unknowingly:

The bots’ end products were largely millions of Twitter and Facebook posts carrying links to stories on conservative internet sites such as Breitbart News and InfoWars, as well as on the Kremlin-backed RT News and Sputnik News, the sources said. Some of the stories were false or mixed fact and fiction, said the sources, who spoke on condition of anonymity because the bot attacks are part of an FBI-led investigation into a multifaceted Russian operation to influence last year’s elections.

There are, seemingly, endless examples of how Russia tricked the polarized media system into amplifying certain messages, but this same ecosystem was also fertile ground for original content that could be amplified over social networks without any input from media organizations.

In communications and public relations, it can sometimes be difficult to measure the return on investment for social media spending. Businesses try to correlate purchasing behavior with various engagement metrics: click-through rates, re-sharing, viewing time per ad, and so on. For non-profits, success is measured somewhat differently, whether in donation behavior or perhaps in newsletter subscriptions. The point is, there are metrics for understanding a specific outcome you want from a social campaign, and you can analyze the different ways that internet users interact with your content to see where and how effective posts generate expected outcomes, even if, most of the time, those links are correlative and cannot be shown conclusively (this is true of marketing in general).
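To make the idea concrete, here is a minimal sketch of the kind of back-of-envelope metrics a marketer computes for a social campaign. All the input numbers below are hypothetical, invented purely for illustration; they are not drawn from any actual campaign.

```python
def campaign_metrics(spend, impressions, clicks, conversions):
    """Return the common paid-social engagement metrics discussed above."""
    return {
        "cpm": spend / impressions * 1000,  # cost per thousand impressions
        "ctr": clicks / impressions,        # click-through rate
        "cpc": spend / clicks,              # cost per click
        "cost_per_conversion": spend / conversions,  # e.g., purchase or signup
    }

# Hypothetical campaign: $5,000 spend, 2M impressions, 8,000 clicks, 400 conversions
metrics = campaign_metrics(spend=5_000, impressions=2_000_000,
                           clicks=8_000, conversions=400)
for name, value in metrics.items():
    print(f"{name}: {value:.4f}")
```

The interesting part, as the paragraph above notes, is not computing these ratios but deciding which outcome ("conversion") you care about; for a non-profit the last line might be cost per donation or per newsletter signup instead.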

But with information operations and propaganda, it is much more difficult to tie an outcome to an effort, because the outcomes are not always measurable. When it comes to Russia, one can reasonably conclude that it was as interested in “sowing chaos” as in any particular electoral outcome (though there is also evidence that Russia specifically wanted Trump to win). It isn’t clear how you would measure that, though it is clear that much chaos was sown.

Despite the murkiness of some Russian goals, we have a broad sense of what they paid for ads, and we can backtrack what sort of exposure they got. Let’s assume a generic set of Facebook engagement metrics, based on a normal marketing push: $0.50 per page like, $0.60 per thousand impressions, and $0.50 per click. Facebook has not released click-level engagement data for those ads, but the 10 million people who saw paid Russian content is a reasonable proxy for their impressions metric. By most accounts the “Internet Research Agency” in St. Petersburg spent about $100,000 on Facebook ads, which would mean they could have reasonably expected around 6 million or so impressions. In other words, they got exceptional value from their ad campaign.

But on social media, organic reach (that is, content reaching people through sharing, algorithmic reordering of a timeline, and so on) is often far larger than paid reach. Facebook estimates that around 126 million people viewed Russian content organically, a level of reach that is simply unprecedented in political advertising. It, too, represents an exceptional level of value considering the relatively small amount of money paid out for sponsored posts, especially considering how Facebook has been altering its algorithms to reduce organically viewed content.
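The value argument can be sketched with just the figures cited above (roughly $100,000 in ad spend, ~10 million paid views, ~126 million organic views). These are the article’s round numbers, not precise engagement data, so treat the output as an order-of-magnitude illustration:

```python
# Back-of-envelope cost-per-viewer using the round figures cited above.
spend = 100_000            # approximate Internet Research Agency ad spend, USD
paid_reach = 10_000_000    # people who saw paid Russian content
organic_reach = 126_000_000  # people who saw Russian content organically

cost_per_paid_viewer = spend / paid_reach                    # dollars per paid view
cost_per_total_viewer = spend / (paid_reach + organic_reach) # paid + organic combined
organic_multiplier = organic_reach / paid_reach              # organic amplification factor

print(f"cost per paid viewer:  ${cost_per_paid_viewer:.4f}")
print(f"cost per total viewer: ${cost_per_total_viewer:.5f}")
print(f"organic multiplier:    {organic_multiplier:.1f}x")
```

A penny per paid viewer, and organic sharing multiplying that reach by more than an order of magnitude, is the kind of ratio a commercial campaign would consider a runaway success.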

We remain frustratingly in the dark about how Facebook is measuring these things for Congress and for the public. The scourge of “fake likes,” whereby shady marketing groups generate fake users to click “like” on a bunch of pages in order to drive up page ranking, remains unsolved in the world of Facebook marketing. And organic reach has been fading on Facebook as the company pushes people toward paying for content to spread, rather than waiting for it to be shared and re-upped algorithmically. So for Russia to have acquired such enormous viewer numbers for the amount of money it spent is truly remarkable; it represents a degree of both knowledge and effort that would be highly marketable at a PR firm.

Facebook is, of course, not alone in this. Russia spent far more money on Twitter advertising, which is more freewheeling and much less moderated and filtered than Facebook. Twitter was also more active in soliciting Russian content, offering the state propaganda channel RT 15% of its total share of US election advertising (it has since rescinded permission for RT or the Sputnik channel to purchase ads). But the effects of engagement on Twitter are even harder to measure than on Facebook, especially considering the extremely high number of accounts that are just automated bots and not actual people. It is unclear if Twitter’s Russian content was as instrumental as Facebook’s, though the activity seems to have continued as Russian accounts expand their support of alt-right and white supremacist groups.

Back to Professor Albright: we are still dramatically undercounting the extent of Russian activity on social media during the election. But we can at least start to unravel a method of operation: Russia did not invent a new form of propaganda for 2016; rather, Russian intelligence services seem to have treated their effort to disrupt the 2016 election as a guerrilla marketing campaign on social media. They employed the tools, techniques, and metrics of the social marketing industry and deployed them against our political system, with unexpected, and extremely difficult to unravel, effects.

Looking at the 2018 midterm elections, it remains to be seen whether Russia will be able to do this again. Facebook was much more proactive in preventing Russian disruption of the French and German elections, and one hopes it would do the same for US elections. But European speech rights are more restricted than they are in the US: in France, for example, the media cannot cover the election during the weekend when the election takes place. In Germany, the government imposes heavy fines on social media companies that do not quickly remove hate speech. The First Amendment prohibits such laws in the US, which leaves us with vulnerabilities that can be exploited. One can hope that the social media industry will take seriously its pledge to safeguard its services against propaganda and malicious disinformation, but, as the saying goes, hope is not a strategy.

Joshua Foust used to be a foreign policy maven. Now he helps organizations communicate strategically and build audiences.