But it's a misleading phrase — dangerously misleading.
The Russian actors who, through the Internet, interfered with America's 2016 presidential election and are again interfering with our 2018 midterm elections aren't anything like the "trolls" who've been a scourge of the Internet for two decades. Until we understand the difference, we won't understand the threat our democracy is facing.
Let's start with how these things came to be called "troll farms" in the first place. As the extent of Moscow's interference in the 2016 election came to light, the most fascinating element (beyond the possibility of collusion by the Trump team itself) was the idea of hundreds of Russians spending their hours in front of dimly lit computer screens, feeding Americans deliberately polarizing, false stories about our own country. We needed a word for bad actors online.
So, we turned to "trolls." This was familiar lingo to those who'd tracked online behavior since the emergence of the Internet. Many people gave in to the temptation to hide behind the anonymity of the Internet to harass, pester, and bully others — often complete strangers. Those who did were said to be engaged in "trolling." Soon those who picked unwinnable fights and bullied victims for the sheer sake of bullying them became "trolls."
What trolls are — and are not
Why do trolls do it? Professor Whitney Phillips, in a landmark book on trolls, explains that "trolls take perverse joy in ruining complete strangers' days." Ultimately, Phillips writes, "trolls are motivated by what they call lulz, a particular kind of unsympathetic, ambiguous laughter." They're in it for fun, even if it's a sickening form of fun that comes at others' expense. And trolls are fundamentally disorganized: They act as a flash mob, grouping together spontaneously to troll different targets and then going their own way.
Those whom Moscow employed to interfere with American democracy bear neither of these hallmarks. Quite the opposite. The Russian tweeters, Facebook posters, and YouTube commenters who weaponized social media in 2016 weren't in it for the fun or the laughter.
To the contrary, the individual grunts were in it for the money, and their bosses in the Kremlin were in it to destabilize American democracy and paralyze the United States. What's more, there was serious organization to the effort, with deliberate chains of command, subunits focused on particular messaging themes, careful cultivation of fake personas, and other specific tradecraft and tactics that were repeated and refined. As one former so-called Russian troll told the Washington Post, "My opinions were already written for me."
A deliberate disinformation campaign
This was, all told, a disinformation campaign carefully organized and managed by state security forces. Special Counsel Robert Mueller's indictment of Russia's Internet Research Agency, an entity registered with the Russian government, and of related Russian entities holding Russian government contracts describes deliberate campaign planning and execution: sending individuals to American soil to gather intelligence on what would become the campaign's audience; building a computer infrastructure designed to mask the Russian origins of the campaign's messages; coordinating activities with "unwitting individuals associated with the Trump campaign" to maximize the campaign's impact; and ensuring funding for the private firms that were key parts of the campaign's architecture.
The Agency itself was tidily organized into divisions that included a management group, a data analysis department, a graphics department, a search-engine optimization department, an information technology department, and a finance department.
Moreover, a separate indictment obtained by Mueller of 12 Russian intelligence officers for hacking the Democratic National Committee and the Hillary Clinton campaign described in detail phishing attacks, money laundering, and the attempted hacking of state election boards, all in service of the Kremlin's overall election interference campaign and all involving Russian government officials themselves.
To call those who were part of this elaborate architecture "troll farms" is to give entirely the wrong impression by suggesting that the motivation and organization bore some resemblance to lulz-seeking, disorganized Internet hooligans. The ranks of Moscow's social media army were nothing of the sort.
The difference isn't merely semantic. It's conceptual, and it's critical to protecting the health of our democracy. The best way to deal with trolls — real trolls — is to ignore them. That takes away their fun. It stalls their momentum. It leads them to look for other targets — or, better still, to turn on each other.
The opposite is true for a sophisticated state actor like the Kremlin. If Americans ignore what the Russians did to corrupt our democracy in 2016 — as President Trump, sitting in Helsinki face-to-face with Vladimir Putin, insisted we should do — then we're sure to see more of the same from Moscow.
Four ways to fight back
Indeed, US intelligence community chiefs have told us we're seeing more interference already as we head into the 2018 midterm elections. Recognizing Moscow's assault on American democracy as a goal-driven, coordinated activity entirely undeserving of the label "trolling" reveals the elements of an initial response that we still desperately need in place. First is better information-sharing between government and the tech sector on the latest trends and tactics so that both can be prepared to respond more swiftly and effectively.
Second is a set of new laws and regulations that would inject greater transparency into online political advertising. Third is experimentation by the tech sector with more aggressive, proactive ways to prevent disinformation campaigns from infecting their platforms. Fourth is punishment and thus deterrence of the Kremlin itself through tougher financial sanctions and continuing criminal indictments.
Strategic effectiveness begins with conceptual clarity. So, let's stop calling them "troll farms" and start calling them what they are: the Kremlin's disinformation army.