The disinformation plague is far bigger than the Russians
At long last, people have stopped asking "Is it really happening?" or "Does it really work?" or "Does it even matter?" Facebook has acknowledged the existence of Russian disinformation on its platform and has finally banned sites created by the Internet Research Agency, the Russian organization dedicated to covert online propaganda. Twitter has removed automated Russian botnets. In recent weeks, hearings and major conferences have been convened in France, Britain and Brussels to discuss possible government responses to Russian disinformation campaigns within European democracies, too.
I've been to some of the conferences, testified at some of the hearings and written about the subject ever since Russia dialed up its propaganda war against the West in 2014, following its invasion of Ukraine. I can hardly object to the increased attention. And yet the belated enthusiasm for exposing Russian manipulation worries me, because it understates the scale of the problem, which is not confined to the Russians.
Thanks in part to the investigation of special counsel Robert S. Mueller III, we’ve all learned exactly how the Russians’ online tactics work. They use fake websites, fake Facebook pages and fake social media followers to give extra credence to extremist views, whether of the far right or the far left. They invent or manipulate stories — lifting them out of context, changing details, creating fake video — with the aim of provoking fear and deepening social divisions. In Germany, Russian trolls, bots and real-life Russian politicians famously pushed the invented story of “Lisa,” a Russian-German girl supposedly raped by Arab migrants. During the French election, Russian state media, supported by pro-Russian social media, promoted the story that Emmanuel Macron was backed by a “gay lobby” in the United States.
But all of these tactics, first used on a large scale by the Russians, are also available to others — and not just other authoritarians. Openly, and legally, they are used in Western democracies as well. As part of its current fearmongering, xenophobic election campaign, the Hungarian ruling party — a member in good standing of the center-right caucus of the European Parliament — used a range of platforms to disseminate a series of videos that were fake, taken out of context or technologically enhanced; in one case, audio of someone shouting "Allahu akbar" was edited in to make a supposed scuffle between Muslims and Christians seem more authentic. Although the Hungarian state (and state-backed) media do also republish and recycle more straightforward Russian material, most of the "Russian-style" material used in Hungary is not foreign; it comes from the government itself and is then promoted by Hungarian bots and Hungarian trolls.
Fox News and the Trump-friendly media operate in exactly the same way. As I’ve written in the past, Donald Trump openly used Russian slogans and narratives during his 2016 election campaign.
By now, though, he no longer needs to borrow from them. A recent New York Times analysis of how the president came to be obsessed with the "caravan of illegal aliens" listed the ways the original story was deliberately enhanced and misreported by what we would, in another country, call pro-regime media. As retold on "Fox & Friends," or hyped by Frontpage Mag, "Beltway pundit" and thousands of bots and trolls (both voluntary and professional), the story lost some critical details: that many in the group were refugees from Honduras's drug wars, that many planned to stay in Mexico, and that others hoped to cross the U.S. border legally to apply for asylum. By the time the tale of the caravan reached the president's Twitter feed — which has featured faked or mislabeled video in the past as well — it was an "invasion" requiring the presence of the National Guard.
I repeat: These are Russian tactics. But they are being used by a U.S. president, a Hungarian government and others who are not in power, or not yet. That’s why solutions to disinformation campaigns that focus on Russia alone are insufficient.
Here’s the real challenge faced by all the major platforms: how to re-engineer them to make them more resistant to organizations that, like the Internet Research Agency, engage in what one tech executive calls “coordinated inauthentic activity,” ranging from the use of false names and the creation of false audiences to the publication of false stories and the creation of divisive narratives. Perhaps they will have to limit the use of anonymity, change the algorithms that ensure that the most sensational material spreads the fastest, or institute transparency around video editing tools, especially as these become more sophisticated.