The term "fake news" is used by many but understood by few. It has gained notoriety as a term of derision that political figures aim at mainstream media outlets. But look closer and you can see many other forms that are much more subtle and far more dangerous. The public relations firm Ogilvy has written about several different types of fake news: satire, misinformation, sloppy reporting, and purposely deceptive content.
But that taxonomy really doesn’t help matters, especially in the modern era of state-sponsored fake news. We used to call this propaganda back when I was growing up. To better understand this modern context, I suggest you examine two new reports that present a more deliberate analysis and discussion:
- The first, by Renee Diresta and Shelby Grossman for Stanford University’s Internet Observatory project, is called Potemkin Pages and Personas: Assessing GRU Online Operations. It documents two methods used by Russia’s military intelligence agency, commonly called the GRU: narrative laundering and hacking-and-leaking false data. I’ll get into these methods in a moment. For those of you who don’t know the reference, a Potemkin village was a fake village built in the late 1700s to impress a Russian monarch, who would pass through a region and be fooled into thinking actual people lived there. It was a stage set with facades and actors dressed as inhabitants.
- The second report, titled Simulated media assets: local news, comes from Vlad Shevtsov, a Russian researcher who has investigated several seemingly legitimate local news sites in Albany, New York and Edmonton, Alberta. These sites construct their news pages out of evergreen articles and other service pieces that have attracted millions of page views, according to analytics. Yet they have curious characteristics, such as being viewed almost entirely from mobile devices outside their local geographic areas.
Taken together, these reports show a more subtle trend in how “news” can be manipulated and shaped by government spies and criminals. Last month I wrote about Facebook and disinformation-based political campaigns. Since then, Twitter has announced it is ending all political advertising. But the focus on fake news in the political sphere is a distraction. What we should understand is that the entire notion of how news is created and consumed is undergoing a major transition. It means we have to be a lot more skeptical of what news items are being shared in our social feeds and how we obtain facts. Move over, Snopes.com: we need a completely new set of tools to vet the truth.
Let’s first look at the Shevtsov report on the criminal news sites, for that is really the only way to think about them. These are just digital Potemkin villages: they look like real local news sites, but they are just containers used by bots to generate clicks and ad revenue. Buzzfeed’s Craig Silverman provides a larger context in his analysis here. These sites gather traffic quickly, stick around for a year or so, and then fade away after generating millions of dollars in ad revenue. They take advantage of legitimate ad-serving operations, including Google’s AdSense, and quirks in the organic search algorithms that feed them traffic.
This is a more insidious problem than seeing a couple of misleading articles in your social news feed, for one reason: the operators of these sites aren’t trying to make a political statement. They just want to make money. They aren’t even trying to fool real readers: indeed, these sites probably attract few actual carbon-based life forms sitting at keyboards.
The second report, from Stanford, is also chilling. It documents the GRU’s efforts to misinform and mislead, using two methods.
— Narrative laundering. This turns something into a “fact” by repetition through legitimate-sounding news sources that are themselves constructs of GRU operatives. The technique has gotten more sophisticated since another Russian effort, led by the Internet Research Agency (IRA), was uncovered during the Mueller investigation. That entity (which was also state-sponsored) specialized in launching social media sock puppets and creating avatars and fake accounts. The GRU’s method involved creating Facebook pages that looked like think tanks and other media outlets. These “provided a home for original content on conflicts and politics around the world and a primary affiliation for sock puppet personas.” In essence, the GRU is “laundering” its puppets through six affiliated media front pages. The researchers identified Inside Syria Media Center, Crna Gora News Agency, Nbenegroup.com, The Informer, World News Observer, and Victory for Peace as being run by the GRU; their posts would subsequently be picked up by lazy or uncritical news sites.
What is interesting, though, is that the GRU wasn’t very thorough about creating these pages. Most of the original Facebook posts had no engagement whatsoever. “The GRU appears not to have done even the bare minimum to achieve peer-to-peer virality, with the exception of some Twitter networking, despite its sustained presence on Facebook. However, the campaigns were successful at placing stories from multiple fake personas throughout the alternative media ecosystem.” A good example of how the researchers figured all this out is how they tracked down who was really behind the Jelena Rakocevic/Jelena Rakcevic persona. “She” is a fake operative purporting to be a journalist, with bylines on various digital news sites. In real life, she is a biology professor in Montenegro whose listed phone number belongs to a Mercedes dealership.
— Hack-and-leak operations. We are now sadly familiar with the various leak sites that have become popular across the interwebs. These benefited from some narrative laundering as well: the GRU got Wikileaks and various mainstream US media outlets to pick up on their stories, making the operations more effective. What is interesting is that the GRU’s methods differed from those attributed to the IRA: “They used a more modern form of memetic propaganda—concise messaging, visuals with high virality potential, and provocative, edgy humor—rather than the narrative propaganda (long-form persuasive essays and geopolitical analysis) that is most prevalent in the GRU material.”
So what can you do to become a more critical news consumer? Librarians have been on the front lines of vetting fake news for years. Lyena Chavez of Merrimack College lists four easy “tells” that she often sees:
- The facts aren’t verifiable from the alleged sources quoted.
- The story isn’t published in other credible news sources (although we have seen how the GRU can launder a story to make it look more credible).
- The author doesn’t have appropriate credentials or experience.
- The story makes an emotional appeal rather than a logical one.
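Just as a toy illustration, the four tells above could be sketched as a rough screening script. Everything here is my own invention for the sake of the example: the field names, the word list, and the scoring are assumptions, not an actual tool from Chavez or any library. Real vetting takes human judgment; no script replaces it.

```python
# Hypothetical sketch: encode the four librarian "tells" as red flags.
# All field names and the emotional-word list are invented for illustration.

EMOTIONAL_WORDS = {"shocking", "outrageous", "slams", "destroys", "terrifying"}

def red_flags(article: dict) -> list:
    """Return a list describing which of the four 'tells' this article trips."""
    flags = []
    if not article.get("verifiable_sources"):           # tell 1
        flags.append("facts not verifiable from the sources quoted")
    if article.get("corroborating_outlets", 0) == 0:    # tell 2
        flags.append("no other credible outlet carries the story")
    if not article.get("author_credentials"):           # tell 3
        flags.append("author lacks relevant credentials or experience")
    headline_words = set(article.get("headline", "").lower().split())
    if headline_words & EMOTIONAL_WORDS:                # tell 4
        flags.append("story leans on emotional appeal rather than logic")
    return flags

suspect = {"headline": "Shocking report slams city hall", "corroborating_outlets": 0}
print(red_flags(suspect))  # trips all four tells
```

Even a crude heuristic like this makes the point: each tell is a cheap check, and a story that trips several at once deserves extra scrutiny before you share it.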
One document that is useful (and probably a lot more work than you signed up for) is this collection from her Merrimack colleague, Professor Melissa Zimdars. It offers tips and various open-source methods and sites that can help you with your own news vetting. If you want more, take a look at the entire curriculum that the Stony Brook J-school has assembled.
Finally, here are some tools from Buzzfeed reporter Jane Lytvynenko, who has collected them to vet her own stories.
- Reverse image search: Tineye.com
- Image metadata reader: regex.info/exif.cgi
- Image forensics: https://fotoforensics.com/
- Video verification browser plug-in: InVid
- Twitter analytics: foller.me
- Facebook search Github project: github.io/fb-search
- Website stats: https://spyonweb.com/
- Content monitoring: BuzzSumo and CrowdTangle
- Reverse AdSense search from DNSlytics
- Whois data (to verify domain ownership)
This comes from Paula Dunne, who points out two other important forms of misrepresentation and negligence in reporting:
- NOT reporting certain news (or minimizing it) because it won’t fit an overall narrative; and
- NOT calling on sources that you know aren’t likely to support your narrative, or who’ll give a completely opposite view.
Both of these are, I believe, more nefarious than some of the other types of fake news.
That’s why when we tune in to the opposite—left or right leaning—media we feel like we’re living in a parallel universe. We can’t count on mainstream media to give us both sides of the story any longer, and it’s obvious they don’t believe we’re smart enough to decide for ourselves.
I’m glad to see you writing about this, because people like us in/around the business have a responsibility to call out these issues or we become complicit.
Here are links to various documents about Russian meddling in our 2016 election, for reference:
— The DNI finding from January 2017 that Russia did indeed interfere: https://www.dni.gov/files/documents/ICA_2017_01.pdf
— A New York Times deep dive into the inaccurate claims that Ukraine was involved, that CrowdStrike’s servers were located in Ukraine, and other mythology:
https://www.nytimes.com/2019/10/03/us/politics/trump-ukraine-conspiracy.html
— A new study of misinformation on Twitter during the 2018 election:
https://misinforeview.hks.harvard.edu/article/russian-disinformation-campaigns-on-twitter/