Who You Gonna Believe? Inside the Strange, Frighteningly Influential Fake News Industry

There’s been a lot of talk about fake news — but who’s writing it? In scattered articles across various sites, the media has begun an investigation, slowly pulling back the curtain to reveal the savvy social media shysters who are filling our feeds with lies.
“I think Donald Trump is in the White House because of me,” one fake news purveyor told the Washington Post on Thursday.
“What do the Amish lobby, gay wedding vans and the ban of the national anthem have in common?” quipped the Post’s lead, answering that “they’re all make-believe — and invented by the same man.”
The 38-year-old Arizonan Paul Horner claimed he makes around $10,000 a month posting nothing but fake news stories. One laughable headline brought word that “The Amish in America Commit Their Vote To Donald Trump; Guaranteeing Him A Presidential Victory.” But Horner told the Post that it never, ever occurred to him that Trump might actually win. Ironically, at least one of his entirely imaginary articles was tweeted by Donald Trump’s campaign manager, as well as Trump’s son.
Google News listed his site as ABC News — which is, in fact, its name, though if you look closely, its domain is actually abcnews.com.co. And last Saturday another of his (fake) stories went viral — claiming President Obama had declared a second election, to be held on December 19. It was shared over 250,000 times on Facebook, and when asked why, Horner had a clear answer. “Honestly, people are definitely dumber… Nobody fact-checks anything anymore…”
Horner had actually believed that fact-checking would ultimately discredit the whole idea of paid protestors. “I mean that’s how this always works: Someone posts something I write, then they find out it’s false, then they look like idiots.” But instead, people just kept running with his story, and “Looking back, instead of hurting the campaign, I think I helped it. And that feels [bad].”
This week both Google and Facebook announced changes to their advertising programs to cut off the cash flow to these Facebook frauds. “The issue of fake news is critical for Google from a business standpoint, as many advertisers do not want their brands to be touted alongside dubious content,” reported Fortune. Horner applauds the efforts, but adds, “I just hope they don’t get rid of mine, too.”
But fear not, fake news lovers, he’s already planning ways to continue earning ad money from Google and Facebook — “I have at least 10 sites right now.” Because if they really did shut off all that easy money for fake news stories? “That would suck. I don’t know what I would do.”
The Macedonia Connection
Maybe people will do anything for money — even 6,000 miles away from America. Five days before the election, BuzzFeed traced many fake news sites to unemployed teenagers in Macedonia. Leading up to the 2016 election, the Macedonian town of Veles (population 45,000) “experienced a digital gold rush as locals launched at least 140 US politics websites.” Yes, even USADailyPolitics.com originated from Macedonia — and the largest of these fake news sites wound up with hundreds of thousands of followers.
One college student acknowledged to BuzzFeed that “the info in the blogs is bad, false, and misleading but the rationale is that ‘if it gets the people to click on it and engage, then use it.’” The sites can earn $3,000 in a single day if a fake story becomes popular enough, and word on the Macedonian street has the top site earning $5,000 each month.
Why not cover Macedonian politics? Because clickthroughs from the U.S. are worth four times as much to advertisers as clickthroughs from other countries, which, as BuzzFeed pointed out, “goes a long way in Veles.” Their only problem, they say, is that “the market has now become crowded, making it harder to earn money.”
In fact, another BuzzFeed analysis found that a high percentage of the stories on the top political Facebook pages — between 20 percent and 38 percent — were false or misleading, and concluded it came down to incentives.
“The best way to attract and grow an audience for political content on the world’s biggest social network is to eschew factual reporting and instead play to partisan biases using false or misleading information that simply tells people what they want to hear,” BuzzFeed reported.
The day after the election, former Facebook product designer Bobby Goodlatte published a startling post: “Sadly, News Feed optimizes for engagement. As we’ve learned in this election, bullshit is highly engaging.” Fake news sites are just giving users what they want, and “our news environment incentivizes bullshit.”
One study calculated that 62 percent of U.S. adults get news from social media. Two-thirds of Facebook users get news on the site, and since two-thirds of Americans use Facebook, that works out to Facebook news alone reaching 44 percent of America (0.66 × 0.67 ≈ 0.44). And according to a new study by Pew Research, “20 [percent] of social media users say they’ve modified their stance on a social or political issue because of material they saw on social media, and 17 [percent] say social media has helped to change their views about a specific political candidate.”
“An automated army of pro-Donald Trump chatbots overwhelmed similar programs supporting Hillary Clinton five to one in the days leading up to the presidential election, according to a report published Thursday by researchers at Oxford University,” the New York Times reports.
But Fortune recently proposed an alternate theory: that “People don’t share news stories on Facebook because they are true or factual. They share them because they feel true, or because sharing them is a way of signaling membership in a specific cultural group.”
Facebook: The Single Point of Failure
Back in early November, one researcher told the New York Times that this dynamic significantly changes the criteria for sharing and “creates an ecosystem in which the truth value of the information doesn’t matter. All that matters is whether the information fits in your narrative.”
Harvard’s Nieman Lab summed up the problem: “American political discourse in 2016 seemed to be running on two self-contained, never-overlapping sets of information.” On the day after Election Day, Nieman Lab’s Joshua Benton wrote, “the structures of today’s media ecosystem encourage that separation, and do so a little bit more each day.
“The decline of the mass media’s business models; the continued rise of personalized social feeds and the content that spreads easily within them; the hollowing-out of reporting jobs away from the coasts: These are, like the expansion of the universe, pushing us farther apart in all directions…
“There’s plenty of blame to go around, but the list of actors has to start with Facebook…” he wrote, calling Facebook “a single point of failure for civic information.” The day before the election, he spotted at least four fake stories on the Facebook feed for the mayor of Rayne, Louisiana, saying “Facebook has built a platform for the active dispersal of these lies” — and even an economic incentive. We’ve always lived in our own media bubbles, but Benton argues that now our bubbles have been weaponized. “There were just too many people voting in this election because they were infuriated by made-up things they read online.”
Even Edward Snowden weighed in. “To have one company that has enough power to reshape the way we think, I don’t think I need to describe how dangerous that is,” he said at the Fusion Real Future Fair.
Sometimes even the authors of fake stories don’t exist. In October, CNN reported on research that found more than a fifth of the Twitter traffic for both U.S. candidates was being created by bots.
And there’s a darker possibility — that some of the actual comments being left on real news sites may be sponsored by hostile foreign governments. A former correspondent from The Daily Show with Jon Stewart recently investigated what she described as “the troll-industrial complex” — a Russia-funded propaganda unit which, according to the New York Times, is staffed with 400 employees with a budget of roughly $5 million a year. “One employee estimated the operation filled 40 rooms,” the Times reported. On her cable show Full Frontal with Samantha Bee, the reporter joked that while America’s internet trolls are “sad, unemployed people who sit in their basement posting nasty comments online, in Russia, those people get paid by the government.”
Working with a contributor to The New York Times Magazine, Bee flew to Russia and interviewed one of the trolls, who said he had hundreds of fake accounts used to weigh in on the comments sections of America’s top news sites — the Wall Street Journal, the Washington Post, the New York Post — and of course, Twitter and Facebook.
Days later, Russia Today, a Russian government-funded website, argued that the whole interview was an elaborate hoax, improvised after its staff heard that the show was looking for Russian trolls. “Because they couldn’t find any, and we were sure they wouldn’t be able to, we thought: ‘OK, we’ll pretend to be trolls.’ So we did. And she fell into her own trap.”
So who you gonna believe now?
One of the most interesting moments in Bee’s interview came when one of her subjects responded to an accusation that state-sponsored commenters could ultimately change the results of the election. “And the Russian trolls are to blame for that? Maybe people are to blame too. They’re lazy and believe everything they read.”
It’s not going away. This Monday the Washington Post took a look at Prntly.com. “Founded by a former convict named Alex Portelli, Prntly is part of the broad diaspora of websites that takes news about American politics, frames it in a pro-Trump way (often at the expense of accuracy) and then peppers the page with ads.”
The day after the election, Google News was seen highlighting yet another fake article — misstating which candidate had won the popular vote. The Post discovered that Google News had linked to a WordPress site named 70news, which cites as its source a tweet from USASupreme.com, “another random website which doesn’t actually include the numbers themselves,” and which the Post’s article notes “looks an awful lot like Prntly, a made-up news website we looked at earlier this year.”
This expanded the conversation about fake news from just Facebook to Google. Google CEO Sundar Pichai told the BBC, “There should just be no situation where fake news gets distributed, so we are all for doing better here.” The BBC reporter asked him several times whether fake news could’ve affected America’s election results, especially since many races were close.
The BBC reports that he paused for a moment before answering, “Sure.”
WebReduce
- How one California city’s “Wellbeing Project” used data to improve the quality of life.
- Wearable devices from the 1800s included an electric corset.
- How an open source community is like a carnivorous plant.
- Moneyball author Michael Lewis acknowledges the original data science pioneers, Daniel Kahneman and Amos Tversky.
- How Stephen Wolfram designed a mechanism for interstellar travel in one night.
- Virtual reality may just leave us feeling sad.
Feature image via Pixabay.