Led by President Donald Trump, Republicans have railed repeatedly against The New York Times, Washington Post and CNN — but, as a group, they trust mainstream sites like those more than “fake news” sites or even highly partisan outlets like Breitbart and InfoWars, according to a Yale study that directly mirrored the methodology being used by Facebook to identify “trusted” sites.
By the same token, Democrats hold a definitively dim view of Fox News, but still rate it as more trustworthy than most fake and hyperpartisan sites.
The Yale study, published on Tuesday, suggests that as Facebook conducts its own surveys, designed to help the massive social network rank sources in users’ news feeds, mainstream media organizations could get a boost over fake and highly partisan sites.
The authors of the study — psychologists David Rand and Gordon Pennycook — said their findings suggest that Facebook may have developed a highly effective tool for weeding out false or deliberately misleading content. But the authors warned that they knew too little about how Facebook would use the information to be able to say for sure. Based on publicly available information, they said Facebook may be taking the wrong approach.
On Jan. 19, Facebook CEO Mark Zuckerberg announced that, to help improve the quality of sources in news feeds, users would be polled on the trustworthiness of different outlets. They would occasionally be asked, “Do you recognize the following websites?” and then, “How much do you trust each of these domains?” Users would then be given five options, from “not at all” to “entirely,” to rate their trust.
Facebook was knocked by some for the poll’s brevity, while others wondered whether partisan bias would affect the trust ratings for mainstream news outlets—particularly those that have come under heavy political attack by Trump.
But Rand said, “This supersimple survey actually does a remarkably good job of separating real news organizations from hyperpartisan or fake news organizations.”
He continued, “Although partisan bias does exist, it’s much smaller than just the baseline ability to differentiate between real credible news outlets and totally unreliable sites.”
The problem, according to Rand, is that Zuckerberg has said Facebook will count users’ trust ratings only for sites that they say they’ve heard of. In other words, if a user’s answer to the first question—do you recognize this website?—is no, their answer to the second question is thrown out.
Rand said that broad unfamiliarity with a site can be a good signal that it’s unreliable. After all, “fake news” is often peddled on URLs that few people have ever heard of.
In their study, Rand and Pennycook asked more than a thousand people to assess the trustworthiness of 60 news sites — 20 mainstream; 20 “hyperpartisan,” like Breitbart, InfoWars and Daily Kos; and 20 “fake news” — using the exact same language that Facebook employs in its surveys. The study, which has not yet been peer-reviewed, was conducted on Amazon Mechanical Turk, a crowdsourcing marketplace often used for academic surveys.
The authors found that trust ratings for fake and hyperpartisan sites were significantly lower than mainstream outlets—as long as all trust ratings were included, regardless of whether the respondent had heard of the site.
“All but one mainstream source, Salon, was rated as more trustworthy than every hyperpartisan or fake news source when equally weighting ratings of Democrats and Republicans,” Rand and Pennycook wrote in their study.
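The equal-party weighting the study describes can be sketched in a few lines. The ratings below are hypothetical, and the 0-to-1 trust scale is an assumption inferred from how the study reports its scores, not the authors’ actual coding:

```python
from statistics import mean

# Hypothetical trust ratings on an assumed 0-to-1 scale (the study's
# actual response coding is not specified in this article).
dem_ratings = [0.75, 0.5, 0.75]   # more Democrats sampled...
rep_ratings = [0.25, 0.5]         # ...than Republicans

# Equal party weighting: average each party's mean trust, so the
# larger sample does not dominate a site's overall score.
balanced_score = mean([mean(dem_ratings), mean(rep_ratings)])

# A raw pooled mean, by contrast, tilts toward the larger group.
pooled_score = mean(dem_ratings + rep_ratings)
```

The design choice matters whenever one party is oversampled: a raw pooled mean lets the larger group pull a site’s score toward its own view, while averaging the two party means gives each side equal say.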
But when only the ratings of those familiar with the outlets were counted, Rand and Pennycook wrote, “Thirty percent of the mainstream media websites (Salon, The Guardian, Fox News, POLITICO, Huffington Post and Newsweek) received lower trust scores than the most trusted fake news site (news4ktla.com) when excluding unfamiliar ratings.”
If all responses are counted, PBS received the highest trust score (.65 on a one-point scale, roughly translating to halfway between trusting “somewhat” and “a lot”), followed by the Times, NBC News and CBS News (all .55).
The lowest-rated of the 20 mainstream outlets was Salon, which scored .19 across all responses but .34 among those who had heard of it, followed by POLITICO (.29 and .42) and The Guardian (.34 and .42). Rand said the low scores for some outlets could reflect less brand awareness among respondents; the highest-scoring outlets were long-standing legacy brands.
That’s largely why fake news sites scored so low, he said. Most came in between .1 and .2 if all responses were considered, but between .24 and .48 if only responses from people familiar with the sites’ URLs were counted.
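The filtering effect Rand describes can be illustrated with a short sketch, assuming a hypothetical obscure site and an even 0-to-1 mapping of the five response options (neither of which comes from the study):

```python
from statistics import mean

# Hypothetical ratings: (recognizes_site, trust on an assumed 0-to-1
# scale, where 0 = "not at all" and 1 = "entirely").
ratings = [
    (False, 0.0),   # never heard of the site and does not trust it
    (False, 0.0),
    (False, 0.25),
    (True, 0.5),    # recognizes the site, trusts it "somewhat"
    (True, 0.75),   # recognizes the site, trusts it "a lot"
]

def trust_score(ratings, exclude_unfamiliar):
    """Mean trust rating, optionally dropping unfamiliar respondents."""
    kept = [trust for known, trust in ratings
            if known or not exclude_unfamiliar]
    return mean(kept)

all_responses = trust_score(ratings, exclude_unfamiliar=False)  # 0.3
familiar_only = trust_score(ratings, exclude_unfamiliar=True)   # 0.625
```

As in the study’s fake-news examples, dropping unfamiliar respondents roughly doubles this hypothetical site’s score, because the many respondents who have never heard of it, and who distrust it, no longer count.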
“If [Facebook is] doing what they say in the post, which is ignoring responses from people who say they’re unfamiliar with the outlet, that is a bad idea,” Rand said.
He added, “The more familiar people are with an outlet, the more likely they are to trust it.”
The downside to such an approach, Rand acknowledged, would be making it more difficult for new or credible but lesser-known sites to break through.
A Facebook spokesperson contested Rand and Pennycook’s conclusion, saying that users cannot assess the trustworthiness of a news outlet they have never heard of.
“People can only trust the source if they know the source,” a Facebook spokesperson said. “To do the alternative would create the possibility a publication gets greater distribution for being trusted, even though the people weighing in have no familiarity with the publication.”
Facebook has said relatively little about how it intends to use information from the survey, though the spokesperson did clarify that its goal is not to favor one credible source over another. For instance, if users respond that The New York Times is more trustworthy than The Washington Post, that does not mean the Times will appear more in users’ news feeds. Instead, the point is to weed out bad sources.
“This is about getting feedback from people who use Facebook in order to improve quality—and fight click-bait, sensationalism and misinformation. It’s not about stack-ranking news organizations. If a broad sample of people recognize a news organization and trust it, that’s a good thing. If they don’t recognize it or don’t trust it, that’s not as good,” the Facebook spokesperson said.
He added, “It’s important to understand this is used as one signal among many for News Feed distribution.”
Rand acknowledged that it’s impossible to know exactly how Facebook is using the information from its surveys internally, and said that the company may well be weighing responses in ways that better separate out “fake news” sites.
“When Facebook gives people a two-question survey, the answers to the survey is not the only thing they have,” Rand said.
The company also holds vast reserves of data on users: demographic information, Facebook activity and, thanks to partner agreements, a wealth of information about their behavior on other sites.
“They could and I would guess are using that additional information to help inform their interpretation of that person’s responses to the trust survey,” Rand said. “But because they’re not telling us, we have no idea what they’re doing and how much sense it makes.”
When Zuckerberg introduced the trustworthiness survey, he framed it as a way to reduce Facebook’s role as a gatekeeper, and return power to users.
“We could try to make that decision ourselves, but that’s not something we’re comfortable with,” he wrote. “We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you—the community—and have your feedback determine the ranking.”
He added, “We decided that having the community determine which sources are broadly trusted would be most objective.”
But other researchers who’ve studied trust in news said it’s impossible for Facebook to avoid its gatekeeper role. After all, Facebook can poll its users, but the company ultimately decides how the surveys’ results are assessed and pumped into its algorithms.
Nic Newman, a digital media strategist at the Reuters Institute for the Study of Journalism, echoed Rand’s point that it is not yet known what other factors, whether location, age or other characteristics, Facebook will consider when weighting users’ responses.
“I don’t expect them to say, ‘We are going to give you the details of how we’re defining authority or trust in the platform,’ because that could be gamed,” Newman said, adding, “I think at a very high level that’s the sort of thing they need to communicate about.”
Mike Kearney, a journalism professor at the University of Missouri who has studied trust in news, noted how Facebook has said that the survey is just one ingredient in what determines what appears in users’ news feeds—and nobody outside the company knows the rest of the recipe.
“It’s impossible at this point for Facebook to offer their platform to users without being the gatekeeper in some way,” Kearney said, adding, “I would love to see them step up and own that they are an integral part of the flow of information in a democratic society.”