Facebook Takes Measures to Prioritize Trustworthy News Sources


Facebook, which is reducing the amount of news in its news feed, is now going to prioritize information from the publishers that remain on the social network by first measuring how trustworthy they are, the company said.

Trustworthiness is based on a recent survey of US Facebook users that gauged their familiarity with, and trust in, the news sources that appear in their news feeds.

The results will inform how the company ranks news sources in the news feed, the stream of updates people see when they log in. News sources should also be “informative” and relevant to people’s local communities, the company said in a statement.

The approach is intended to help Facebook avoid perceptions of bias in selecting which news providers to highlight.

“The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division,” CEO Mark Zuckerberg wrote in a Facebook post on Friday.
“We could try to make that decision ourselves, but that’s not something we’re comfortable with.”

Publishers expressed concern about the news feed changes announced last week because many news sites have come to depend on the traffic Facebook sends them.

On Friday, Zuckerberg said he expects news to make up roughly 4 percent of the news feed, down from roughly 5 percent today.

“This is a big change, but news will always be a critical way for people to start conversations on important topics,” he added.

Facebook, which has faced persistent criticism over the spread of fake news on its service, recently said it will reduce the amount of content from brands and other company pages – including those run by publishers – in the news feed. That move refocuses the company on content from friends and family members, taking Facebook back to its roots, but it could mean less time spent on the site, Zuckerberg said last week.

The social network has struggled to manage its role as one of the world’s most powerful news distributors.

Ahead of the US presidential election in 2016, Facebook was criticized for bias because its human curators of a “Trending Topics” section were only allowed to pick links from a set of sources Facebook designated as trusted, which excluded some conservative sites.
Since then, the company has taken measures to address the spread of fake news while trying to avoid being the arbiter of what is true or false.

It works with third-party fact-checkers who review articles flagged by users as potentially false or misleading, though those efforts have had little impact on the overall problem.
