Facebook, news bias, and user trust

May 12, 2016: Facebook found itself in the crosshairs of conservatives this week when it was alleged that the social network, whose Trending Topics feed is curated by human editors, routinely suppressed stories from right-leaning outlets such as Breitbart.com.

Facebook spokesman Tom Stocky acknowledged the seriousness of the claims in a Facebook post, stating that the service, after an internal investigation, had “found no evidence that the anonymous allegations are true.” Comments on the post have been numerous (currently more than 500) and mostly negative.

Algorithms and User Trust

We live in an age in which society appears to place more faith in the ostensibly neutral properties of algorithms than in human judgments, which are by definition subjective and often flawed.

Laws prohibit many forms of discrimination when such acts are the product of human will. When an algorithm makes the same decision, however, we tend to give it a pass, because the algorithm is presumed to be merely reflecting an objective circumstance in the world: a social media ranking, a credit rating, or some other “neutral” metric treated as a reliable proxy for merit.

In this case, however, Facebook’s decisions about what to put in its Trending Topics area were a hybrid human-machine process. Human editors cherry-picked items suggested by Facebook’s algorithm, thereby putting the stamp of human responsibility on them. Facebook, in other words, behaved much like a traditional newspaper or magazine deciding what goes on its front page. This came as a surprise to many, given that Facebook’s own official explanation of how it selects Trending Topics made no mention of a human role:

Trending shows you a list of topics and hashtags that have recently spiked in popularity on Facebook. This list is personalized based on a number of factors, including Pages you’ve liked, your location and what’s trending across Facebook.
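To make the hybrid process concrete, here is a minimal, purely illustrative sketch of a human-in-the-loop curation pipeline, written in Python. Every name, score, and threshold below is a hypothetical stand-in; Facebook has not published its Trending Topics system at this level of detail.

```python
# Illustrative only: a toy two-stage curation pipeline.
# All names, scoring logic, and thresholds are hypothetical and do not
# reflect Facebook's actual Trending Topics implementation.
from dataclasses import dataclass


@dataclass
class Topic:
    name: str
    mentions_today: int
    mentions_baseline: float  # average daily mentions over a trailing window


def spike_score(topic: Topic) -> float:
    """How sharply a topic's volume has risen above its baseline."""
    return topic.mentions_today / max(topic.mentions_baseline, 1.0)


def algorithmic_candidates(topics: list[Topic], top_n: int = 10) -> list[Topic]:
    """Stage 1: the algorithm surfaces the topics with the biggest spikes."""
    return sorted(topics, key=spike_score, reverse=True)[:top_n]


def editorial_review(candidates: list[Topic], approve) -> list[Topic]:
    """Stage 2: a human editor accepts or rejects each suggestion.

    The `approve` callable stands in for human judgment -- the step that
    makes the final list a hybrid human-machine product rather than a
    pure algorithmic ranking.
    """
    return [t for t in candidates if approve(t)]


if __name__ == "__main__":
    topics = [
        Topic("election", 120_000, 40_000.0),
        Topic("sports-final", 90_000, 10_000.0),
        Topic("hoax-story", 80_000, 5_000.0),
    ]
    candidates = algorithmic_candidates(topics, top_n=3)
    # A trivial stand-in policy: reject anything an editor flags as a hoax.
    trending = editorial_review(candidates, approve=lambda t: "hoax" not in t.name)
    print([t.name for t in trending])
```

The point of the sketch is the division of labor: the ranking step is mechanical, but the published list exists only after a human accepts or rejects each suggestion, and that selection step is precisely what places editorial responsibility back on the company.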

Why This Matters

Social media’s role in the political process is coming under greater scrutiny now that these platforms serve as a primary news source for a growing share of users. As Pew reported last year:

Clear majorities of Twitter (63%) and Facebook users (63%) now say each platform serves as a source for news about events and issues outside the realm of friends and family. That share has increased substantially from 2013, when about half of users (52% of Twitter users, 47% of Facebook users) said they got news from the social platforms.

At the same time, however, it appears that many users have trust issues with the service. In an April 2016 poll conducted by the Huffington Post and YouGov, 28 percent of respondents said they do not trust Facebook “at all” with their personal data, and 34 percent said they trust it “not very much.” Only 3 percent said they completely trust Facebook with their personal data.

While Facebook’s current difficulties concern news bias rather than data security, the two issues are related. Trust is a generalized property that attaches to a brand and can be eroded by many different actions. If users become less trustful of what they see in Trending Topics, or in any other ostensibly “neutral,” algorithmically mediated content area, that distrust may spill over into other areas.

User trust goes to the heart of why people join Facebook in the first place, why they stay, and why they continue to engage and share on the service. In fact, Facebook specifically mentions trust among the Risk Factors that, as a public company, it must disclose. Its most recent quarterly report filed with the SEC states:

If people do not perceive our products to be useful, reliable, and trustworthy, we may not be able to attract or retain users or otherwise maintain or increase the frequency and duration of their engagement.

User trust, in other words, is central to Facebook’s fortunes. Without it, users will not be drawn to the service, will not stay on it, and will not share on it. Facebook knows this, and is likely aware that the current flap over news bias could harden into a more generalized distrust of the service.

This is a scenario Facebook cannot afford, and it will be interesting to watch in the coming weeks and months how far the company must go to restore trust in the information it provides to users, whether through algorithms, human curation, or other means.

(Update 5/12/2016: Facebook has published more details about how Trending Topics are selected in an official post by Justin Osofsky, Facebook’s Vice President of Global Operations. Mr. Osofsky’s post includes a list of the news sources that Facebook’s algorithm consults to determine which topics are trending, as well as the actual guidelines document its editorial team uses when selecting among the algorithmically surfaced suggestions.)
