Earlier this year, a troubling report came out in which former Facebook contractors claimed that they were given orders to manipulate the trending topics and articles that Facebook presents to its users. More specifically, these contractors claimed that Facebook wanted them not only to remove some political articles from conservative sources, but also to inject articles into the trending section even though those articles hadn’t generated enough interest to trend organically.
The idea that social media sites like Facebook and Twitter might be ‘curating’ their Trending Topics instead of letting them trend organically isn’t new. Personally, I’ve noticed that many topics and links seem to trend on Twitter without appearing to have the engagement levels they would need to trend organically. This seemed to intensify as we neared the election in November: I noticed multiple political articles from The New York Times trending almost daily, while no other news source could get even one political article to trend regularly. Other topics would be trending with only a dozen or so tweets, which seems impossible.
#TataLies @TwitterIndia @twitter how is this a trending topic with less than 10 tweets???
— Rakesh Kumar (@rakeshzin) December 8, 2016
All of this helps create the suspicion that maybe the ‘trending’ topics we are getting on social media sites are actually being curated for us not by the users, but by the people running those sites. Add to this issue the latest flak over ‘fake news’. Whenever you see the term ‘fake news’, it typically refers to websites that run stories making claims that either cannot be substantiated, or are ‘sourced’ to websites or organizations that don’t actually exist. Sometimes the claims are outright lies; all are designed to get clicks. Even this phenomenon can have grey areas. If a supposedly political website makes a bizarre claim about Trump that it sources to a fictional newspaper, that’s pretty clearly fake news. But if CNN takes a Trump quote from a rally out of context to present a point of view that it knows Trump didn’t intend, is that fake news? It can get murky sometimes to know what ‘news’ is news, and what is ‘fake’.
Recently, Facebook announced that it was going to start leaning on outside sources to help it decide what is and is not ‘fake news’. Facebook wants to first make it much easier for its users to flag and report news that they feel is ‘fake’. If an article gets enough flags, it will then be sent to an editorial board for review. Facebook has said that representatives from groups such as Snopes, the Associated Press, Politifact, and ABC News would then review the articles and decide whether or not they should be banned from Facebook.
This potentially creates a new problem: “Who checks the fact-checkers?” Many conservatives would argue that all of the sources listed above tend to lean toward the left in their political biases. Basically, seeing that the AP, ABC News, or Snopes will be helping Facebook decide what is and is not ‘fake news’ raises the same concerns for conservatives that it likely would for liberals if Fox News were doing the vetting.
All of this, whether it is ‘fake news’ or questions over trending topics, has created a bit of a trust problem for social media sites like Facebook and Twitter. It’s difficult, if not impossible, to tell how topics do or do not trend. A lack of understanding leads to a lack of trust, and right now, most of us have no idea how or why Twitter and Facebook decide what topics do or do not trend. Most of us assume that the topics the most people are talking about will be the topics that trend. Facebook and Twitter both attempt to tailor trends by taking into account what the people in your network are talking about. All of this is fine, in theory.
But if major social media sites like Twitter and Facebook want their users to trust the trending topics they show us, they need to do a better job of being transparent about how they arrived at that list. Now, more than ever, people are suspicious of ‘the media’ and are more likely to assume that information is being altered to further a particular slant, versus simply reporting the news and letting us decide. At this point I’m more worried about the validity of the ‘trending topics’ process than I am about the validity of the sources of information that are trending.
Sara Kubik says
Let’s add to this great article the aspect that users are either unaware of the tailoring of the content or do not know how to change it to be unbiased.
For example, how many Twitter users know their trends are Tailored Trends? How many know they can change this to be location specific? On a mobile device (the way the majority access the Internet now, according to Pew Research), this is hard to do.
The bigger issue here, which encompasses fake news, is the tailoring of what we see. Google tailors our searches and our social media newsfeeds are highly tailored.
Sometimes we want this. Sometimes not. The solution is to easily and *clearly* let the users decide. This is another aspect of transparency that your article hits upon.
Mack Collier says
Thank you Sara, great points. One of the things I really focused on during the past US election was how different media sources reported the same news in completely different ways. If you watched the same event being reported in real time on, say, CNN vs. Fox News, the differences in how the same information was being presented were glaring, and very concerning.
Then you add to that by going to Twitter and Facebook and noticing which terms are trending in association with the news, and which terms are curiously absent. The bottom line is that both Facebook and Twitter are publicly traded companies now, and both should be held to a higher level of transparency, not less.
I still want to know how The New York Times is able to get at least one political article daily to trend on Twitter, when no other news source on the planet can duplicate this. I’m surprised someone hasn’t done a case study on this.