Facebook’s news feed is probably the most-used feature of the social network. It organises posts, photos, links and advertisements from your friends and the pages you follow into a single stream of news. But lately we’ve seen the news feed making headlines of its own.
In August, users and journalists began to question Facebook’s news feed after noticing a scarcity of links and posts about the death of Michael Brown and the subsequent protests in Ferguson, Missouri.
Facebook also announced changes to the news feed to decrease the visibility of clickbait-style headlines. These are headlines that attempt to lure visitors to a webpage with an intriguing but uninformative preview, and Facebook provided a typical example in its announcement.
Facebook says it will be tracking the amount of time that users spend on a website after clicking such a link, and penalising the publishers of links that fail to hold readers’ attention.
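Facebook has not published the details of how this works, so the following Python sketch is purely illustrative: it assumes a hypothetical per-link record of “dwell times” (how long readers stayed on the destination page after clicking through) and derives a penalty that down-weights links whose readers bounce back quickly.

```python
# Illustrative sketch only: Facebook has not disclosed its actual method.
# We assume hypothetical per-link dwell-time data, in seconds, for readers
# who clicked through from the feed.

from statistics import median

def clickbait_penalty(dwell_times_seconds, threshold=15.0):
    """Return a multiplier between 0 and 1 that down-weights links whose
    readers tend to bounce back quickly, a rough proxy for clickbait."""
    if not dwell_times_seconds:
        return 1.0  # no click data yet, so no penalty
    typical_dwell = median(dwell_times_seconds)
    if typical_dwell >= threshold:
        return 1.0  # readers stayed, treat the headline as informative
    # Scale the penalty with how far short of the threshold readers fall.
    return max(0.1, typical_dwell / threshold)

# Example: a post whose readers mostly leave within a few seconds
# has its (hypothetical) base score cut sharply.
score = 0.8 * clickbait_penalty([3.2, 5.0, 2.1, 40.0, 4.4])
```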
In June, Facebook faced criticism after the publication of research findings from an “emotional contagion” experiment that manipulated the news feeds of almost 700,000 users. The study raised ethical concerns among both Facebook users and outside observers.
Given how little we understand of Facebook’s internal affairs and the machinations of the news feed’s filter algorithms, the growing public concern around Facebook’s operations is understandable.
Why do the algorithms matter?
As users, our readiness to trust Facebook as a hub for social, professional and familial interactions, as well as a source for following and discussing news, has afforded the company a privileged position as an intermediary in our social and political lives.
Twitter CEO Dick Costolo’s announcement that Twitter would censor user-uploaded images of American journalist James Foley’s execution is a timely reminder of the many roles of social networking platforms.
These platforms and their operators do not simply present data and human interaction in a neutral way — they also make editorial judgements about the kinds of data and interaction they want to facilitate.
This should lead us to question the ways in which Facebook’s role as an intermediary for our information and social connections allows its operators to influence its users.
Why does Facebook need algorithms to sort the news?
One of the most common responses to criticism of the news feed is the suggestion that Facebook do away with sorting entirely and simply show everything chronologically, just like Twitter.
Showing everything can make the news feed seem a bit more like a news firehose. Facebook engineers estimate that the average user’s news feed would show around 1,500 new posts each day.
The “firehose model” is not without its own issues. By showing all posts as they happen, Twitter’s approach tends to favour the users who post most often, letting the noisiest users drown out other worthy voices.
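Neither Facebook nor Twitter has disclosed its ranking formula, but the trade-off can be sketched with toy code: a purely chronological feed surfaces whoever posted most recently (and most often), while even a crude relevance score based on assumed engagement signals lifts an infrequent poster’s widely discussed post to the top.

```python
# Toy comparison of a chronological "firehose" feed with a simple ranked
# feed. The signals and weights are illustrative assumptions, not
# Facebook's or Twitter's actual algorithms.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    hours_old: float
    likes: int
    comments: int

def chronological(posts):
    """Firehose model: newest first, regardless of who posted or how often."""
    return sorted(posts, key=lambda p: p.hours_old)

def ranked(posts):
    """Rank by a crude relevance score: engagement, discounted by age."""
    def score(p):
        engagement = p.likes + 2 * p.comments   # assume comments count double
        return engagement / (1.0 + p.hours_old)  # older posts decay
    return sorted(posts, key=score, reverse=True)

feed = [
    Post("chatty_friend", 0.5, 1, 0),   # posts constantly, little response
    Post("chatty_friend", 1.0, 0, 0),
    Post("quiet_friend", 6.0, 40, 12),  # rare post with lots of discussion
]

# Chronologically, the quiet friend's post sits at the bottom of the feed;
# ranked by engagement, it rises to the top.
print([p.author for p in chronological(feed)])
print([p.author for p in ranked(feed)])
```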
This concern may have influenced Twitter’s recent changes to show tweets favourited by other followers in a user’s timeline, and its apparent readiness to experiment with algorithmic changes to its users’ timelines.
Algorithmic filtering may well be helpful given the amount of information we deal with day to day, but the unexplained “black box” nature of most algorithmic systems can be a headache too.
Changes to Facebook’s algorithms can dramatically affect the traffic some websites receive, much to the chagrin of their publishers. Publishers who have registered with Facebook receive only basic metrics on how many users have seen their posts. Individual users receive even less feedback on how widely (if at all) their posts have been seen.
These algorithms are ostensibly created by the developers of Facebook and Twitter in service of creating a better experience for their users (both individual and corporate).
But social platforms have a vested interest in keeping users engaged with their service. We must recognise that these interests can shape the development of the platform and its functions.
A social network’s filtering may be biased against showing content that engineers have deemed controversial or potentially upsetting, in order to help users enjoy the network. These filters could stop you from seeing a post that would have upset you, but they might also limit the visibility of a cry for help from someone in need.
Are there antidotes to algorithms?
If users are concerned by the choices that a social media platform seems to be making, they can demand a greater degree of transparency. That being said, these systems can be complex. According to Facebook, more than 100,000 different variables are factored into the news feed algorithms.
Another option might be regulation: subjecting sufficiently large technology companies and their social algorithms to regular independent auditing, similar to the regulation of algorithmic financial trading.
Alternatively, users could use the platform in unintended ways or learn to subvert and scam the system to their own advantage.
Users could also lessen their usage of Facebook and seek a less-filtered stream of news and information from a variety of other sources to suit their needs.
For better or worse, algorithmic filtering will likely become a staple of our data-fuelled, internet-mediated lives, but in time we may also see services that give users more direct control over the algorithms that govern what they get to see.
Andrew Quodling does not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article, and has no relevant affiliations.
This article was originally published on The Conversation. Read the original article.