
The Ethics of Social Media News Feeds

We’ve come to expect a more personalised experience in almost all of our dealings online. Whether it be online shopping or interacting with a customer service representative, we expect corporations to know who we are, what we’ve purchased in the past, and what our preferences are. Customer relationship management (CRM) systems are prevalent throughout industry and are designed to personalise the experience customers have with an organisation – it’s all about engaging them at the right time, through the right channel, with the right information.

And it’s probably no surprise to most of us that Facebook – the most popular social network worldwide – customises the information we see. But the company also ran into a spot of trouble a few years ago when it manipulated people’s news feeds by making either negative or positive posts more prominent. So where do we draw the line? How do we, and society in general, decide what is ethically acceptable and what isn’t? This isn’t a question that’s easily answered – nor is there necessarily a ‘correct’ answer – but it’s one that’s continually being raised, particularly as this control over information is constantly being optimised and adopted by more organisations (as is now the case with Instagram).

This week, we’ll be taking a look at the ethics of personalised news feeds and information manipulation. We’ll delve into how and why companies do this, and invite you to make your own judgement about whether or not you think this is right.

First, we’ll take a look at one example that most of us are likely familiar with – Facebook’s news feed. Facebook introduced what we know as the news feed in 2006, with the ‘like’ button arriving on the scene three years later. According to internal research conducted by Facebook, the average user has around 1,500 new posts (updates, photos, and other content) waiting for them since their last visit. (For those with a large friend network and numerous subscriptions, this figure could reach 15,000.) As a result, one of Facebook’s key tasks is to ‘curate’ the news feed to show the most interesting updates first – i.e., the content of most interest to the individual, based on the friends they interact with the most. This ranking is influenced by a variety of factors, such as the number of likes and comments a post has received and what the individual has clicked on in the past, and these factors are constantly being refined and recalibrated in Facebook’s algorithm. Currently, approximately 300 posts out of the 1,500 are chosen for an individual’s news feed.
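To make the ranking idea concrete, here is a minimal Python sketch of what a curation step like this could look like. It is not Facebook’s actual algorithm: the signals, weights, and field names below are invented for illustration, loosely following the description above (engagement, affinity with the author, recency, and a 300-post cut-off).

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    comments: int
    age_hours: float

def score(post: Post, affinity: dict[str, float]) -> float:
    """Toy relevance score: combine engagement signals with how often the
    reader interacts with the post's author, and decay older posts.
    The weights here are invented purely for illustration."""
    engagement = post.likes + 2.0 * post.comments
    friend_affinity = affinity.get(post.author, 0.1)  # default: a friend the reader rarely interacts with
    time_decay = 1.0 / (1.0 + post.age_hours)
    return engagement * friend_affinity * time_decay

def curate_feed(candidates: list[Post], affinity: dict[str, float], k: int = 300) -> list[Post]:
    """Keep only the k highest-scoring posts out of the ~1,500 candidates."""
    return sorted(candidates, key=lambda p: score(p, affinity), reverse=True)[:k]

# Example: a close friend's fresh post outranks a stale but popular one.
reader_affinity = {"alice": 0.9, "bob": 0.2}
candidates = [
    Post("alice", likes=3, comments=1, age_hours=2.0),
    Post("bob", likes=40, comments=10, age_hours=30.0),
]
print([p.author for p in curate_feed(candidates, reader_affinity, k=1)])  # ['alice']
```

The point is simply that a handful of numeric signals, combined and sorted, decide which few hundred of the candidate posts a person ever sees.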


Figure 1 – PostRocket’s infographic illustrating certain data points used by Facebook (image sourced from Ref. 3)

Facebook’s algorithm for this ordering has been referred to as “one of the world’s most influential, controversial, and misunderstood algorithms”. So why is the process by which our news feed is curated and ranked so important? Essentially, with more than 1 billion daily active users, the results of the ranking process have the potential to influence the reading habits and social lives of approximately one-fifth of the world’s adult population. The algorithm also has flow-on effects on human behaviour and action. Many sociologists argue that the way Facebook is structured motivates people to post feel-good stories – for example, at the time of the ALS ice bucket challenge, videos of the challenge often took up more of a person’s news feed than more challenging stories, such as the Ferguson shooting and protests. Zeynep Tufekci, a sociologist at the University of North Carolina, reiterates this, stating: “Things that are likable get more feedback… And because we adapt our behaviour over time, Facebook is, in a sense, influencing people to behave in a certain way as a result of their algorithm”.

This way of receiving news is very different from traditional media. Historically, media organisations have determined which news stories matter based on their own editorial judgement – generally guided by values such as trust, newsworthiness, and public interest. With Facebook, however, the process of determining which items appear prominently in the news feed is devoid of any editorial stamp – the ideal, rather, is to use an algorithm to sort the feed based on what each individual would like to see first. Even Google personalises search results based on previous searches and the links subsequently clicked on. More specifically, it factors in 57 data points before presenting search results back to you, e.g. your location, the computer being used, and the browser you are using to search. As a result, the same search query could produce different results for different people. This raises many ethical questions and concerns, which may be summed up in what has been described as ‘the filter bubble’.
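A toy sketch makes the mechanics easier to see. The example below is not Google’s real ranking; the two signals it uses (location and click history) and their weights are invented for illustration, but it shows how an identical set of results can come back in a different order for different people.

```python
# Toy illustration of why the same query can return different orderings:
# each result gets a personalisation boost from a few per-user signals.
# The signals and weights are invented; real search ranking reportedly
# draws on dozens of such data points.

def personalised_rank(results, user):
    """Re-rank a shared base result list using per-user signals."""
    def boost(result):
        b = 0.0
        if result["region"] == user["location"]:
            b += 0.3                       # local results float up
        if result["topic"] in user["click_history"]:
            b += 0.5                       # previously clicked topics float up
        return result["base_relevance"] + b
    return sorted(results, key=boost, reverse=True)

results = [
    {"title": "Gun control statistics", "topic": "policy", "region": "US", "base_relevance": 0.8},
    {"title": "Local hunting club",     "topic": "sport",  "region": "AU", "base_relevance": 0.7},
]
alice = {"location": "AU", "click_history": {"sport"}}
bob = {"location": "US", "click_history": {"policy"}}
print([r["title"] for r in personalised_rank(results, alice)])  # the sport/local result comes first
print([r["title"] for r in personalised_rank(results, bob)])    # the policy result comes first
```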


Figure 2 – Image of a Google search with the same input search term, ‘gun control’, but yielding different results (image sourced from Ref. 5)

This term was coined by Eli Pariser in 2011 to characterise the individual information ‘bubble’ of the personalised content we receive online. The filter bubble presents individuals with personalised content that reflects their own world view, whilst hiding information which does not align with it, or content which an algorithm determines may not be of interest. Pariser comments on the dangers of excessive personalisation, warning how these filters “serve up a kind of invisible autopropaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar and leaving us oblivious to the dangers lurking in the dark territory of the unknown”.

While this view may seem a bit extreme, it is important to keep in mind the natural human tendency to avoid information and opinions we don’t agree with. The issue with excessive personalisation is the compounding of this problem, leading individuals to be surrounded mostly by people they like and content they agree with. As such, there is growing concern that users may end up only seeing what they want to see, and not necessarily the content that is considered important. For example, Facebook uses an individual’s previous interactions to predict who, and what type of content, is of the most interest and importance to that individual. So, for argument’s sake, if you are a Conservative who mostly clicks on links from other Conservatives, you’re less likely to see updates from friends that post Liberal-related content. Essentially, this can reduce people’s exposure to diverse points of view, which would otherwise enrich their understanding of the world. Taking a pessimistic angle, this could subsequently lead to greater polarisation of views, narrowed perceptions, and people living in their own ideological bubble.
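A tiny simulation helps illustrate why this feedback loop narrows a feed over time. The numbers below (click probabilities, reinforcement step) are made up rather than measured from any real platform, but the dynamic is the one described above: each click on agreeable content makes that viewpoint slightly more prominent in the next feed, and the opposing viewpoint’s share steadily shrinks.

```python
import random

def simulate_bubble(rounds: int = 20, feed_size: int = 10, seed: int = 0) -> list[float]:
    """Toy filter-bubble loop: the reader clicks agreeable stories more often,
    each click raises that viewpoint's weight in the next feed, and the share
    of opposing content shrinks. All numbers are invented for illustration."""
    rng = random.Random(seed)
    weights = {"conservative": 0.5, "liberal": 0.5}      # initial 50/50 mix in the feed
    click_prob = {"conservative": 0.7, "liberal": 0.3}   # this reader leans conservative
    opposing_share = []
    for _ in range(rounds):
        feed = rng.choices(list(weights), weights=list(weights.values()), k=feed_size)
        for story in feed:
            if rng.random() < click_prob[story]:
                weights[story] += 0.05                   # engagement reinforces that viewpoint
        opposing_share.append(weights["liberal"] / sum(weights.values()))
    return opposing_share

print(simulate_bubble())  # the liberal share of the feed tends to drift downwards
```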


Figure 3 – Image illustrating the filter bubble’s potential to limit an individual’s exposure to contrasting, yet relevant points of view (image sourced from Ref. 7)

So is this personalisation ultimately a danger or a blessing? There are no easy answers, and the discussion continues, with both sides of the debate advancing valid arguments.

Personalisation aims to ensure the relevance of the information presented, while also helping to reduce information overload. With an abundance of information and content available, content providers seek to ensure that users receive the information they find most interesting, with the ultimate goal of ensuring they keep using the site.

However, those against excessive personalisation argue that what users have liked and viewed in the past should not dictate what they ‘should’ be seeing in their news feeds and search results. Some argue that computers and algorithms are no substitute for human curation. Critics would also like visibility into when websites are filtering content, as well as greater transparency about how that content is filtered and sorted. Currently, Facebook and Google do allow users to switch off some of their filters, though many people may not be aware of this.

In today’s post, we’ve only just scratched the surface of this topic, but it is one that will continue to be an important issue and source of debate. We’re interested to hear what you think, so let us know your thoughts by commenting below.


References

  1. Bozdag, V.E. and Timmermans, J.F.C., 2011, ‘Values in the Filter Bubble: Ethics of Personalization Algorithms in Cloud Computing’, 1st International Workshop on Values in Design – Building Bridges between RE, HCI and Ethics, Lisbon, Portugal, 6 September 2011
  2. Catone, J., 2011, ‘Why Web Personalization May Be Damaging Our World View’, Mashable Australia, accessed 25 April 2016, <http://mashable.com/2011/06/03/filters-eli-pariser/#Wi_MsGClYEqg>
  3. Dyer, P., 2013, ‘Understanding Facebook Edgerank Infographic’, Social Media Today, accessed 25 April 2016, <http://www.socialmediatoday.com/content/understanding-facebook-edgerank-infographic>
  4. Emerging Technology from the arXiv, 2013, ‘How to Burst the Filter Bubble that Protects Us from Opposing Views’, MIT Technology Review, accessed 25 April 2016, <https://www.technologyreview.com/s/522111/how-to-burst-the-filter-bubble-that-protects-us-from-opposing-views/>
  5. gHacks Technology News, photograph, viewed 24 April 2016, <http://cdn.ghacks.net/wp-content/uploads/2012/10/google-filter-bubble.jpg>
  6. Hutchinson, A., 2016, ‘How Facebook’s News Feed Works – As Explained by Facebook’, Social Media Today, accessed 25 April 2016, <http://www.socialmediatoday.com/social-networks/how-facebooks-news-feed-works-explained-facebook>
  7. Infinite Unknown, 2011, photograph, viewed 24 April 2016, <http://www.infiniteunknown.net/2011/06/23/facebook-filter-bubble-is-a-sinister-phenomenon/>
  8. Ingram, M., 2015, ‘Facebook ‘filter bubble’ study raises more questions than it answers’, Fortune, accessed 25 April 2016,<http://fortune.com/2015/05/07/facebook-filter-bubble-doubts/>
  9. Instagram, 2016, ‘See the Moments You Care About First’, Instagram, accessed 22 April 2016, <http://blog.instagram.com/post/141107034797/160315-news>
  10. Luckerson, V., 2015, ‘Here’s How Facebook’s News Feed Actually Works’, Time, accessed 23 April 2016, <http://time.com/3950525/facebook-news-feed-algorithm/>
  11. Morozov, E., 2011, ‘Book Review – The Filter Bubble by Eli Pariser’, The New York Times, accessed 25 April 2016, <http://www.nytimes.com/2011/06/12/books/review/book-review-the-filter-bubble-by-eli-pariser.html>
  12. Oremus, W., 2016, ‘How Facebook’s News Feed Algorithm Works’, Slate, accessed 23 April 2016, <http://www.slate.com/articles/technology/cover_story/2016/01/how_facebook_s_news_feed_algorithm_works.html>
  13. Popova, M., 2015, ‘The Filter Bubble: Algorithm vs. Curator & the Value of Serendipity’, BrainPickings, weblog, accessed 23 April 2016, <https://www.brainpickings.org/2011/05/12/the-filter-bubble/>
  14. RT News, 2015, ‘Lifehack: Take back control of your Facebook news feed in minutes with See First’, RT News, accessed 23 April 2016, <https://www.rt.com/news/310226-facebook-news-feed-prioritize/>
  15. Statista, 2016, ‘Leading social networks worldwide as of April 2016, ranked by number of active users (in millions)’, Statista, accessed 22 April 2016, <http://www.statista.com/statistics/272014/global-social-networks-ranked-by-number-of-users/>
  16. Weisberg, J., 2011, ‘Bubble Trouble – Is Web personalization turning us into solipsistic twits?’, Slate, accessed 25 April 2016, <http://www.slate.com/articles/news_and_politics/the_big_idea/2011/06/bubble_trouble.html>

2 thoughts on “The Ethics of Social Media News Feeds”

  1. Very interesting. I think it’s fine that private companies maximise revenue by keeping their users engaged, but I think they would have less of an issue putting people in filter bubbles if they broadened their metrics of “user engagement”. Click events like the Like button and clicking on a story are OK, but what about using metrics that track how long a user spends actively reading a page? Surely social media users will spend longer engaging with something that is unfamiliar but stimulating, compared to something familiar that they can Like quickly.

  2. Pingback: The Filter Bubble | The Mavericks
