Adventures in algorithms

This week I went back to Netflix to see what suggestions were waiting for me. Below was the challenge they threw down: show us what you like, choose some films, and we’ll help you find series and movies that you like.

[Screenshot: Netflix’s prompt inviting me to choose films for suggestions]

I had a look at their options and chose these titles:

[Screenshot: the titles I chose]

Their suggested films were:

[Screenshot: Netflix’s suggested films]

In the initial selection, the choices were films I had recently watched and rated, along with other films closely related to the genres I had been watching.

What was interesting was that I wasn’t offered a broader range of films. I was seemingly typecast :-) into liking rom-coms, superhero-based action films, and documentaries on drug traffickers. I didn’t have a mixture of different genres to choose from. My choices, right from the beginning, were limited to recent viewing folly.

I tried to cheat the system. I had a look at the options that were being presented and chose children’s cinema and horror. My selection looked like this:

[Screenshot: my selection of children’s cinema and horror titles]

Netflix suggested these titles:

[Screenshot: Netflix’s suggested titles]

Who knows where they got Drop Dead Diva from… Possibly my wife’s viewing habits…

I guess the point here is that I have been placed in a filter bubble, with the suggestions based on my recent viewing habits. Harmless here, but what if it was a political view being reinforced? What if it was a perspective on a current affair or report on a certain issue I may not have heard about?
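
To make the narrowing concrete, here is a minimal sketch of the kind of content-based, genre-overlap scoring that could produce a bubble like this. It is not Netflix’s actual recommender, which is far more sophisticated; the titles and genre tags are invented for illustration. The point is simply that if candidates are scored only by overlap with what you already watch, anything outside your existing genres scores zero and never surfaces.

```python
# A toy genre-overlap recommender, NOT Netflix's actual algorithm.
# Film titles and genre tags are made up purely for illustration.
from collections import Counter

watched = {
    "RomCom A": {"romantic-comedy"},
    "Superhero B": {"action", "superhero"},
    "Cartel Doc C": {"documentary", "crime"},
}

catalogue = {
    "RomCom D": {"romantic-comedy"},
    "Superhero E": {"action", "superhero"},
    "Cartel Doc F": {"documentary", "crime"},
    "Period Drama G": {"drama", "history"},
    "World Cinema H": {"drama", "world-cinema"},
}

# Build a genre profile from viewing history: how often each genre appears.
profile = Counter(g for genres in watched.values() for g in genres)

def score(genres):
    """Score a candidate purely by overlap with the viewer's genre profile."""
    return sum(profile[g] for g in genres)

# Rank the catalogue: anything outside the watched genres scores zero,
# so it never surfaces -- the filter bubble in miniature.
for title, genres in sorted(catalogue.items(), key=lambda kv: score(kv[1]), reverse=True):
    print(f"{score(genres):>2}  {title}")
```

Running this, the period drama and world cinema titles sit at the bottom with a score of zero, no matter how good they are; the loop only ever feeds back more of the same.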

Applying this in Education

Specialization is a wonderful thing. The deeper investigation into one area of study is something that is usually undertaken at Master’s or Doctorate level. I fear that allowing learners to choose their own bespoke route of study, informed by algorithms, could lead to early specialization in a certain area and the loss of that rich, transversal, cross-curricular blend that studying multiple disciplines creates. Allowing algorithms to suggest or guide our educational journey could create similar outcomes in our learners, ones that perhaps do not support diversity.

4 thoughts on “Adventures in algorithms”

  1. Pingback: Week 9: learning analytics and calculating academics | MSc in Digital Education at the University of Edinburgh

  2. PJ Fameli

    Martyn, thx for a concise, straightforward and instructive algorithmic play. I have been delaying signing up for Netflix for the past couple of years, as I have clung to my portable DVD player. But now that you have educated me on what to watch out for, I will probably break down and start watching Scandinavian films on it. Cheers, PJ

  3. Jeremy Knox

    This is a super outline of your algorithmic play, Martyn, and fascinating to see the Netflix recommendation system at work.

    ‘My choices, right from the beginning, were limited to recent viewing folly’

    Yes, really interesting, particularly as Netflix seem to claim a somewhat more sophisticated algorithm, and also highlight the differences between what people say they watch and what they actually watch (http://www.wired.com/2013/08/qq_netflix-algorithm/). Where is the scope to be surprised by something new in your recommended films? This is the crux of the ‘you loop’ issue.

    ‘Harmless here, but what if it was a political view being reinforced? What if it was a perspective on a current affair or report on a certain issue I may not have heard about?’

    Great point here. While one might argue that Hollywood films are relatively simple to categorise into genre, there is a problem here between how the algorithm might classify something and how any one individual might interpret that area, isn’t there? So, staying with the film example, can we assume that people actually like films according to genre in all cases? Don’t people have more nuanced interpretations of films, and of what they might like about them? For example, I think The Shining (http://en.wikipedia.org/wiki/The_Shining_%28film%29) is probably one of my favourite films, but I wouldn’t say I’m a fan of the ‘horror’ genre. There are other things I like about the film, and I’m not sure I can even fully articulate what those things are and *why* I like it – so can an algorithm really do that for me?

    That relates to your point here, because there are perhaps more serious implications around the kind of knowledge we have access to through algorithms (Google Search being a prime example). So, my argument would be that when we try to model the ‘likes’ and ‘dislikes’ of people, it is not only done in a hugely simplistic way, but the results also feed back formative ideas about our selves that are quite powerful (Oh, so I must be a horror fan!). Rather than think about what it is we like about films, the algorithm gives us an answer. When one thinks about this in terms of politics, as you say, the implications may be pretty significant.

    You also raise a really good point about the automated ‘personalisation’ of education and the loss of diversity. The assumed transparency and authenticity of the algorithm is central here again, I think: if an algorithm tells you that you are a ‘visual learner’ and that you prefer mathematics-related subjects, isn’t that a pretty powerful message to somebody who might be feeling pressured to find direction and focus in their choice of educational pathway?

    1. Martyn Post author

      Looking through Rebecca Eynon’s paper on the rise of big data, I found that she really highlights the more concerning issues related to algorithms and their application. When she writes about using data to predict whether or not a university student will drop out, and whether universities should accept them on that basis, these issues are really tough ethically for me. Also, tracking a student’s progress throughout school, unless it is used for remedial work and support, could be damning and condemning if put to other means. The tracking of progress could give way to a kind of modern-day 11-plus exam. One quote that really stood out was:

      “What happens to serendipity in a system where all educational choices are based on recommender systems?” (Eynon, 2013, p. 238)

      I think this really echoes the points you were making about Netflix above, Jeremy, and, as much as I love structured studying, I think stumbling upon ideas and concepts through education is also really important.

      Eynon, R. (2013). The rise of Big Data: what does it mean for education, technology, and media research? Learning, Media and Technology, 38(3), 237–240. DOI: 10.1080/17439884.2013.771783
