This week I went back to Netflix to see what suggestions were waiting for me. Below was the challenge they threw down: show us what you like, choose some films, and we'll help you find series and movies you'll enjoy.

I had a look at their options and chose these titles:

Their suggested films were:

In the initial section, the choices were films I had recently watched and rated, plus other films closely related to the genres I had been watching.
What was interesting was that I wasn't offered a broader range of films. I was seemingly typecast
as liking rom-coms, superhero-based action films, and documentaries on drug traffickers. I didn't have a mixture of different genres to choose from. My choices, right from the beginning, were limited by what I had recently watched.
I tried to cheat the system. I had a look at the options being presented and chose children's cinema and horror. My selection looked like this:

Netflix suggested these titles:

Who knows where they got Drop Dead Diva from… Possibly my wife's viewing habits…
I guess the point here is that I have been placed in a filter bubble, with the suggestions based on my recent viewing habits. Harmless here, but what if it were a political view being reinforced? What if it were a perspective on a current affair, or a report on an issue I had not yet heard about?
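The mechanism behind a filter bubble like this is easy to sketch. The toy Python below (all titles, genres, and the scoring rule are invented for illustration; Netflix's real system is far more sophisticated) shows a naive content-based recommender that only ever draws candidates from the genres a viewer has recently watched, so everything else silently drops out of view:

```python
from collections import Counter

# Hypothetical catalogue: title -> set of genres (names are illustrative only).
CATALOGUE = {
    "Rom-Com A": {"rom-com"},
    "Superhero Film B": {"action", "superhero"},
    "Narco Documentary C": {"documentary", "crime"},
    "Period Drama D": {"drama", "historical"},
    "Sci-Fi E": {"sci-fi"},
    "Horror F": {"horror"},
    "Rom-Com G": {"rom-com"},
}

def naive_recommend(history, catalogue, top_genres=3):
    """Suggest only unwatched titles that share the viewer's most-watched genres.

    Anything outside those genres never even enters the candidate pool,
    so recent habits keep reinforcing themselves: a filter bubble.
    """
    genre_counts = Counter(g for title in history for g in catalogue[title])
    favourites = {g for g, _ in genre_counts.most_common(top_genres)}
    return [title for title, genres in catalogue.items()
            if title not in history and genres & favourites]

history = ["Rom-Com A", "Superhero Film B", "Narco Documentary C"]
print(naive_recommend(history, CATALOGUE))
```

On this history the only suggestion is another rom-com; the period drama, sci-fi, and horror titles are never considered, however good they might be. The narrowing comes not from any one recommendation but from the candidate filter itself.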
Applying this in Education
Specialization is a wonderful thing. Deep investigation into one area of study is usually undertaken at Master's or Doctorate level. I fear that allowing learners to choose their own bespoke route of study, informed by algorithms, could lead to early specialization in one area and the loss of that rich, transversal, cross-curricular blend that studying multiple disciplines creates. Allowing algorithms to suggest or guide our educational journey could create similar outcomes in our learners, outcomes that perhaps do not support diversity.