Recap: Week 9

Week 9 has ended, and we have continued exploring the world of algorithms, turning our attention to their specific application in education: learning analytics.

It was fascinating to see what my colleagues came up with for last week’s exercise in exploring the algorithms used by internet companies to give their users individual suggestions and recommendations. Following up on my own explorations, I delved into some issues raised by Jeremy, especially concerning privacy implications, which to my mind are the most pervasive and pressing issue surrounding the growing use of algorithms in our lives.

Later in the week I turned my attention to Twitter, reading and replying to my colleagues’ tweets as well as participating in this week’s scheduled EDC tweetstorm.

One particular question we were asked was what we give to algorithms and what they give us, and I tweeted “We give them our history, they give us our future.” The statement was deliberately ambiguous. On the surface level, one could interpret it as us giving algorithms our search or viewing history, in return for which they recommend the sites or videos we will watch in the future, thus shaping that future. In my opinion, however, it goes even deeper. Algorithms that predict the weather draw not just on historical weather data but also on our current understanding of mathematics, systems dynamics and meteorology, all developed over time. Especially in light of artificial intelligence, it seems more and more likely to me that we will soon be passing the torch of knowledge creation on to entities that are not limited by physical and biological boundaries.

Considering the issues around learning analytics, I can see that in the future it will be able to help people considerably in their learning endeavours in a variety of ways. My prediction is that models based on biofeedback, such as heart rate, skin conductance, pupil dilation, blink frequency and brainwave measurements, will one day be used to guide students to maximise learning, perhaps by signalling ideal learning windows (see the sketch below). As previously mentioned, such massive tracking carries its own set of problems, particularly with regard to privacy and data security.
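Purely as an illustration of what such a “learning window” signal might look like, here is a hypothetical toy sketch in Python. The signals, baselines and threshold are all invented for the sake of the example; a real learning-analytics model would need validated measures, per-person calibration and far more careful modelling.

```python
import statistics

def learning_readiness(heart_rate_var, skin_conductance, blink_rate, baselines):
    """Combine normalised biofeedback signals into a single 0-1 'readiness' score.

    All signal choices and weightings here are invented for illustration only.
    """
    # Express each signal relative to the learner's own baseline.
    hrv_score = heart_rate_var / baselines["heart_rate_var"]          # higher HRV -> calmer
    scl_score = baselines["skin_conductance"] / max(skin_conductance, 1e-6)  # lower conductance -> calmer
    blink_score = baselines["blink_rate"] / max(blink_rate, 1e-6)     # lower blink rate -> more focused
    # Naive aggregation: mean of the capped ratios, rescaled to 0-1.
    capped = [min(s, 1.5) for s in (hrv_score, scl_score, blink_score)]
    return min(1.0, statistics.mean(capped) / 1.5)

# Hypothetical per-learner baselines and a single set of current readings.
baselines = {"heart_rate_var": 60.0, "skin_conductance": 5.0, "blink_rate": 15.0}
score = learning_readiness(heart_rate_var=70.0, skin_conductance=4.0,
                           blink_rate=12.0, baselines=baselines)

if score > 0.8:
    print(f"Good learning window (readiness {score:.2f}) - suggest starting a study session.")
else:
    print(f"Readiness {score:.2f} - maybe revisit later.")
```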

To round things off this week, I stumbled upon a fun little new game by Google that lets you play around with its autocomplete suggestion engine, scoring points for correctly guessing its guesses.

Comment on Exploring Algorithms by mkiseloski

Thanks Martyn and Jeremy for your comments!

@ Martyn
Thank you, I will check out Ghostery. It seems like one more useful tool to protect privacy on the internet.

@ Jeremy
You raise some very interesting points here! I wholeheartedly agree that the issues of privacy need to be taken much more seriously in our public discourse, regardless of whether our data are collected by a government entity or a private business. You are right that pieces of information like “likes Latin American music” or “likes winter sports” seem rather inconspicuous on their own, but the point is that the more such seemingly useless factoids are merged, the more they combine into a stunningly accurate profile. This reminds me of how, after the Snowden leaks, people tried to justify the warrantless NSA surveillance programs by citing that they only collected metadata, when in fact metadata (whom you talked to, when you were in which place, where you used your credit card and for how much, where you went regularly, who was with you at those times, etc.) can potentially paint a much more accurate picture of a person than the content of their phone calls.
I read recently that Uber could easily infer from its user data how likely someone was to be having an affair (and where) simply by looking at the ride patterns of people who regularly travel somewhere in the evening and return home in the early hours of the morning. Knowing that a private company can so easily obtain such sensitive information feels quite unsettling to me.
I think that algorithms first discover identity, but as they become more aware of your existing identity and form a filter bubble around you, they tend to influence you with their suggestions, possibly shaping your identity as you interact with their services more and more. In any case, I think that privacy has to be protected if we want to live in a free society.

from Comments for Mihael’s EDC blog http://edc15.education.ed.ac.uk/mkiseloski/2015/03/07/exploring-algorithms/#comment-609
via IFTTT

Recap: Week 8

The first week of block 3 on algorithmic cultures has ended and, as expected, it was a very informative experience delving into the depths of the topic.

To finish up last week’s ethnographic song I started this week’s lifestream with some more detailed observations about the songwriting MOOC’s community culture.

Next, I dove right into this week’s topic of algorithms with a fantastic talk by Kevin Slavin on how algorithms shape our world. Slavin shows the fascinating world of financial algorithms that have largely taken over control of the global financial system in the last decade. Expanding on this trend, I can wholeheartedly recommend last year’s bestselling book “Flash Boys” by Michael Lewis.

Staying on the topic of algorithms and the financial sector, I started to explore cryptocurrencies such as Bitcoin, which is based on the blockchain: a cryptographically secured, trustless accounting ledger that many people see as one of the biggest revolutions of the 21st century. Building on the principle of the blockchain, developers have already started to expand the framework beyond currencies to include smart contracts, financial markets, property rights and e-government. A truly fascinating Harvard lecture on Ethereum, the platform on which these blockchain contracts are executed, shows both the dangers and the possibilities of these brand new developments.
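To make the core idea of the blockchain a little more concrete, here is a minimal sketch in Python of a hash-linked ledger. It is purely illustrative and ignores mining, digital signatures and consensus entirely, so it is not how Bitcoin or Ethereum actually work, but it shows why each block committing to its predecessor’s hash makes tampering with an earlier entry detectable.

```python
import hashlib
import json
import time

def hash_block(block):
    """Hash the block's contents deterministically with SHA-256."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(data, previous_hash):
    """Create a new block that commits to the previous block's hash."""
    return {"timestamp": time.time(), "data": data, "previous_hash": previous_hash}

# Build a tiny chain of three blocks (the transactions are made up).
genesis = make_block("genesis", previous_hash="0" * 64)
block1 = make_block("Alice pays Bob 5", previous_hash=hash_block(genesis))
block2 = make_block("Bob pays Carol 2", previous_hash=hash_block(block1))

# Verification: each block must reference the hash of its predecessor.
chain = [genesis, block1, block2]
for prev, curr in zip(chain, chain[1:]):
    assert curr["previous_hash"] == hash_block(prev)

# Tampering with an earlier block changes its hash and breaks the link.
genesis["data"] = "genesis (forged)"
print(block1["previous_hash"] == hash_block(genesis))  # False: the chain no longer verifies
```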

The following day I once more turned my attention to this exceptionally well-written and thoroughly researched longform article about artificial intelligence: an incredibly long, but nonetheless fascinating, thought-provoking and even frightening elaboration on the implications of artificial superintelligence. Expanding on that, I posted a video by Nick Bostrom, one of the more prominent thinkers in this field.

Finally, I posted my findings on this week’s task of exploring and playing around with the algorithms employed by technology companies in their web offerings. I used the YouTube recommendation engine to examine this topic, finding plenty of evidence for the often-cited “filter bubble”.

It will be interesting to see next week how algorithms can be used to specifically improve education outcomes.