Should We Be Wary of Algorithms?

Via @nprnews: Should We Be Wary of Algorithms? http://t.co/7Y9eu4mcwk #mscedc

[Screenshot: week_8_2]

 

TED has invited me to ‘share this talk and track [my] influence!’ How can I possibly resist? It is very clever of them, because now I want to find other TED offerings to share with my network. I want to Tweet them, to Pin them and embed them all over the social net. And then I want to look at the numbers.

[Screenshot: week_8_4]

But what is TED going to do with my numbers? I know they store information such as the Talk I shared, when I shared it and the number of people who followed the link, because they tell me that is what they will do. But what else could they be doing? They have my email address and my Twitter username. They entice me to complete my profile and add tags about myself.

[Screenshot: week_8_3]

I could give them a lot of information about myself, but I don’t need to, because they can probably work it out for themselves if Jennifer Golbeck is to be believed (see The curly fry conundrum: Why social media “likes” say more than you might think). There are algorithms which can accurately predict many of my traits based on my use of social media.
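Golbeck’s point is that nothing exotic is needed for this: a model is trained on users whose traits are already known (from surveys, say) and then applied to anyone whose ‘likes’ are visible. Here is a toy sketch of that general idea only; the data, the pages and the classifier are all invented for illustration, not Golbeck’s or TED’s actual method.

```python
# Toy illustration (invented data): predicting a trait from binary "like" signals.
# Each column is a hypothetical page a user has liked (1) or not (0); a simple
# classifier learns which combinations of likes correlate with the trait.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Rows = users, columns = pages they have liked.
likes = np.array([
    [1, 0, 1, 0],   # user 1
    [1, 1, 1, 0],   # user 2
    [0, 0, 0, 1],   # user 3
    [0, 1, 0, 1],   # user 4
])
# Invented labels for some trait reported in a survey (1 = has the trait).
trait = np.array([1, 1, 0, 0])

model = LogisticRegression().fit(likes, trait)

# Predict the trait for a new user from nothing but their likes.
new_user = np.array([[1, 0, 1, 1]])
print(model.predict_proba(new_user)[0, 1])  # estimated probability of the trait
```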

So back to my TED numbers. Why do they want them? Their Privacy Policy reassures me that it is all for my own good: “We may use this information to help customize your TED.com experience based on your previous activities on TED.com.” They also warn me that other social media outlets may use my data in other ways when I share via them. Although I believe them, I will resist completing my profile. I don’t want to end up in some ‘filter bubble’ where TED decides what I should be watching.
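For what it’s worth, the ‘filter bubble’ worry is easy to see even in a toy sketch: if every suggestion is drawn only from items whose tags overlap what you have already watched, the pool of recommendations shrinks toward your history. The talk titles, tags and scoring below are all invented, and this is not TED’s recommender.

```python
# Minimal sketch (invented catalogue) of content-based "customisation":
# recommendations come only from talks whose tags overlap the viewing history,
# so talks outside that history never get surfaced.
from collections import Counter

catalogue = {
    "Talk A": {"algorithms", "privacy"},
    "Talk B": {"privacy", "society"},
    "Talk C": {"art", "music"},
    "Talk D": {"algorithms", "finance"},
}

def recommend(watched, n=2):
    # Build a tag profile from everything already watched...
    profile = Counter(tag for talk in watched for tag in catalogue[talk])
    # ...then rank unwatched talks by how many profile tags they share.
    scores = {
        talk: sum(profile[tag] for tag in tags)
        for talk, tags in catalogue.items() if talk not in watched
    }
    ranked = sorted(scores.items(), key=lambda kv: -kv[1])
    return [talk for talk, score in ranked if score > 0][:n]

print(recommend({"Talk A"}))  # ['Talk B', 'Talk D'] -- "Talk C" never surfaces
```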

 

4 comments

  1. Jeremy Knox says:

    Really interesting example here Jin, and brilliant to link this directly to the Slavin TED Talk.

    One of the things you highlight here is the kind of trade that is happening with our personal data on the web. The ‘Share this talk and track your influence’ caption seems to acknowledge that there is nothing underhanded going on: we know that we’re giving away our personal data to algorithms, so the incentive is that we get something back to compensate. The more you give away, the more you supposedly gain from the loss; in this case the gain seems to be seeing how much attention you’ve attracted by sharing the video.

    The ‘I am…’ screenshot is fascinating, isn’t it? I wonder if you could say more here in terms of the ‘you loop’ idea. While that is quite a long list, what are the implications of selecting one of those options and saying ‘yes, I am this category’? Is it just about what information it might feed back in the future, or also something more about our own identity formation?

    So, is the solution to ‘resist’? That seems to imply that we get nothing in return, or at least that what we get in return is not as valuable as what we give away. I wonder if resistance could be interpreted as selfish in any way? For example, if these algorithms potentially improve information retrieval for ‘society’, then our non-participation would limit what these systems could achieve with more data, or at least the diversity of it.

  2. Jin Darling says:

    Hi Jeremy,
    “what are the implications of selecting one of those options, and saying ‘yes I am this category’.”
    I find the whole labelling thing really invasive; I don’t want some large organisation having this information about me. I want to control my data, and handing it over feels like I am giving them some of the control. I know this will be used to recommend talks to me, but I am more than capable of finding interesting stuff for myself; indeed, I like the serendipitous element of browsing randomly and following links (that’s how I found the MScDE course in the first place!). Life works well for me when I leave some things to chance.
    I can’t label myself; tomorrow I may feel differently. Tomorrow I may be interested in something else entirely. The labels they suggest are too personal. If they had suggested areas of interest I might have ticked some, or, even better, let me mark areas I am not interested in. I feel as if they are coercing me into giving them information through clever psychological tricks, suggesting I am shy, or that I want to know how much (or how little) influence I have with others.

    • Katherine says:

      Thanks for this Jen! I think it’s fascinating how they use emotional blackmail and peer pressure to make you create a profile. ‘Shy’ and ‘private’ are less of a criticism in ‘introverted’ cultures like the UK’s, but as Susan Cain remarked in her TED talk (and in her book Quiet), they are quite a slur in American culture, which prizes extraversion: The power of introverts.
