<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	>
<channel>
	<title>Comments on: Should We Be Wary of Algorithms?</title>
	<atom:link href="https://edc15.education.ed.ac.uk/jdarling/2015/03/07/via-nprnews-should-we-be-wary-of-algorithms-httpt-co7y9eu4mcwk-mscedc/feed/" rel="self" type="application/rss+xml" />
	<link>https://edc15.education.ed.ac.uk/jdarling/2015/03/07/via-nprnews-should-we-be-wary-of-algorithms-httpt-co7y9eu4mcwk-mscedc/</link>
	<description>Just another Education and digital culture 2015 site</description>
	<lastBuildDate>Fri, 03 Apr 2015 15:22:50 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=4.2.38</generator>
	<item>
		<title>By: Katherine</title>
		<link>https://edc15.education.ed.ac.uk/jdarling/2015/03/07/via-nprnews-should-we-be-wary-of-algorithms-httpt-co7y9eu4mcwk-mscedc/#comment-262</link>
		<dc:creator><![CDATA[Katherine]]></dc:creator>
		<pubDate>Sun, 15 Mar 2015 04:23:29 +0000</pubDate>
		<guid isPermaLink="false">http://edc15.education.ed.ac.uk/jdarling/?p=586#comment-262</guid>
		<description><![CDATA[Thanks for this Jen! I think it&#039;s fascinating how they use emotional blackmail and peer pressure to make you create a profile. &#039;Shy&#039; and &#039;private&#039; are less of a criticism in &#039;introverted&#039; cultures like the UK&#039;s, but as Susan Cain remarked in her TED talk (and in her book Quiet), it is quite a slur in American culture, which prizes extraversion: &lt;a href=&quot;http://www.ted.com/talks/susan_cain_the_power_of_introverts?language=en&quot; rel=&quot;nofollow&quot;&gt;The power of introverts&lt;/a&gt;.]]></description>
		<content:encoded><![CDATA[<p>Thanks for this Jen! I think it&#8217;s fascinating how they use emotional blackmail and peer pressure to make you create a profile. &#8216;Shy&#8217; and &#8216;private&#8217; are less of a criticism in &#8216;introverted&#8217; cultures like the UK&#8217;s, but as Susan Cain remarked in her TED talk (and in her book Quiet), it is quite a slur in American culture, which prizes extraversion: <a href="http://www.ted.com/talks/susan_cain_the_power_of_introverts?language=en" rel="nofollow">The power of introverts</a>.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Jin Darling</title>
		<link>https://edc15.education.ed.ac.uk/jdarling/2015/03/07/via-nprnews-should-we-be-wary-of-algorithms-httpt-co7y9eu4mcwk-mscedc/#comment-181</link>
		<dc:creator><![CDATA[Jin Darling]]></dc:creator>
		<pubDate>Tue, 10 Mar 2015 17:35:08 +0000</pubDate>
		<guid isPermaLink="false">http://edc15.education.ed.ac.uk/jdarling/?p=586#comment-181</guid>
		<description><![CDATA[Hi Jeremy,  
&quot;what are the implications of selecting one of those options, and saying ‘yes I am this category’.&quot;
I find the whole labelling thing really invasive; I don&#039;t want some large organisation having this information about me. I want to control my data, and handing it over feels like I am giving them some of the control.  I know this will be used to recommend talks to me, but I am more than capable of finding interesting stuff for myself; indeed, I like the serendipitous element of browsing randomly and following links (that&#039;s how I found the MScDE course in the first place!). Life works well for me when I leave some things to chance.   
I can&#039;t label myself - tomorrow I may feel differently. Tomorrow I may be interested in something else entirely.  The labels they suggest are too personal; if they had suggested areas of interest, I might have ticked some, or, even better, let me indicate areas I am not interested in.  I feel as if they are coercing me into giving them information by using clever psychological tricks, suggesting I am shy or that I want to know how much or how little influence I have over others.]]></description>
		<content:encoded><![CDATA[<p>Hi Jeremy,<br />
&#8220;what are the implications of selecting one of those options, and saying ‘yes I am this category’.&#8221;<br />
I find the whole labelling thing really invasive; I don&#8217;t want some large organisation having this information about me. I want to control my data, and handing it over feels like I am giving them some of the control.  I know this will be used to recommend talks to me, but I am more than capable of finding interesting stuff for myself; indeed, I like the serendipitous element of browsing randomly and following links (that&#8217;s how I found the MScDE course in the first place!). Life works well for me when I leave some things to chance.<br />
I can&#8217;t label myself &#8211; tomorrow I may feel differently. Tomorrow I may be interested in something else entirely.  The labels they suggest are too personal; if they had suggested areas of interest, I might have ticked some, or, even better, let me indicate areas I am not interested in.  I feel as if they are coercing me into giving them information by using clever psychological tricks, suggesting I am shy or that I want to know how much or how little influence I have over others.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Jeremy Knox</title>
		<link>https://edc15.education.ed.ac.uk/jdarling/2015/03/07/via-nprnews-should-we-be-wary-of-algorithms-httpt-co7y9eu4mcwk-mscedc/#comment-180</link>
		<dc:creator><![CDATA[Jeremy Knox]]></dc:creator>
		<pubDate>Tue, 10 Mar 2015 14:56:48 +0000</pubDate>
		<guid isPermaLink="false">http://edc15.education.ed.ac.uk/jdarling/?p=586#comment-180</guid>
		<description><![CDATA[Really interesting example here Jin, and brilliant to link this directly to the Slavin TED talk. 

One of the things you highlight here is the kind of trade that is happening with our personal data on the web. The &#039;Share this talk and track your influence&#039; caption seems to acknowledge that this isn&#039;t anything underhanded - we know that we&#039;re giving away our personal data to algorithms, so the incentive is that we get something back in compensation. The more you give away, the more you supposedly gain from the loss - in this case the gain seems to be seeing how much attention you&#039;ve earned by sharing the video. 

The &#039;I am...&#039; screenshot is fascinating, isn&#039;t it? I wonder if you could say more here in terms of the &#039;you loop&#039; idea. While that is quite a long list, what are the implications of selecting one of those options and saying &#039;yes I am this category&#039;? Is it just about what information it might feed back in the future, or is it also something more about our own identity formation?

So, is the solution to &#039;resist&#039;? That seems to imply that we get nothing in return, or at least that what we get in return is not as valuable as what we give away. I wonder if resistance could be interpreted as selfish in any way? For example, if these algorithms potentially improve information retrieval for &#039;society&#039;, then our non-participation would limit (at least in diversity) what these systems could achieve with more data?]]></description>
		<content:encoded><![CDATA[<p>Really interesting example here Jin, and brilliant to link this directly to the Slavin TED talk. </p>
<p>One of the things you highlight here is the kind of trade that is happening with our personal data on the web. The &#8216;Share this talk and track your influence&#8217; caption seems to acknowledge that this isn&#8217;t anything underhanded &#8211; we know that we&#8217;re giving away our personal data to algorithms, so the incentive is that we get something back in compensation. The more you give away, the more you supposedly gain from the loss &#8211; in this case the gain seems to be seeing how much attention you&#8217;ve earned by sharing the video. </p>
<p>The &#8216;I am&#8230;&#8217; screenshot is fascinating, isn&#8217;t it? I wonder if you could say more here in terms of the &#8216;you loop&#8217; idea. While that is quite a long list, what are the implications of selecting one of those options and saying &#8216;yes I am this category&#8217;? Is it just about what information it might feed back in the future, or is it also something more about our own identity formation?</p>
<p>So, is the solution to &#8216;resist&#8217;? That seems to imply that we get nothing in return, or at least that what we get in return is not as valuable as what we give away. I wonder if resistance could be interpreted as selfish in any way? For example, if these algorithms potentially improve information retrieval for &#8216;society&#8217;, then our non-participation would limit (at least in diversity) what these systems could achieve with more data?</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Week 9: learning analytics and calculating academics &#124; MSc in Digital Education at the University of Edinburgh</title>
		<link>https://edc15.education.ed.ac.uk/jdarling/2015/03/07/via-nprnews-should-we-be-wary-of-algorithms-httpt-co7y9eu4mcwk-mscedc/#comment-161</link>
		<dc:creator><![CDATA[Week 9: learning analytics and calculating academics &#124; MSc in Digital Education at the University of Edinburgh]]></dc:creator>
		<pubDate>Mon, 09 Mar 2015 10:51:38 +0000</pubDate>
		<guid isPermaLink="false">http://edc15.education.ed.ac.uk/jdarling/?p=586#comment-161</guid>
		<description><![CDATA[[&#8230;] to Netflix, Clare’s tiki-toki timeline gave a terrific sense of her wider social media ecology, Jin applied these ideas to TED, while Nick’s play in Audible incisively raised some of these ideas in relation to the formation [&#8230;]]]></description>
		<content:encoded><![CDATA[<p>[&#8230;] to Netflix, Clare’s tiki-toki timeline gave a terrific sense of her wider social media ecology, Jin applied these ideas to TED, while Nick’s play in Audible incisively raised some of these ideas in relation to the formation [&#8230;]</p>
]]></content:encoded>
	</item>
</channel>
</rss>
