Yesterday the University of Reading published the big news: after 65 years of trial and error, the first machine has passed the Turing Test. This is big news, and it means big responsibility! System designers and engineers must now – more than ever – think about the ethics of the machines they build.
I welcome the news with mixed feelings. On the one hand, I am thrilled. On the other, I feel that the computer science world is not ready for the responsibility ahead. Notably, I have recently seen very smart and influential people in the legal, political and economic fields talking (publicly!) about the outdatedness of informed consent and its supposed practical and technical infeasibility. Informed consent is that “stubborn” European insistence that legally requires machines to ask for our permission before they are allowed to process our personal data.
Now, in the face of Turing-level systems, I seriously want the machine to ask for my permission more than ever – if only so that I know it is a machine I am talking to. I want to know what information that artificial being en face knows about me. And I want to be able to withdraw its permission to use my information at any time. If informed consent is taken away from us humans now, then “Good night, Marie”, as an old German saying would put it. The technical architectures of the coming world of super-intelligent systems would head in the wrong direction.
Unfortunately, powerful people have recently started to argue for the abolishment of our consent to personal data processing. I am not sure whether this is for reasons of power (on the side of policy makers and companies), laziness and inflexibility (on the side of engineers) or simply extreme naivety and an inability to understand machines (mostly on the side of lawyers). Whatever it is, abolishing consent is wrong and it is dangerous.
Technically, giving consent can be “as reliable and easy as turning on a tap and revoking that consent as reliable and easy as turning it off again” (Edgar Whitley, London School of Economics (Whitley, 2009)). So let’s get a bit more frank technically, for all those who still don’t want to believe this message and confront me with the outmoded 1990s argument that “huge piles of consent forms are just too complicated for people…”:
Timely technical proposals foresee that Internet browsers serve as mediators between the “intelligent” infrastructure and us (see e.g. Langheinrich, 2003, 2005; Spiekermann, 2007). In the near future our browsers can become more sophisticated personal software agents. They learn and store our privacy preferences and then automatically permit or block requests to collect data about us. Requests for our data, as well as data sharing, are logged on our client side (Danezis, Kohlweiss, Livshits, & Rial, 2012) as well as by the requesting data-collecting entities. The agreed terms and conditions of the data exchange enter a kind of “sticky policy” that is attached to the data collected from us (Casassa Mont, Pearson, & Bramhall, 2003). These policies travel as metadata tags with our information into the databases of data controllers and processors, who then need to comply with the extent and conditions under which we allow them to use our data (Nguyen, Haynes, Maguire, & Friedberg, 2013). Policies either deny any further use of our personal data (opt-out) or allow for it (opt-in). Policies may also detail a more elaborate set of specific privacy preferences with the help of protocols similar to P3P (Cranor et al., 2006). Consent may also be withdrawn or granted again by us dynamically at later points in time (Kaye et al., 2014).
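The mechanism described above is simpler than the “too complicated” crowd suggests. The following is a minimal sketch in Python of the sticky-policy idea: consent terms that travel with the data they govern, are checked before each use, and can be revoked at any time. All class and field names here are illustrative assumptions, not taken from any of the cited systems or standards.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class StickyPolicy:
    """Consent terms attached to a piece of personal data (illustrative names)."""
    purposes: set          # uses the data subject has opted into, e.g. {"billing"}
    granted_at: datetime   # when consent was given
    revoked: bool = False  # consent may be withdrawn dynamically at any later time

@dataclass
class PersonalDatum:
    """A data item that carries its policy with it into the controller's database."""
    subject_id: str
    value: str
    policy: StickyPolicy   # the "sticky" part: policy and data travel together

def may_process(datum: PersonalDatum, purpose: str) -> bool:
    """The controller must check the attached policy before every use of the data."""
    return (not datum.policy.revoked) and purpose in datum.policy.purposes

# Example: the subject opted in to "billing" only.
policy = StickyPolicy(purposes={"billing"}, granted_at=datetime.now(timezone.utc))
datum = PersonalDatum("subject-42", "street address", policy)

print(may_process(datum, "billing"))      # True: covered by the opt-in
print(may_process(datum, "advertising"))  # False: never consented to

policy.revoked = True                     # consent withdrawn later, like turning off a tap
print(may_process(datum, "billing"))      # False: revocation is honored immediately
```

A real deployment would of course sign and log these decisions on both sides, but the core check – a policy tag consulted before each use – is exactly this small.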
The true story is that all of these processes and technologies to manage our privacy have been around for some time, but industry has fought bitter battles not to use them (who is surprised?). Part of that battle is telling some credulous politicians and lawyers that consent would simply overwhelm people and be so hard to implement technically… a barefaced lie.
The time is therefore right for the regulator to step in and mandate – ideally in the forthcoming European Data Protection Regulation – that companies have to adhere to the consent information we send them. They need to be regularly audited for this adherence and must be obliged to store our policy meta-tags with the data we send them. (Sarah Spiekermann, 10.6.2014)