
human rights prohibitions (on race, etc.). 186 On the other hand, "profiling" itself is clearly not prohibited by the Data Protection Directive: just the reverse.

After all, the whole point of creating profiles is to discriminate. Profiles are a form of discrimination. The question that tends to arise (the next area of anxiety with regard to privacy and dataveillance we will examine) is whether the extensive profiling sanctioned by data protection rules is a form of discrimination we should care about. This, essentially, is the thesis of two important interventions in the surveillance studies debate, Oscar Gandy's 1993 Panoptic Sort and David Lyon's 2003 Social Sort. 187 According to the latter, "surveillance today sorts people into categories, assigning worth or risk, in ways that have real effects on their life-chances. Deep discrimination occurs, thus making surveillance not merely a matter of personal privacy but of social justice". 188

This is a strong claim. Is it correct? Examples from Lyon's edited volume include the role of CCTV in segregating neighbourhoods; 189 the role of Computer-Based Performance Monitoring (CBPM) in keeping workers stratified and in line; 190 the role of DNA databases in driving up health insurance costs for vulnerable individuals; 191 and the (historical and potential future) role of ID cards in enforcing or preserving patterns of ethnic discrimination or discrimination against immigrants. 192 There is no question that these issues are significant. Even where surveillance merely serves to tag the income categories of motor vehicles (for example) it can contribute to social stratification. 193

Even if dataveillance facilitates certain kinds of discrimination, as it surely does, is it correct to view it as a cause of discrimination? In each of the above cases, the social sort appears to increase the efficiency of forms of discrimination and segregation already practiced. Moreover, not only are they practiced, but in most cases they are legal, at least according to human rights law as generally practiced. (Discriminating against "socio-economic categories" is not only legal, it is the basis of the "price mechanism" itself.) In this area, human rights law, by delegitimising some kinds of discrimination, arguably legitimates others.


186 Such cases would appear to fall in principle to Europe's other court (ECHR cases known as "Arts. 8 + 14", where Article 14 protects against discrimination). See Julie Ringelheim, "Processing Data on Racial or Ethnic Origin for Antidiscrimination Policies: How to Reconcile the Promotion of Equality with the Right to Privacy?" Jean Monnet Working Paper 08/06.

187 Gandy (1993) and Lyon (2003).

188 Lyon (2003), 1. Lyon adds: "surveillance ... is a powerful means of creating and reinforcing long-term social differences."

189 Clive Norris, "From personal to digital: CCTV, the panopticon, and the technological mediation of suspicion and social control" in Lyon (2003); Francisco Klauser, "A Comparison of the Impact of Protective and Preservative Video Surveillance on Urban Territoriality: the Case of Switzerland", 2 Surveillance & Society 145 (2004); Ann Rudinow Sætnan, Heidi Mork Lomell and Carsten Wiecek, "Controlling CCTV in Public Spaces: Is Privacy the (Only) Issue? Reflections on Norwegian and Danish observations", 2 Surveillance & Society 396 (2004).

190 Kirstie Ball, "Categorising the workers: electronic surveillance and social ordering in the call centre" in Lyon (2003).

191 Jennifer Poudrier, "'Racial' categories and health risks: epidemiological surveillance among Canadian First Nations" in Lyon (2003). Though discrimination of this sort might as easily be attributed to the absence of universal health care: a non-universal system must presumably discriminate from the outset.

192 Felix Stalder and David Lyon, "Electronic identity cards and social classification" in Lyon (2003).

193 Colin Bennett, Charles Raab, and Priscilla Regan, "People and place: patterns of individual identification within intelligent transport systems" in Lyon (2003).

