28 Oct 2016

The Common Good

The Robotics Law Journal attended the Cybersecurity Ethics conference at the University of Hull in October and reports some key findings.

By Tom Dent-Spargo


At a conference on Cybersecurity Ethics¹ at the University of Hull, some major discussions emerged relating to privacy, security, and surveillance, at the personal level as well as at the state and corporate levels. Surveillance is now an issue beyond the state alone, as big corporations such as Google and Apple are amassing huge swathes of our personal data, helped along by our own willingness to give much of it away without a second thought. With drones now being bought en masse, surveillance in both the physical and cyber realms is of vital importance to our lives, and in need of a visible debate if a legal framework to regulate it is to be formed. Some of the speakers’ findings are summarised here.

Risk Media

The first keynote speaker was Professor Andrew Hoskins, from the College of Social Sciences at the University of Glasgow. He posits that the risks posed by the media today are informational in their nature.

Loss of anonymity is a new media risk that has been formed by the requirement of digital participation to uphold self-identity and basic sociality.

In what is defined as “mediality” there is a dual compulsion: a compulsion of connectivity and a compulsion to record everything. The act of recording has become more urgent than experiencing that which is being recorded, demonstrated ably by the sea of phone screens at music concerts. The comfort of immediacy and the illusion of control belie a lack of consciousness in these media practices.

With digitally fostered values of openness – right to comment; open access; freedom of information; immediacy of instant search; confessional culture/self-disclosure; and so on – the archival nature of digital communication becomes significant. With a reduced decay time of content and the immediacy of being able to communicate and attain data, anonymity and privacy are threatened. Even in many privacy agreements, the spreadability of an image reduces its right to privacy and anonymity. If it is shared, it is no longer owned by anyone and cannot be claimed in any meaningful sense.

One of the effects is the acceleration of emergence and thus uncertainty. There is a massively increased potential for media data to literally emerge; to be discovered and/or disseminated instantaneously at unprescribed and unpredictable times after the moment of its recording, archiving, or loss, which can then transcend and transform that which is known or thought to be known about a person, place or event.

Ultimately it comes down to a matter of privacy versus secrecy. The defence of privacy follows – and never precedes – the emergence of new technologies for the exposure of secrets. The case for privacy always comes too late, and invites the dangers of the potential uses and abuses of digital pasts.

The Common Good

Professor James Connelly, of the University of Hull and Principal Investigator of this project, defined the common good in the context of the digital age and assessed its suitability as a guiding principle. 

One of the main issues is that it was developed as a principle in a world that was rapidly disappearing as the digital age loomed. Indeed, criticism of the idea of the Common Good long predates the digital age. Critics have focused on the ambiguity of the notion, some even going so far as to say that all goods are private, although that would fail to explain the act of sharing.

Due to there being two distinct traditions of the Common Good — one substantive, the other deliberative — there is a lack of a unified understanding of the notion. Combined with the issue above, the challenge that faces the common good now is how easily and successfully it can be adapted into cyberspace. 

Dr Mike Brayshaw, a computer scientist at the University of Hull, examined undergraduates’ perceptions of privacy, surveillance, and security. He conducted a survey of Computer Science students at the University of Hull and Media and Communication students at the University of Leicester, comprising 31 questions on their attitudes and actions relating to privacy and security online. The project seeks to answer whether privacy remains a right, or whether surveillance and artificial intelligence have overtaken such notions, given that social interactions now have a recorded online permanence that can be observed by other agencies.

Overall, the students appeared to be concerned with these issues, even if they had not yet taken appropriate measures to improve their security. Also telling was that the privacy of others did not seem to concern them greatly, something that, under the definition of the Common Good, ought to.

Surveillance Realism

Dr Lina Dencik, from Cardiff University and co-investigator of the ESRC-funded project “Digital Citizenship and Surveillance Society”, highlighted how mass surveillance has been enabled and advanced through policy and technological frameworks, while being justified and normalised in public debate after the Snowden leaks.

Everyday infrastructures were key in enabling such surveillance. With everything that we say and write online being recorded, we have entered a period of normalised surveillance, each of us being watched in some capacity at any time. Public discussions on state security have been framed and defined by the political elite, creating barriers between the public and the issues of surveillance. Focus groups of the British public demonstrated a general lack of knowledge about the Snowden leaks – often confusing them with Wikileaks, and Edward Snowden with Julian Assange – as well as the fact that data surveillance was invisible to many people.

The conclusion drawn is that the Snowden leaks have not clarified to the public how surveillance is carried out, meaning that people have not changed their behaviour online, for instance by using privacy-enhancing tools. Because the visibility of surveillance is so low, or because keeping up with the relentless barrage of it is perceived as too difficult, the public has resigned itself to handing over personal data where informed consent is what should be given. Case in point: who actually reads the terms and conditions of an iTunes update?

Dencik defines “surveillance realism” as a pervasive atmosphere that regulates and constrains thought and action, in which it has become increasingly difficult to imagine a coherent alternative to the prevailing system.

She advances a framework of “data justice” as a way to articulate data-driven surveillance in relation to economic and social justice, rather than the limited techno-legal narratives that have dominated data debates post-Snowden. Such a framework is needed both to understand the full implications of data-driven surveillance and to assert an alternative of the commons.

Cyber Power

Dr Brandon Valeriano (Cardiff University) spoke about strategic competition in the cyber domain, a highly important military and diplomatic innovation of recent times. He questions the notion of a cyber revolution as premature, especially given how little we still understand or can perceive of the cyber world and the forms that cyber attacks take. His paper, titled Cyber Victory: The Efficacy of Cyber Coercion, reviews how state actors used the cyber instrument of power from 2000 to 2014.

Coercion is the threat of escalating damage that forces an opponent to yield. Translated into the cyber world, it transpires that you cannot get something you want through cyber means unless what you want is information. Cyber coercion can achieve only cyber ends; it cannot affect the physical world through the threat of force. Examining cyber coercion is nonetheless important, as the nature of cyber weapons is still unclear, as is their usefulness in the cyber theatre.

Through case examples of Sandworm in Ukraine, Stuxnet in Iran, Shamoon, and others, the efficacy of cyber incidents is highlighted. Cyber operations are found to be effective only when they do not seek to alter the behaviour of the targeted actor. So far, they have only managed to change behaviour in the context of cyber hygiene, rather than in any significant physical manner. The ineffectiveness of cyber coercion challenges the idea of a cyber revolution that has altered the nature and character of strategic competition and warfare.

Richard Hallows, a Doctoral Researcher in the Buckingham University Centre for Security and Intelligence Studies (BUCSIS), outlined the role of the newly opened National Cyber Security Centre (NCSC) in the UK as part of a £1.9 billion 5-year government cyber security plan. 

Presented as being more of an organisational realignment than anything new, the NCSC has so far received little media coverage, with few details made publicly available. Within the prospectus that was released, there are signs that the NCSC may represent the beginning of a more aggressive and interventionist approach to cyber security from the UK government, one that tries to consolidate public and private cyber capabilities.

With proposals for NCSC staff to be seconded to private corporations to help design and test networks, the intrusiveness of these cyber security capabilities is becoming apparent. Overall, one of the issues with the NCSC is the blurring of accountability between government and private enterprise.

Ostensibly, the NCSC is a private sector-oriented institution, but it is beginning to appear to be a part of GCHQ instead. Initially stating that it would report to GCHQ, it has instead claimed to be an open part of GCHQ, a subtle but significant difference. With a lack of engagement so far with the private sector, it can be viewed as a consolidation of power by GCHQ: instead of enhancing the cyber capabilities of the UK, it is extending government control into private cyber capabilities. As a result, a number of conflicts and organisational complexities are arising that could hinder its efficacy in delivering improved cyber security.

1. Part of the ESRC-funded project “The Common Good: Ethics and Rights in Cyber Security”
