There have been countless stories of joke terms hidden in the Terms and Conditions of WiFi services, placed there to prove that no one reads them when signing up. People have given platforms such as Facebook unprecedented access to their data in exchange for services, little knowing the power they had signed away. It is simply impossible to read every single set of Terms and Conditions that one is faced with online, and this public resignation has been taken as consent, which is a troubling notion.
Earlier this year, the world was rocked by the Cambridge Analytica scandal, involving Facebook and access to user data, bringing the issue of data ownership into the spotlight. Not only did Cambridge Analytica gain access to user data through an app on Facebook, it also accessed the data of those users’ friends. With the European Union’s General Data Protection Regulation (GDPR) coming into effect on 25th May, there has been a strong crackdown on unauthorised access to data, while the US quietly passed a piece of legislation in the wake of the scandal designed to give law enforcement agencies more access to citizens’ data in the name of security.
Cambridge Analytica Scandal
Few had heard of Cambridge Analytica before 2018; now few haven’t, and the name is tied to one of the biggest data scandals ever. A whistleblower – a former employee at the London-based company – revealed that data from as many as 87 million Facebook users was obtained through an app on the social networking site, dating back to 2014. Not only did it acquire the data of the app’s users, but also that of anyone on their friends lists – people who plainly did not consent to sharing their data. The larger implication of this breach is that the data may well have been used to influence the Brexit vote and the recent US election, as profiles of voters were built from this data so that public opinion could be swayed in a targeted manner.
Mark Zuckerberg faced a congressional hearing to go through in detail the leaks, how the company collects and uses data, and how much Facebook knew of the Cambridge Analytica scandal. Zuckerberg was careful to point out that Facebook does not sell user data – instead it collects data and gives advertisers access to the Facebook users most relevant to their products or services. The committee also asked how Facebook controls content on the platform, specifically how it addresses hate speech. Zuckerberg pointed to the increased deployment of AI tools to combat hate speech, as well as the hiring of more reviewers fluent in languages other than English to moderate content in non-English-speaking nations.
Apple moved quickly to distance itself from the scandal, recognising the reputational damage caused to the tech industry as a whole. Quick to promote the fact that Apple makes its money from products rather than people, CEO Tim Cook made the bold claim that if he were CEO of Facebook, he would never have found himself in the same situation. Apple’s latest product updates attest to this renewed image of requesting less data, with Safari’s new anti-tracking technology standing in direct contrast to Facebook’s data accumulation.
Stronger regulations over the tech industry could help force standards of ethical use of customer data, with lawmakers taking a more direct role instead of letting companies essentially self-regulate. But what form should these regulations take? How strong should they be?
The CLOUD Act
The US government quietly passed a piece of legislation addressing data usage. The Clarifying Lawful Overseas Use of Data Act (CLOUD Act) is in essence an update to the Electronic Communications Privacy Act (ECPA), a series of laws regulating how US law enforcement agencies may handle and access data stored overseas. The ECPA was passed in 1986, so it is unsurprising that an update was deemed necessary.
Before the CLOUD Act, the US could only access data stored overseas through Mutual Legal Assistance Treaties (MLATs), whereby two or more nations state, in writing, how they will help each other with legal investigations. Each MLAT must be ratified by the Senate, requiring a two-thirds majority to pass. Under the CLOUD Act, US law enforcement officials can compel tech companies to turn over user data regardless of where the companies store it. The Act also allows the executive branch to strike “executive agreements” with other nations, allowing each nation to access data stored in the other country, in spite of any national privacy laws.
This startling increase in data access for US law enforcement seems to have slipped under the radar. That’s because it wasn’t voted on as standalone legislation. Instead it was written into an omnibus spending bill, which was approved as a whole, meaning all the attached measures were too. It appears that US legislators have decided that citizens’ safety is more important than their privacy and their right to control their own data. This attitude has seen notable backlash in recent times, centred on the Snowden leaks.
Alastair Johnson, CEO and founder of commerce and ID payments platform Nuggets, spoke to Tom Dent-Spargo about the CLOUD Act and wider issues of privacy and security.
RLJ: Are there any benefits to the CLOUD Act for citizens and consumers?
Alastair Johnson: Broadly speaking, the CLOUD Act offers little benefit to citizens and consumers, beyond those that trickle down from government decisions.
RLJ: What are the international implications of the legislation?
AJ: Although passed in the US, the legislation’s far-reaching effects grant the nation powers that are surely controversial in the context of international law. Simply put, it allows foreign law enforcement to gain access to user data held by technology companies.
RLJ: Is the method of pushing this bill so quickly (adding it to an omnibus bill at the last moment, denying the chance for proper debate) concerning?
AJ: It should absolutely worry anyone. The bill ran to a staggering 2,000+ pages, yet had to be voted on within 24 hours of its release. It seems as though the government sensationalises minor issues while letting hugely important ones such as this pass into effect under the radar.
RLJ: Is personal privacy more important than safety of citizens when it comes to digital information?
AJ: The question of “liberty versus security” is an incredibly contested one. We see a lot of legislation being passed in the name of national security, but most will agree that sweeping mass surveillance isn’t the answer. It’s common sense that personal information should be owned and controlled by the individual, not a business. The amount of sensitive data an individual produces nowadays is enormous, and one should maintain sovereignty over it rather than entrusting it to a centralised database.
RLJ: What approach would you recommend with regard to personal data regulation?
AJ: I like the approach being taken with GDPR. I think legislators need to appreciate the value of data, now more than ever. While regulation is certainly a fantastic way to go, there’s been a fundamental shift in how personal information can be stored – advances in decentralised storage, blockchain tech and biometrics show promise in developing such solutions so that, even when regulation fails, user data remains encrypted and inaccessible to unauthorised users.
I think the CLOUD Act further demonstrates that users should own and control their own data. It is not good for businesses to use and abuse people’s personal information. Ownership of one’s own personal information does not imply a desire to cause harm or commit fraud; it is simply a common-sense principle, as the information is theirs. The biggest problems arise when third parties hold other people’s personal information – if we can remove that, we should all be in a better place.