BERLIN – JUNE 2014
Privacy and Internet Governance

GOVERNMENT & PARLIAMENT
Jan Malinowski – Council of Europe, Head of Information Society
Georg Apenes – Former Norwegian Data Inspectorate
Petra Sitte – Member of the German Bundestag (The Left)

PRIVATE SECTOR
Nick Ashton-Hart – Computer & Communications Industry Association, Geneva
Susanne Dehmel – BITKOM, Germany
George Salama – SAMENA Telecommunications Council, Dubai

CIVIL SOCIETY
Stephanie Perrin
Rafik Dammak – Member of the Steering Committee, Non-Commercial Stakeholders Group, ICANN; Internet Rights & Principles Coalition, Tokyo
Lorena Jaume-Palasí – Collaboratory, Berlin

TECHNICAL & ACADEMIC COMMUNITY
Jonne Soininen – Internet Engineering Task Force
Michael Komarov – National Research University Higher School of Economics, Moscow
Richard Hill – Author, former ICT manager

PROPOSITION
Peter Schaar – Chairman of the European Academy for Freedom of Information and Data Protection

#7
A publication by the Internet & Society Collaboratory
Editor - Wolfgang Kleinwächter
Privacy and Internet Governance
A publication by the Internet & Society Collaboratory
Editor · Wolfgang Kleinwächter
1st Edition
ISBN 978-3-00-046186-6
Contents

05 Internet Governance for the Cloud Society · Preface
06 Wolfgang Kleinwächter · Editorial
10 Abstract

PROPOSITION
14 Peter Schaar · The Internet and Big Data – Incompatible with Data Protection?

RESPONSES GOVERNMENT & PARLIAMENT
20 Jan Malinowski · Big data: a challenge to privacy, a threat to society, an opportunity. Should we trust businesses with our privacy online or look to the state for protection?
25 Georg Apenes · Switching Off the Age of Enlightenment?
27 Petra Sitte · Big Data and Big Government necessitate a paradigm shift

RESPONSES PRIVATE SECTOR
33 Nick Ashton-Hart · The Internet is not incompatible with data protection, but the debate we currently have about privacy largely is
37 Susanne Dehmel · Modernizing data protection along with data processing technologies
40 George Salama · Big Data: An Opportunity Combined With Privacy Concerns. A Regulatory Perspective

RESPONSES CIVIL SOCIETY
44 Stephanie Perrin · The Internet and big data – incompatible with data protection? We don't think so! A civil society perspective
49 Rafik Dammak · The need for versatility in data protection
51 Lorena Jaume-Palasí · Is data protection becoming incompatible with communication?

RESPONSES TECHNICAL & ACADEMIC COMMUNITY
56 Jonne Soininen · The Current State of Internet Security From A Technical Perspective
59 Michael Komarov · Big Data leads to new international data processing policies
61 Richard Hill · Schaar is both prophetic and mainstream

64 Authors
68 About the Internet & Society Collaboratory
69 MIND needs your support
70 Previous Issues and Authors of MIND
72 Imprint
Credit: Kmeron | https://flic.kr/p/aSg8g6 CC BY-NC-ND 2.0 | https://creativecommons.org/licenses/by-nc-nd/2.0/
Preface
Internet Governance for the Cloud Society

The discourse on Internet Governance has reached an inflection point. It has become clear what is really at stake for our societies. Billions of individuals spend a considerable part of their lives online: we communicate and work, we shop and study, we discuss and argue via the Internet. This development is unstoppable and it is changing our society. However, the Internet does not only have an ever growing impact on the lives of individuals; it increasingly shapes the life of organizations as well. Nations and corporations are also online, administering, governing, doing business and asserting their interests. Internet governance, therefore, is subject to a complex global struggle for power in the information age. It is our challenge to reconcile industry's and government's desire for user data with fundamental rights and the basic principles of civil rights and privacy. At the same time, the open and unrestricted character of the Internet needs to be preserved, as this openness drives knowledge, progress, growth and vital infrastructures, and not just in the so-called first world. Privacy, however, is impossible to protect and strengthen without true global cooperation and willingness by governments and corporations alike. Internet Governance used to be about domain names and IP addresses. Nowadays, we need to build a global consensus on how to translate concepts of privacy, democracy, freedom and security into a world where big data, ubiquitous access, and the Internet of Things transform how we live. With this issue, we hope to contribute to this debate. We must ensure that the Internet benefits society.

The Collaboratory steering group
Martin G. Löhe (chair), Dr. Marianne Wulff (vice chair), Dr. Michael Littger, Lena-Sophie Müller, Dr. Philipp S. Müller
Editorial
Internet Governance and Privacy
PROF. DR. WOLFGANG KLEINWÄCHTER, EDITOR
Was there privacy in ancient times and in the Middle Ages? Whole tribes lived under one roof, and in a village everybody knew everything about everybody. If you go to the ruins of the old Roman city of Pompeii, you will learn that even the restrooms were public spaces.

However, since the beginning of the Internet Age, we have seen growing unlimited access to all kinds of small and big personal data by transnational private corporations and governmental security agencies. Individual privacy is eroded and undermined. Private correspondence is checked by authorized or non-authorized parties. As soon as you are connected to the Internet via a fixed or mobile end device – whether in your private home or in your hotel room, whether you are walking in the street or riding in a car – somebody on the other end of the line will know where you are, what you are doing, and what your plans are. It is not only the usual skeptics who argue that the 21st century will see the "end of privacy". Are we moving backwards into something like the "digital Middle Ages"?

HISTORY OF PRIVACY

Today, privacy is seen as a fundamental individual human right, protected by Article 12 of the Universal Declaration of Human Rights, which states: "No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks."

The understanding of privacy as a legal right has its own history. It goes back to a case from the 17th century – known as Semayne's Case, from 1604 – in which the British lawyer Sir Edward Coke stated: "The house of every one is to him as his castle and fortress, as well for his defence against injury and violence as for his repose." Semayne's Case acknowledged that the king did not have unbridled authority to intrude on his subjects' dwellings, but recognized that government agents were permitted to conduct searches and seizures under certain conditions when their purpose was lawful and a warrant had been obtained.

This was later taken as a blueprint by James Madison when he introduced the 4th Amendment to the US Constitution in 1789: "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized."

Credit: Ministerio TIC Colombia | https://flic.kr/p/bEjYer CC BY 2.0 | https://creativecommons.org/licenses/by/2.0/deed.de

Later, in 1890, Samuel D. Warren and Louis D. Brandeis described privacy as the "right to be let alone". The word "privacy" comes from the Latin privatus, which means "separated from the rest". The whole idea of the Internet is that we are connected, not separated, and that everybody can communicate with everybody anytime, anywhere. In the new virtual global village, we are all under one roof. Can we remain alone in cyberspace? Do we want to remain alone? How can protection work in a borderless space so that we as individuals are safe against unreasonable searches and seizures? How can we use the freedom we have won in the virtual world without risking losing our privacy? This is a big question and finding the right answer is not easy.
Code makers work at a higher speed than law makers.

As we have seen in the last decade, technology always develops faster than our legal system. Code makers work at a higher speed than law makers. In the information age, it is the code that defines the space in which law makers now operate. This brings a lot of new flexibility to the system. On the other hand, social values, individual rights, and personal freedoms do not change overnight when new technologies are introduced. Our legal system has a high degree of stability, which is needed in a democratic society. What we have learned in recent years is that many new Internet-based services and applications offer new opportunities but very often do not need new regulations. They can be managed and dealt with on the basis of our existing legal system, both nationally and internationally.

From a legal point of view, there is no difference between stealing money offline and stealing money online. Stealing money is a crime, and a crime is a crime is a crime, offline as well as online. Doing harm to other people remains illegal whether it is done in the real or in the virtual world.

Yes, there are new problems in borderless cyberspace. If providers and users of Internet-based services operate under different jurisdictions, there is pressure to "harmonize" national regulations or to decide which jurisdiction is relevant in a concrete controversial case. And yes, there are some new problems which have not yet been clearly defined in our traditional legal system, such as cloud computing or the linkage of objects to the Internet via interactive RFID chips. But neither cloud computing nor the Internet of Things leads to the disappearance of universal values or human rights. In this respect, it was very natural that the UN Human Rights Council stated in a resolution from June 2012 that "the same rights that people have offline must also be protected online".

THE UN RESOLUTION ON PRIVACY IN THE DIGITAL AGE

This is also relevant for the right to privacy, as it was reaffirmed in the UN Resolution on the right to privacy in the digital age, initiated by Brazil and Germany and adopted at the 68th UN General Assembly in December 2013. The resolution notes inter alia that "the rapid pace of technological development enables individuals all over the world to use new information and communication technologies and at the same time enhances the capacity of Governments, companies and individuals to undertake surveillance, interception and data collection, which may violate or abuse human rights, in particular the right to privacy, as set out in article 12 of the Universal Declaration of Human Rights and article 17 of the International Covenant on Civil and Political Rights, and is therefore an issue of increasing concern".

This brings us to the question of whether all technologies that are invented and available should be used in an unlimited way. There is a real question whether we need ethical, moral, and legal barriers for the use of certain types of technology. A person who owns a gun is not free to use that gun for everything: he or she has to respect concrete laws, and whoever ignores them and uses the gun against human beings will be punished and jailed.
In other words, we need restrictions on the use of communication technologies that allow interference in our private homes, intrusion into our private communications, and surveillance of our day-to-day behavior by private or public parties, corporations, governments, or our unfriendly neighbors.
For a fair balance, we need the protection of the law.

There can be reasons for a justified interference. But this has to be the exception and cannot be the rule. And it needs to go through a legal procedure in which a neutral third party, based on evidence of a clear and present danger, checks the necessity and proportionality of such interference. In other words, there will be no one-size-fits-all solution. It has to be decided on a case-by-case basis, taking into account the specific circumstances.
THE CHALLENGE TO FIND THE RIGHT BALANCE

The big challenge here is to find the right balance. But one thing is also clear: this can't be left to the "free market", where the individual Internet user has no adequate negotiating power against big corporations or big governments. For a fair balance, we need the protection of the law. As Jean-Baptiste Lacordaire, the French philosopher, stated nearly two hundred years ago: "Between the strong and the weak … it is freedom that oppresses and the law that liberates". The 2013 UN Resolution on Privacy in the Digital Age is moving in the right direction here. The resolution reaffirms "the human right to privacy, according to which no one shall be subjected to arbitrary or unlawful interference with his or her privacy, family, home or correspondence, and the right to the protection of the law against such interferences". It recognizes that "the exercise of the right to privacy is important for the realization of the right to freedom of expression and to hold opinions without interference, and one of the foundations of a democratic society", and it emphasizes that "unlawful or arbitrary surveillance and/or interception of communications, as well as unlawful or arbitrary collection of personal data, as highly intrusive acts, violate the rights to privacy and freedom of expression and may contradict the tenets of a democratic society". Furthermore, the resolution also notes that "while concerns about public security may justify the gathering and protection of certain sensitive information, States must
ensure full compliance with their obligations under international human rights law". And it expresses its deep concern about "the negative impact that surveillance and/or interception of communications, including extraterritorial surveillance and/or interception of communications, as well as the collection of personal data, in particular when carried out on a mass scale, may have on the exercise and enjoyment of human rights". It concludes that "States must ensure that any measures taken to combat terrorism are in compliance with their obligations under international law, in particular international human rights, refugee and humanitarian law". This is clear and balanced language adopted by UN member states and supported by a wide range of non-governmental stakeholders, in particular from civil society.

To find the right balance not only among governments and stakeholders but also between justified security concerns and individual privacy rights is not easy, but we have to face this challenge in the digital age. The right answer can be found only in a bottom-up, open and transparent multistakeholder policy development process. In this respect, it is good that the resolution invites the governments of the UN member states "to review their procedures, practices and legislation regarding the surveillance of communications, their interception and the collection of personal data, including mass surveillance, interception and collection, with a view to upholding the right to privacy by ensuring the full and effective implementation of all their obligations under international human rights law" and to "establish or maintain existing independent, effective domestic oversight mechanisms capable of ensuring transparency, as appropriate, and accountability for State surveillance of communications, their interception and the collection of personal data". However, such a call should go beyond relevant activities by the governments of the UN member states and should also include the private sector, civil society, and the technical community.
TOWARDS A MULTISTAKEHOLDER MODEL IN THE DEVELOPMENT OF PRIVACY POLICIES

A lot of personal data and surveillance capacity is now in the hands of the private sector. While private corporations are obliged to respect the legislation of the country in which they operate, they often try to escape national legislation by "jurisdiction shopping" – that is, picking the country with the lowest standard of privacy laws as the place for starting a business in borderless cyberspace. An inclusion of the private sector in a multistakeholder
process to develop policies to respect individual privacy rights is as important as bringing civil society directly to the negotiation table. Networks like Privacy International, Human Rights Watch, Reporters Without Borders, Article 19, Transparency International, Consumers International and others have to have a voice and a vote when it comes to global mechanisms which will enhance the protection of privacy in the digital age.

Even more important is the inclusion of the technical community. This community has developed the standards which enabled surveillance and enhanced control capacities. This community, including the IETF and W3C, is now challenged to offer standards which will allow a higher level of protection of individual privacy. Privacy by design is a very concrete challenge for the Internet standard-setting organizations, in particular when it comes to the next wave of services and applications relating to the Internet of Things.

Furthermore, we need an enhanced understanding of the various elements of privacy protection, and more precise specifications. When the Global Business Dialogue on eCommerce (GBDe) discussed privacy concerns in 1999, it differentiated between "sensitive" and "non-sensitive" data. For "sensitive" data (data related to health, finances, sexual orientation, religion, and political affiliation), it proposed that a corporation should ask the individual for consent before using this data (opt-in). For "non-sensitive" data (such as shopping behavior, travel, open chats, searches), it proposed that corporations could use the data as long as the individual did not express an explicit reservation (opt-out). This approach was not further investigated or translated into concrete legislation. But it shows that a multistakeholder approach widens the perspective and can bring more, and more reasonable, arguments to the negotiation table.
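The GBDe's 1999 distinction amounts to a consent rule per data category. The fragment below is a hypothetical sketch of that proposal, not of any existing legislation: for sensitive categories, silence means no (opt-in); for non-sensitive ones, silence means yes until the user objects (opt-out).

```python
# Consent models per data category, following the GBDe's 1999 distinction
# (illustrative sketch; the category lists are abridged from the text above).
SENSITIVE = {"health", "finances", "sexual_orientation", "religion", "political_affiliation"}
NON_SENSITIVE = {"shopping_behavior", "travel", "open_chats", "searches"}

def may_process(category: str, opted_in: set, opted_out: set) -> bool:
    """Return True if a corporation may use data of this category."""
    if category in SENSITIVE:
        return category in opted_in          # opt-in: silence means no
    if category in NON_SENSITIVE:
        return category not in opted_out     # opt-out: silence means yes
    raise ValueError(f"unclassified category: {category}")

# A user who opted in to health research but opted out of search profiling:
print(may_process("health", {"health"}, set()))      # True
print(may_process("searches", set(), {"searches"}))  # False
```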
To take another example: the German constitution has included – since the 1980s – the right to informational self-determination, which gives the individual all rights regarding how his or her personal data is used. In the 1990s, the right to access the secret files of the East German secret service (Stasi) was seen as a constitutional right. Can such an approach be globalized? Is it the right of an individual to know what information secret services around the world have collected about her or him? May I ask the NSA whether they have looked at my private communication and, if yes, what they have in their database?

NET MUNDIAL

In this respect, the final document adopted at the recent Global Multistakeholder Meeting on the Future of Internet Governance (NetMundial) can be a good guideline on how to enhance the multistakeholder model when it comes to policy development and decision making with regard to privacy issues in the Internet Governance Ecosystem. Principle 1.3 of the NetMundial Declaration says very clearly: "The right to privacy must be protected. This includes not being subject to arbitrary or unlawful surveillance, collection, treatment and use of personal data." And the roadmap section of the São Paulo Declaration states: "Mass and arbitrary surveillance undermines trust in the Internet and trust in the Internet governance ecosystem. Collection and processing of personal data by state and non-state actors should be conducted in accordance with international human rights law. More dialogue is needed on this topic at the international level using forums like the Human Rights Council and IGF aiming to develop a common understanding on all related aspects."

This is a process and it will not be settled overnight. The next concrete step will be the report by the United Nations High Commissioner for Human Rights on "the protection and promotion of the right to privacy in the context of domestic and extraterritorial surveillance and/or interception of digital communications and the collection of personal data, including on a mass scale, to the Human Rights Council at its twenty-seventh session and to the General Assembly at its sixty-ninth session (2014), with views and recommendations, to be considered by Member States", as decided by the UN General Assembly in 2013. There is still a long way to go. But the first steps have been taken. Do not expect big jumps. Let's go forward by taking more small steps, but let's move in the right direction.
PROPOSITION
Peter Schaar recognizes the urgency of looking for convincing solutions to the emerging challenges regarding data protection. Today, we face a high risk that the fundamental right to privacy and other core values of Western democracies will be lost.
RESPONSES
GOVERNMENT AND PARLIAMENT
JAN MALINOWSKI
focuses on issues where privacy is sometimes minimised by obscuring the relevance of the individual or citizen, presenting persons as mere data subjects. He sees states as the duty bearers of human rights.

GEORG APENES
identifies the opportunities that personal information can offer to projects and plans generally accepted as being constructed for the common good. Still, there are as yet no tools that can protect individual privacy.

PETRA SITTE
sees fatal impacts of Big Data and Big Government unless all EU citizens are treated as domestic residents – a step that would lend the EU a much stronger position in negotiations with the US Administration on dismantling the surveillance system.

PRIVATE SECTOR

NICK ASHTON-HART
explains why the debate about data protection is incomplete and identifies false assumptions about the role of government, the economy and competition law. He draws a line between governmental and economic use of personal data.

SUSANNE DEHMEL
raises the need for new methods of processing in order to cope with the existing and ever-growing amounts of data we produce. She is positive that the Internet and big data are compatible with data protection.

GEORGE SALAMA
demands that a Big Data policy framework not hinder innovation and investment by operators and Internet service providers. At the same time, privacy settings should be simplified and redesigned.

CIVIL SOCIETY

STEPHANIE PERRIN
discusses why the value of big data should not be accepted as a given and why its societal value still has to be proved. She proposes five initiatives on how to meet this challenge and respond to the "need for a broad coalition to defend" the values we cherish in the information society.

RAFIK DAMMAK
sees Big Data as an evolution bringing new opportunities for businesses, but without a clear benefit for users. He explains how data protection and privacy can borrow the scalability principle in order to handle the next technological threat to privacy.

LORENA JAUME-PALASÍ
refers to an impetus to store: human acquisitiveness towards information did not change, but access to it did. Instead of concentrating on data minimization, we should concentrate on the values resulting from data that are in need of juridical protection.

TECHNICAL & ACADEMIC COMMUNITY

JONNE SOININEN
states that the Internet is more secure than ever, since the Internet's technical community is developing technologies for increased security, while the public has become more aware of privacy on the Internet.

MICHAEL KOMAROV
declares that technological development has overtaken the policy-making process and that Web 3.0 applications are likely to be far more effective at piecing together personal data than even traditional search engines.

RICHARD HILL
demands that parliaments take action, both to stop mass surveillance by governments and to curtail the power of dominant service providers to obtain data from customers and use it as they see fit to generate large profits.
Proposition
PETER SCHAAR, CHAIRMAN OF THE EUROPEAN ACADEMY FOR FREEDOM OF INFORMATION AND DATA PROTECTION
Credit: Heinrich Böll Stiftung | https://creativecommons.org/licenses/by-sa/2.0/ | CC BY-SA 2.0
The Internet and Big Data – Incompatible with Data Protection?
PETER SCHAAR, CHAIRMAN OF THE EUROPEAN ACADEMY FOR FREEDOM OF INFORMATION AND DATA PROTECTION
The Internet is frequently used as a synonym for digital globalization. Today, data travels around the world within a split second. And big data represents a concept based on the idea of collecting as much data as possible – the more data that is collected, the better the concept works.

Ideas and rules which have grown over decades and centuries must be scrutinized to determine whether they still fit into the brave new cyber-reality. This also applies to the protection of privacy. The current concepts of data protection date back to the Sixties of the 20th century. Since then, the world has changed dramatically, particularly in the field of information processing. Fifty years ago, most data was still processed manually. The few computers there were had very limited processing capabilities. Automated processing was carried out in data processing centers separated from offices and other workplaces. And cross-border data transfer was the absolute exception.

There is no question that – given rapid technological development – we urgently need to look for convincing solutions for the emerging challenges regarding data protection. Otherwise, the world will witness a further erosion of privacy. The advocates of privacy and fundamental rights must deal with the fact that some traditional rules of data protection are no longer effective in a world of ubiquitous and globalized information processing. In particular, the proponents of privacy have a vital interest in re-examining the current data protection regimes.

Mass data is seen as an asset; the "new oil" of the information society.

Most of the current data protection rules and regulations focus on the individual procedure used for data processing. The starting point of legal assessment has been the specific purpose. Personal data may be collected "for specified, explicit and legitimate purposes and not further processed in a way incompatible with those purposes", says Art. 6 of the European Data Protection Directive of 1995. The processing of data is permitted as long as it is adequate, relevant and not excessive in relation to the purposes for which the data is collected. The traditional key question is: which data is needed to fulfill a legitimate purpose? The main criteria are relevance and adequacy. In other words, data protection regulations consider data processing from a micro perspective: single pieces of data, an individual algorithm, a specific purpose. Today, companies and public bodies see data processing more and more from a macro perspective: how can data coming from various sources be used to better understand what's going on and to optimize procedures? In particular, mass data is seen as an asset; the "new oil" of the information society. Big data approaches are looking primarily for correlations and not for reasons. No big data apologist would even understand the question of which purpose a specific piece of data is to be collected for.

The privacy community faces the challenge of integrating the macro perspective into a modern legal, political and economic framework. More systemic, technological and procedural instruments should be added to the data protection toolbox without forgetting that the fundamental rights of the individual remain indispensable. The protection of privacy remains an expression of human dignity – it derives from a basic European constitutional understanding. That is not all; the individual has a right to informational self-determination so that he or she can freely develop his or her own personality. It is the individual who should basically determine which of his or her personal data is disclosed to whom and for which purposes it should be processed.
Personal data must not be seen as the property of the controller (or processor) who has the practical means to access the information. Up to now, data processing has been based on the consent of the individual "data subject", who always has the right to withdraw his or her consent. After withdrawal, the controller must stop the processing. Even if data is collected for a specific purpose, in particular within the framework of a contractual relationship, this is no carte blanche for the data controller. Purpose limitation of personal data remains a legal requirement even in a new technological environment.

On the other hand, changes of purpose might be associated with smaller risks if the data is anonymized or pseudonymized. Regulators should consider setting up incentives for data controllers to remove identifiers from data sets. One example of this approach is the distinction between personalized and pseudonymized profiles in the German Telemedia Act. If a provider intends to carry out profiling with personal data including personal identifiers, he needs the explicit consent (opt-in) of all individuals concerned. If the profiling is to be carried out without identifiers, the provider needs to inform the data subjects and offer them the opportunity to object (opt-out).
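The difference between a personalized and a pseudonymized profile can be made concrete in a few lines of code. The sketch below is purely illustrative (invented field names, not a statement of what the Telemedia Act prescribes): the direct identifier is replaced by a keyed hash before profiling, so the profile still links a user's sessions together but can no longer be traced back to a person without the separately stored key. Note that pseudonymization lowers but does not eliminate the risk of re-identification.

```python
import hashlib
import hmac

# Secret key, stored separately from the profile data (hypothetical setup).
PEPPER = b"rotate-me-regularly"

def pseudonymize(user_id: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    Without the key, the pseudonym cannot be traced back to the user,
    but the same user always maps to the same pseudonym, so profiling
    across sessions still works.
    """
    return hmac.new(PEPPER, user_id.encode("utf-8"), hashlib.sha256).hexdigest()

def build_profile(record: dict) -> dict:
    """Strip direct identifiers and keep only behavioural attributes."""
    return {
        "pseudonym": pseudonymize(record["email"]),
        "pages_visited": record["pages_visited"],
        "session_length_s": record["session_length_s"],
        # Name, email and address are deliberately dropped.
    }

profile = build_profile({
    "email": "alice@example.org",
    "name": "Alice",
    "pages_visited": ["/news", "/sports"],
    "session_length_s": 312,
})
print(profile)  # no direct identifier left in the stored profile
```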
The basic data protection principles – privacy and informational self-determination – need to be preserved and protected in a changing world. Legislators, notably parliaments, are called upon to draw the exact boundaries between acceptable and unacceptable use of personal data. At least in the first instance, this is not primarily the task of the courts. Parliaments are the first port of call for interpreting the requirements set up by constitutions. They also need to define concepts and concrete rules regarding the implementation of the right to data protection. This applies to the public sector as well as to non-public actors, especially where huge, almost market-dominating companies are concerned.
In a networked world, users need genuine alternatives much more than customers in other markets.

What does that mean in detail? The "core area of private life" enjoys absolute protection under constitutional law, as the German Federal Constitutional Court pointed out in several rulings. Nobody has the right to cross this red line. It is not only eavesdropping on the bedroom by police or secret services that must be prohibited, but also the monitoring of highly personal, confidential electronic communications, as far as the core area of private life is affected. In this highly sensitive sphere, there is no justification for surveillance measures carried out by public authorities. Private companies are certainly not allowed to cross this red line either. Highly sensitive personal health data must not be used for advertising or for other commercial purposes without the explicit and voluntary consent of the individual concerned. Criminal penalties remain essential for the defense of these red lines.

Fundamental rights must also be observed in the processing of data that, at first glance, appears less sensitive – for instance, in cases where data on purchasing behavior is used to determine the individual's health status, such as a possible pregnancy, or political and religious views. These are not just the nightmares of a data protection advocate; they are reality in the big data context, as shown by examples of American supermarket chains that operate in exactly this way. Nor is it an entirely new phenomenon: even the credit scoring systems used to assess the creditworthiness of a customer are mainly fed with "harmless" data – but the results might have serious consequences for the loan applicants.

Big data approaches will raise more and more questions like this. These questions cannot be solved by merely prohibiting the generation of additional knowledge. The risks for privacy and individual self-determination coming from statistical analyses of mass data should be limited by setting up procedural requirements. Privacy Impact Assessments (PIA) – the preventive, systematic assessment of the impact of specific technologies and the definition of protective measures – might be an important tool to prevent, or at least diminish, negative consequences. The proposed basic regulation for data protection tabled by the European Commission more than two years ago contains interesting approaches for PIA that are worth developing further.

More transparency can help individuals to exercise their right to informational self-determination and to protect their privacy. Data controllers are obliged to provide potential users and customers with the relevant information. Extensive privacy policies which are as long as a novel, and which nobody reads, provide only pretended transparency. Instead, the core information must be easily accessible and understandable: Who is responsible? What data will be processed? For what purpose and where? The data protection authorities need to continue to verify that the information provided by the data controllers meets these requirements. In addition, violations and false promises could be punished by individual claims for damages and secured by a right to mount class actions, such as those well known in the area of consumer protection.

In a networked world, users need genuine alternatives much more than customers in other markets. If you want to buy a car, you have the choice between a lot of different types and brands and can make your purchasing decision independently of other people's choices. In the market of interactive social networks, the situation is completely different: if all your friends are members of a specific service, it is hard for you to leave, even if the provider changes his privacy settings in a way you do not agree with. In particular, the functioning of those markets that are fed by personal data is of great importance from a data protection perspective. The successful models of "free" services, mainly financed by targeted, personalized advertising, need to be brought under effective competition control. The user needs a free choice about which data he or she gives away to which provider under which conditions. The challenge of preventing monopolistic structures from restricting the user's autonomy in Web 2.0 services has not yet been tackled by law-makers. Even in this field, the EU data protection package might provide an element of the solution by constituting a "right to data portability". The user of interactive services, such as social networks, needs the opportunity to retrieve and extract his or her data from one provider in order to process it on his or her own computer or to transfer it to another provider. In order to prevent discrimination, a "right to connectivity" should be considered as well. Big Internet services – at least if they are quasi-monopolists in a specific class of services – should be obliged to accept those users who comply with the respective rules. Moreover, Web 2.0 services should provide open interfaces to enable communications between members and non-members.
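A "right to data portability" presupposes an interface through which the user can pull out everything a service holds about him or her, in a format another provider can import. The sketch below is a hypothetical illustration of such an export, with invented names and placeholder data rather than any real provider's API.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class UserExport:
    """A machine-readable bundle of everything the service holds on a user."""
    user_id: str
    profile: dict
    posts: list
    contacts: list

def export_user_data(user_id: str) -> str:
    """Assemble the user's data into a portable JSON document.

    In a real service these would be database queries; here we use
    placeholder values for illustration.
    """
    bundle = UserExport(
        user_id=user_id,
        profile={"display_name": "alice", "joined": "2012-04-01"},
        posts=[{"date": "2014-05-02", "text": "Hello, world"}],
        contacts=["bob", "carol"],
    )
    return json.dumps(asdict(bundle), indent=2)

print(export_user_data("alice"))  # ready to import at a competing provider
```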
Furthermore, there is the fundamental question of the integration of data protection in information technology – in software as well as in hardware. Hardware identifiers allowing third parties to track users without their consent must be avoided. Nobody can dispute the fact that it is incompatible with data protection and with fair information principles that some apps suck almost all the data stored on smartphones, even if this data is not required for the apps’ functionalities. This is only one example of the shortcomings in the field of technological data protection. There are many more: if you become a member of a social network, many privacy settings are switched off by default. If you start a web search, most search engine providers get much more personal information than they need for carrying out the search or optimizing the algorithms. This needs to be changed if we want to prevent privacy fading away. IT systems need to be designed in a privacy-friendly way, giving the individual maximum control of his or her own data. But it is not sufficient to provide the user with privacy controls. Another important question is how products and services are delivered. Today, nobody would accept cars with weak safety settings or with airbag sensors not activated. In the IT market, however, many products and services are delivered without, or with a low level of, privacy precautions. The keywords for new thinking in this area are “privacy by design” and “privacy by default”. Technological data protection may help to bridge the gap between conflicting interests. Many tasks may be realized with anonymous, or at least pseudonymous, data.
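What "privacy by default" could mean at the level of product code is easy to sketch. In the following illustrative fragment (all setting names invented), every data-sharing feature ships disabled and becomes active only through an explicit user action – the inverse of the social-network defaults criticized above.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # Privacy by default: every data-sharing feature starts switched off.
    profile_public: bool = False
    location_tracking: bool = False
    personalized_ads: bool = False
    share_with_partners: bool = False

    def opt_in(self, setting: str) -> None:
        """Enable a feature only as the result of an explicit user action."""
        if setting not in self.__dataclass_fields__:
            raise ValueError(f"unknown setting: {setting}")
        setattr(self, setting, True)

settings = PrivacySettings()         # a new account discloses nothing
settings.opt_in("personalized_ads")  # each disclosure is a deliberate choice
```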
The legal framework for data protection should be both strong and flexible.

Last but not least, it is unacceptable that governments and intelligence agencies are abusing the increasing international data transfer for bulk access to the transmitted data. As we have learned from the Snowden leaks, the NSA, GCHQ and other secret services collect as much data as they can. Legal safeguards focused on the protection of a country's own people and limited to national territory do not protect personal data in the increasingly globalized world. As a consequence of the secret services' "hunger for data", trust in Internet services has collapsed. In particular, some providers of services that are global by nature fear substantial losses. Private and commercial customers fear that the data they give to these services is secretly accessible to intelligence services without sufficient safeguards. Technical and legal changes may help to rebuild trust. Data encryption and secure routing of data packets on the Internet should be promoted. On the other hand, there is a pressing need to change the legal frameworks for transborder data processing. The "national interests" exemption in almost all legal instruments for data protection concerning international transfers of personal data cannot be accepted any more. Additionally, there is a need to establish binding international data protection standards guaranteeing the protection of private life, as laid out in Art. 12 of the Universal Declaration of Human Rights.

The legal framework for data protection should be both strong and flexible. Without robust enforcement, the legal requirements would remain theory. Without flexibility, every new technological development would undermine the rules and create the need for legal changes. The current German system of data protection is weak regarding enforcement, at least on the federal level. The Federal Commissioner for Data Protection has no power to fine, and his decisions are not binding for the providers of telecommunications and postal services falling under his supervision. On the other hand, there are a lot of federal laws with specific data protection provisions. This system is neither consistent nor flexible. The reform of data protection legislation on a European level should be seen as a chance to overcome these shortcomings.

If the approaches to improve data privacy and data security on a European and an intercontinental level fail, there is a high risk that the fundamental right to privacy and other core values of Western democracies will be lost. As a reaction to the excessive surveillance programs of the NSA and GCHQ, several governments have started activities for strengthening national control over network infrastructures, as well as over Internet services. A negative effect of such efforts could be a balkanized Internet, split along national borders, with censorship and thought control – and intensified surveillance by national authorities. The pursuits of liberty and prosperity, free discussion and inclusion, closely linked to the information society, are at stake. There is a need for a broad coalition to defend these values.
Responses
Government and Parliament

JAN MALINOWSKI, COUNCIL OF EUROPE, HEAD OF INFORMATION SOCIETY
GEORG APENES, FORMER NORWEGIAN DATA INSPECTORATE
PETRA SITTE, MEMBER OF THE GERMAN BUNDESTAG (THE LEFT)
Big data: a challenge to privacy, a threat to society, an opportunity. Should we trust businesses with our privacy online or look to the state for protection?
JAN MALINOWSKI¹, COUNCIL OF EUROPE, HEAD OF INFORMATION SOCIETY
¹ Disclaimer: the views expressed here are only those of the author and should in no way be regarded as representing those of the Council of Europe or any of its organs, bodies or services.
Discussions on big data mostly focus on the challenges that it poses to privacy, the opportunities it offers business and the security establishment, and the benefits it offers users: new products and services, improved user experience, promise of better healthcare, new forms of participation in decision making and democracy. These are serious, interrelated issues where privacy is sometimes minimised by obscuring the relevance of the individual or citizen, presenting persons as mere data subjects. Youth representatives at the Stockholm European Dialogue on Internet Governance (EuroDIG) in June 2012 described their approach to privacy online as “click yes and hope for the best”. Much is justified by saying that users have to pay for pretended free services with their personal data. This is mostly about commercial use of data, targeted advertising, data sharing among commercial partners or other transactions involving personal data, or even metadata aggregation and research. While most people saw at most the tip of this iceberg, Edward Snowden jolted everyone by revealing the obscure space occupied by largely unaccountable spooks and their high-profit contractors, with an insatiable appetite for personal data and metadata. As the discussion matures with the help of Snowden and others, the message of certain commentators evolves. For example, from advocating the exploitation of data gold mines in the hands of governments to demanding more respect for privacy; or from requiring overbroad data retention to becoming the liberators of users from pervasive data kettling; or in one stroke reversing the safe harbour discourse to unsafe. Neither risks nor opportunities stop there. Compare it to climate change. It can be described in dispassionate terms as a (moderate) temperature rise over time, which can reasonably be expected to have some effect on the environment or even on the availability of energy resources. For some, the environmental concern lies even in the sustainability of the economy and development. But more compelling still, climate change can (apparently) also be described as killing 300,000 people every year; or the disappearance of Arctic ice by 2030; or 2.5 billion people dying because of it by 2054. Can we describe big data by reference to its impact in equally alarming terms? Can one say, for example: the exploitation of big data serves today to shape your consumption; it can reveal your whereabouts at all times, your intimate conduct, preferences, feelings or even your thoughts; tomorrow it will determine your health decisions; and in the longer term it will serve to shape your
political choices and, by aggregation, those of your community? Many fear that we already live in a version of George Orwell’s 1984 world.
Compare it to climate change.

Big data is not just there; it does not grow on trees, flow in streams, nor is it found in natural reservoirs, and metadata is not the air around all that. Bruce Schneier described data as the pollution problem of the information age: all computer processes produce it and it stays around. Arguably, personal data is a natural occurrence, a by-product of any human activity, for example, when someone sees and remembers another person in a particular location. But there has been a quantum leap in the latency of data, including metadata, the ability to record, store and aggregate it, and in the capacity to process it. And different data sets can now be combined and exploited together. It is worth keeping Melvin Kranzberg's first law of technology in mind: "Technology is neither good nor bad; nor is it neutral." Increased technical capacity comes with greater responsibility. This principle of increased responsibility was clearly established by the European Court of Human Rights in S and Marper v. the United Kingdom (concerning the overbroad storage of DNA samples by law enforcement agencies; technically possible but a disproportionate intrusion into people's privacy). Few saw in Marper (December 2008) an announcement of the invalidation by the Court of Justice of the European Union of the data retention directive (April 2014). Will we be able to tame big data, data ubiquity, the syndication of data sets, dragnet, big data enabled mass surveillance, uncontrolled or unaccountable access and predictive analyses, and prevent the misuse of big data to shape public opinion and steer political processes? Increasing control in society with the objective of minimising risk, of criminal activity for instance, has very tangible opportunity costs, especially in terms of fundamental freedoms or civil liberties. A lowest common denominator approach would play into the hands of new age despots, whether government or corporate, leading sheepish masses into an electronic Gulag. Equally unacceptable, relativism could bring to an end the universality, integrity and openness of the Internet.

Credit: The Opte Project, Wikimedia Commons | CC BY 2.5 | https://creativecommons.org/licenses/by/2.5/deed.en

Recognition is owed to a number of individuals for their legacy to the world, such as Tim Berners-Lee, Jon Postel, Vint Cerf, Stuart Parkin (a precursor of big data), Sergey Brin and Larry Page. They preceded corporate activity that later sought to obtain maximum profit from common spaces and tools. Nonetheless, businesses must be given considerable credit for many positive developments. But can the question of big data, a matter so important for humanity as a whole, be entrusted to corporate self-regulation or even to a variety of well-intentioned stakeholders with bitty roles and uncertain accountability, or be left in the hands of market forces? The answer has to be no, at least not without qualification. States are the duty bearers of human rights; they are under negative obligations (e.g. not to interfere with the right to privacy) and also positive obligations (to protect and promote respect for privacy, and to provide an enabling environment for its exercise). These obligations are all the greater given that the rights to privacy and to the protection of personal data are instrumental to the exercise of other fundamental rights, such as freedom of expression and the right to assembly and association, and all are intimately linked to participation in a democratic society. With all their faults and lacunae, in their more advanced forms state accountability and good governance mechanisms are unparalleled in the corporate world. More often than not, corporate social responsibility follows rather than anticipates policy and legal constraints, sometimes after the failure of intense lobby activity. Companies also embrace social responsibility in response to scrutiny from fourth estate public watchdogs, namely media and nowadays also civil society organisations. At least in Europe, states can effectively be held to account for their performance against certain of their international law obligations before supranational courts. Their obligations are sometimes scoped out in the case law of those courts and also by their other specific commitments under international law, such as the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (often referred to as Convention 108). Another example is the Convention on Cybercrime (also known as the Budapest Convention) with its procedural safeguards and human rights requirements; it might usefully be recalled that the Budapest Convention seeks the criminalisation of illegal access to and interception of data, and system interference. Both instruments, unique in their kind, are open to worldwide accession in the interest of harmonised standards and effectiveness in the protection of rights.
Where hard international law obligations and accountability do not reach, states’ obligations and commitments are sometimes explained in softer legal instruments. Examples can be found in Council of Europe adopted standards on the protection of individuals with regard to the automatic processing of personal data in the context of profiling, and on the protection of human rights with regard to search engines or social networks. There are also common standards, anchored solidly in Convention 108, on personal data processing and the police or employment, and attention has also been paid to the protection of sensitive data, such as financial, medical or other very personal or intimate information.
Where does that leave the possibility of taking advantage of big data for positive use in the common interest?

Melvin Kranzberg's fourth law of technology comes in handy: "Although technology might be a prime element in many public issues, nontechnical factors take precedence in technology–policy decisions." That is to say, it doesn't have to be accepted just because it is technically possible; public policy considerations and public interest imperatives may well limit, condition or shape the deployment or use of technology. But public interest must be understood in its true dimension, not in an opportunistic manner. That is why human rights must be an overriding consideration, underpinned by rule of law (or due process requirements), all against a backdrop of democracy and of good governance with its fundamental multistakeholder dimension.

The declared purpose of the now invalidated data retention directive was to enable the prevention, investigation, detection and prosecution of serious crime, in particular organised crime and terrorism. It commanded the building of data stockpiles in Europe. While its invalidation may not signal the beginning of the end of big data, a dispassionate reading of the judgment of the Court of Justice of the European Union (Joined Cases C-293/12 and C-594/12) foretells a significant impact on data processing, which in data protection terms encompasses the collection, preservation and exploitation of data – whatever the purpose. Paragraphs 26 to 28 of the judgment read:

"… the data which providers of publicly available electronic communications services or of public communications networks must retain, pursuant to Articles 3 and 5 of Directive 2006/24, include data necessary to trace and identify the source of a communication and its destination, to identify the date, time, duration and type of a communication, to identify users' communication equipment, and to identify the location of mobile communication equipment, data which consist, inter alia, of the name and address of the subscriber or registered user, the calling telephone number, the number called and an IP address for Internet services. Those data make it possible, in particular, to know the identity of the person with whom a subscriber or registered user has communicated and by what means, and to identify the time of the communication as well as the place from which that communication took place. They also make it possible to know the frequency of the communications of the subscriber or registered user with certain persons during a given period."

"Those data, taken as a whole, may allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained, such as the habits of everyday life, permanent or temporary places of residence, daily or other movements, the activities carried out, the social relationships of those persons and the social environments frequented by them."

"… even though, as is apparent from Article 1(2) and Article 5(2) of Directive 2006/24, the directive does not permit the retention of the content of the communication or of information consulted using an electronic communications network, it is not inconceivable that the retention of the data in question might have an effect on the use, by subscribers or registered users, of the means of communication covered by that directive and, consequently, on their exercise of the freedom of expression guaranteed by Article 11 of the Charter."

And paragraphs 34 and 59 add:

"… the obligation imposed by Articles 3 and 6 of Directive 2006/24 on providers of publicly available electronic communications services or of public communications networks to retain, for a certain period, data relating to a person's private life and to his communications, such as those referred to in Article 5 of the directive, constitutes in itself an interference with the rights guaranteed by Article 7 of the Charter."

"… whilst seeking to contribute to the fight against serious crime, Directive 2006/24 does not require any relationship between the data whose retention is provided for and a threat to public security and, in particular, it is not restricted to a retention in relation (i) to data pertaining to a particular time period and/or a particular geographical zone and/or to a circle of particular persons likely to be involved, in one way or another, in a serious crime, or (ii) to persons who could, for other reasons, contribute, by the retention of their data, to the prevention, detection or prosecution of serious offences."

The judgment explains that, in addition, the directive failed to put in place adequate safeguards as regards "access of the competent national authorities to the data and to their subsequent use" or against "the risk of unlawful access to that data" and for "the protection and security of the data in question in a clear and strict manner in order to ensure their full integrity and confidentiality."

As regards national security, the Council of Europe ministers responsible for media and information society stated in November 2013 that: "Any data collection or surveillance for the purpose of protection of national security must be done in compliance with existing human rights and rule of law requirements, including Article 8 of the European Convention on Human Rights. Given the growing technological capabilities for electronic mass surveillance and the resulting concerns, we emphasise that there must be adequate and effective guarantees against abuse which may undermine or even destroy democracy."
States are the duty bearers of human rights.

If big and indiscriminate data retention is not permitted for law enforcement or crime prevention, and "a system of secret surveillance for the protection of national security may undermine or even destroy democracy under the cloak of defending it" (Weber and Saravia v. Germany, 2006), where does that leave big data for commercial or other exploitation? Other cases pending before the European Court of Human Rights, Big Brother Watch et al. v. the United Kingdom (GCHQ) and Centrum för Rättvisa v. Sweden (FRA), may offer additional guidance.

It may be unsafe to rely solely on the argument of optimisation of user experience, or even the consent of users, given the dominant position of the entities seeking consent and the relative lack of choice of those giving it. Equally risky would be to argue the anonymisation of personal data incorporated into big data sets, which experts say is impossible given the ability to correlate data to form a precise picture of the individual behind the different elements.
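The experts' point about anonymisation can be illustrated with a toy linkage attack of the kind Latanya Sweeney demonstrated against supposedly anonymous US hospital records. In the sketch below (all data invented), a data set stripped of names but keeping quasi-identifiers such as postcode, birth year and sex is re-identified simply by joining it against a public register.

```python
# Toy linkage attack: invented data, for illustration only.
anonymised_health = [
    {"zip": "02138", "birth_year": 1945, "sex": "M", "diagnosis": "hypertension"},
    {"zip": "02139", "birth_year": 1982, "sex": "F", "diagnosis": "asthma"},
]

public_register = [
    {"name": "John Doe", "zip": "02138", "birth_year": 1945, "sex": "M"},
    {"name": "Jane Roe", "zip": "02139", "birth_year": 1982, "sex": "F"},
]

QUASI_IDENTIFIERS = ("zip", "birth_year", "sex")

def reidentify(health_rows, register_rows):
    """Join the two data sets on the quasi-identifiers shared by both."""
    for h in health_rows:
        for r in register_rows:
            if all(h[k] == r[k] for k in QUASI_IDENTIFIERS):
                yield r["name"], h["diagnosis"]

for name, diagnosis in reidentify(anonymised_health, public_register):
    print(f"{name} -> {diagnosis}")  # the "anonymous" record is personal again
```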
Where does that leave the possibility of taking advantage of big data for positive use in the common interest? The UN itself is using or exploring the use of big data for development and for humanitarian action: to combat hunger, poverty or disease, to provide relief where it is most needed and to allocate scarce resources in the most efficient way. Big, reliable data, produced in an unbiased and often unnoticed manner, as opposed to outdated statistics, can help in many ways. The OECD also sees in big data a source of growth and innovation, or knowledge-based capital, with considerable potential for scientific and medical research or even for public governance.

Are there pitfalls, or do we trust that by giving a positive label to the exploitation of big data, for example for public governance or for providing relief to people in a post-conflict situation, there is a guarantee that it will not be open to misuse, that data will be secure, that anonymity will be preserved? In the absence of certainty, a precautionary approach is justified, taking every measure possible to avoid unintended consequences: the burden falls to those taking the action.

Are all the answers to these questions case- or circumstance-neutral, or are, for example, the identities of the data controller and of those employed by the data controller relevant? The controller's arrangements for storing, securing and processing personal data are most relevant, as underlined in the abovementioned judgment of the Court of Justice of the European Union. It is easy to imagine and depict situations that go wrong, and be accused of scaremongering. After all, the data controller vows to do no evil, so what do we have to fear? And the national security services tell us that they are out to catch the terrorists, not to engage in industrial espionage or to capture private images of people who are not suspected of any wrongdoing.

The basic rules remain valid, and many countries have undertaken to respect them and to enforce them within their jurisdiction: necessity and proportionality, data minimisation, retention only when needed, deletion of data or removal from data sets upon request or as soon as no longer needed. Convention 108, with its modernisation process under way, is a common standard that can serve the Internet community as a whole. Effectiveness will require more accessions and strong follow-up arrangements, possibly extending to monitoring compliance not only by state but also non-state actors, and including the transborder dimension of data processing: data flows, storage and security, access, exploitation. There will be a need for further careful examination of human rights, rule of law and democracy considerations to strike the right balance in respect of big data for years to come.
Credit: Thomas Hawk | https://flic.kr/p/7FVN3j | CC BY-NC 2.0 | https://creativecommons.org/licenses/by-nc/2.0/deed.de
RESPONSES GOVERNMENT & PARLIAMENT
Switching Off the Age of Enlightenment? GEORG APENES, FORMER NORWEGIAN DATA INSPECTORATE
Norway is celebrating 200 years of having a written constitution. During the spring of 1814, a handful of presumably representative members of the general public gathered to draft it. After the US and France, Norway was historically one of the first nations in which individual citizens enjoyed full freedom and the sovereign power was regulated by law. But to be honest, most people were not particularly concerned with human rights 200 years ago. They went
about their everyday tasks in the fjords, in the barns and in the fields. Even though small libraries popped up regionally in Norway in the late 17th century, we have no indications that Norwegian peasants asked for literature on human rights in general or on the safeguarding of the individual right of self-determination by means of a constitution. Statistics are very rare in this field, but as far as I know ‘An Essay Concerning Human Understanding’ by John Locke, published in 1690, or the ‘Critique of Practical Reason’ by Immanuel
Kant, published about one hundred years later, were not general reading. Still, they laid the foundation for the age of European thought known as the Age of Reason or the Enlightenment. Giving philosophical impetus to the development of political institutions in Europe? Yes. Generating enthusiasm amongst the general public – the individuals who, little by little, would be given political power and become ‘voters’? No.

In 2013, Edward Snowden ‘happened’. Half the IT community was shocked. The other half commented that this was nothing more than was to be expected. As they said: sooner or later our globe will be completely transparent. Once again, we will try to think up systems or methods that may reserve a niche of privacy for a nation, a region or an alliance of nations sharing common values in their legislation or having signed the same treaty documents, but we very much doubt it is possible – after the Internet, multiple surveillance devices are omnipresent.
It was my own impression that these recommendations were delivered rather half-heartedly. The critics plainly pointed to the fact that the world before the Snowden incident certainly didn’t lack internationally binding legal documents. In short, US president Barack Obama knew what he was doing, being familiar with European legislation in this area. However, he decided not to respect it. That was what made Angela Merkel so mad!

And at the present stage of technological development, it is next to impossible to detect who is listening in, and where, and when. One day this may, however, be possible, and we may expect enormous resources to be invested in efforts to make eavesdropping virtually impossible… And this in itself will provoke still more efforts – to reconquer lost territories of accessibility! Meanwhile, we may consider what will happen when we as a nation discover that information collected from one individual may be of legitimate interest to other individuals, groups, researchers or governmental agencies.
To illustrate my point, I refer to what has happened in medicine in the last 20–30 years, where it is common to argue that giving up privacy or the protection of even very sensitive personal data may help others. It may also be argued that, for instance, insurance companies and social planners in health, schools and housing will find personal information interesting in their work.
Thus it is my humble opinion that we do not have the tools – at least not today – to protect individual privacy while at the same time giving personal information to projects and plans that are generally accepted as being constructed for the common good; defence being one example, medicine being another. In the US, there is an old saying that goes: “You can’t have your cake and eat it”. This is why nobody, as far as I can see, has yet come up with a scheme that allows both. The situation in 2014 is, I think, that we may have to choose: if it is possible to develop systems and technologies that may promote ‘classical’ privacy – which I myself very much doubt – let us still give it a try. If it is not possible, however, then let us note that George Orwell might well have been right in 1948 when he wrote in his novel 1984: “Who controls the past controls the future”.
RESPONSES GOVERNMENT & PARLIAMENT
Big Data and Big Government necessitate a paradigm shift PETRA SITTE, MEMBER OF THE GERMAN BUNDESTAG (THE LEFT)
In his book Digital Disconnect, Robert W. McChesney sees an increasingly symbiotic relationship between Big Data and Big Government, describing it as “a marriage made in heaven, with dire implications for liberty and democracy” (McChesney, p.21). He explains how a military–digital complex has been developed, made up of the government, the military and the secret services on the one hand and the Internet giants from the digital economy on the other. According to McChesney, they have a complementary and mutually beneficial relationship: the government benefits from being given access to technologies and data by the firms; the firms benefit not only from receiving large services contracts from the government, but also because they can rest assured that the government will not restrict their activities through anti-trust, taxation or regulatory measures and will represent their interests across the world.
The grim picture painted in 2013 by McChesney, who is Professor of Communication at the University of Illinois, was published prior to the far-reaching revelations made by Edward Snowden. Since then, acronyms and codes such as PRISM, Tempora and XKeyScore have become synonymous with the covert mass surveillance of the population on a previously unknown scale.

Though only a fraction of the Snowden documents has so far been published and we currently have to assume that Internet firms are generally forced by national laws to collaborate with the security apparatus in the states concerned, the existence of a convergence of interests cannot be dismissed completely. This convergence of interests stems from the automated processing of large amounts of data for the purposes of analysis, prediction and accessing of information on individual behaviour. Peter Schaar aptly defines the underlying concept as follows in his proposition: “big data represents a concept based on the idea of collecting as much data as possible – the more data that is collected, the better the concept works”.

In some areas of science this may even have positive effects. In medicine, for example, Big Data analytics is already bringing practical benefits in fighting infectious epidemics. Big Data makes improved prevention, earlier protective measures and more targeted treatment possible. Yet the knowledge gained through Big Data appears relatively superficial in comparison with scientists’ traditional quest for knowledge. In a talk given in Berlin recently, Viktor Mayer-Schönberger from the Oxford Internet Institute pointed out that Big Data allows correlations to be identified, but is rarely able to provide information on causality (see Sievers in Neues Deutschland, 22 April 2014). Yet finding answers to the question “why” remains the duty of critical science, and it is also the very essence of enlightenment, emancipation, innovation and progressive policymaking. Thus, the limits of Big Data also highlight a trend towards a totality of existing facts and make clear that the use of Big Data must be well thought through and must be subject to strong human oversight, i.e. oversight by society and policymakers. It is particularly important in this context to ensure a sensitive technology impact assessment, leading to legal provisions focusing on individual civil liberties and human rights.

Against this background, the words of Eric Schmidt, now Google’s Executive Chairman, give pause for thought.
Credit: TechCrunch | https://flic.kr/p/8EMbwv | CC BY 2.0 | https://creativecommons.org/licenses/by/2.0/deed.de
Speaking as Google CEO at a technology conference in August 2010, he suggested that the challenges of modern technology could only be tackled through “much greater transparency and no anonymity”. And he added that, in a world of asymmetric threats, “true anonymity is too dangerous” (see Fried in CNET, 4 August 2010).
Schmidt’s words described a vision of society in which the elimination of all kinds of anonymity on the Internet is viewed as an article of faith. What he omitted to say was that in a world of mobile communication, with robots, drones, intelligent household appliances and many other technologies, information about people’s current location is also generated. He also did not mention that the technologies needed for biometric body scans and analysis of humans have already existed for some time – whether they are used for voice or face recognition, fingerprints or handprints. It is already possible for the algorithm-based analysis of large amounts of data gained through smart devices to be combined with the logging of physical characteristics such as somebody’s way of walking and moving or their speech, heartbeat or breathing patterns.

The data processing industry dreams of complete transparency in order to allow the profitable marketing of individual patterns of behaviour through advertising, insurance services, transport guidance systems and other process management systems. At the same time, this transparency presents opportunities on an even greater scale for the security state whose existence has become particularly evident since the Snowden leaks. The security state would gain opportunities to predict (and thus observe and sanction) human behaviour with, at the very least, Orwellian dimensions.

Thus, not only are a large number of good detailed regulations needed, as proposed by Peter Schaar, but no less than a paradigm shift in data protection policy. We need a European area of data protection which is worthy of its name and which sets limits on the dreams of generating profits through Big Data at the expense of personal autonomy and identity, as well as on Big Government’s false promises of security.
Yet a European data protection area must not lead to re-territorialisation or balkanisation of the Internet, as Peter Schaar writes. A European Schengen routing system, for example, would mean the end of the fundamental principle of global connectivity. This would make it difficult to reject calls – like those already made in the framework of the European Commission’s Schengen deliberations during the Hungarian Council presidency – for checks on content entering the area at external borders. Instead, we must ensure an equally high level of protection both within and outside the European Union. And in the context of amendments to European data protection law, we must ensure legal security concerning data processing by international companies, and that the right to informational self-determination is taken into account.

One important step in this direction is to make the principles of “privacy by design” and “privacy by default” binding in Europe. Schaar only mentions this briefly in his article, yet it is worth explaining by means of concrete examples. The privacy by design principle would make certain functionalities obligatory, such as the default encryption of data, data deletion after the performance of a function and technical measures ensuring purpose limitation. The privacy by default principle would mean that the strictest possible data protection settings would apply as soon as people began to use electronic services and applications. Thus web services, smartphones, tablets and apps would not be able to pass on data on usage, contacts and location and accumulate it on server farms without users giving their consent (a minimal sketch of this default-off logic follows below).

In addition, the establishment of a European area of data protection will undoubtedly require the exertion of political and economic pressure. The means to achieve this are there. One important element is the renegotiation and, if necessary, termination of the SWIFT and Passenger Name Record agreements, along with the Safe Harbour Agreement, which has proved ineffective in practice. In addition, an initiative is needed to create a European open-source infrastructure developed by a large number of small and medium-sized companies with public financial support and with standards developed openly, publicly and transparently. This is the only way to create a trustworthy European counterweight to the dominance of large American Internet firms.
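To make the “privacy by default” principle concrete, here is a minimal sketch (my illustration, not from Sitte’s text; all names are hypothetical) of the default-off logic described above: every sharing option starts in its most restrictive state, and data may leave the device only after an explicit, purpose-bound opt-in.

```python
# Hypothetical sketch of "privacy by default": the strictest settings apply
# from the first use of a service, and consent is recorded per purpose
# (mirroring the purpose-limitation idea of "privacy by design").
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    share_usage_data: bool = False   # default-off, i.e. the strictest setting
    share_contacts: bool = False
    share_location: bool = False
    consents: set = field(default_factory=set)

    def opt_in(self, purpose: str) -> None:
        """Record an explicit, purpose-bound consent."""
        self.consents.add(purpose)

def may_transmit(settings: PrivacySettings, purpose: str) -> bool:
    # Without a recorded consent for this exact purpose, nothing is sent.
    return purpose in settings.consents

settings = PrivacySettings()           # fresh user: everything is off
assert not may_transmit(settings, "location-based-ads")
settings.opt_in("location-based-ads")  # explicit user action required
assert may_transmit(settings, "location-based-ads")
```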
Finally, I should point out that the mass surveillance of Internet communications is not only being carried out by the US National Security Agency (NSA) and the British Government Communications Headquarters (GCHQ). The French Direction Générale de la Sécurité Extérieure (DGSE), the German Federal Intelligence Service (BND) and other Western external intelligence services are also involved in this total surveillance. They are all responsible for surveillance of communications outside their own countries or between their own countries and others. Surveillance of communication within a service’s own country is largely taboo and the legal restrictions in place are similar in all cases. These services cooperate via a system of information swaps: they receive information on communications within their own country in exchange for information they have gathered on communications in another country. As surveillance of communications abroad is not subject to any restrictions and thus not to any oversight, the system always conforms with the national legal framework – at least from the perspective of the secret services and the governments which back them.

Were it possible to end this system of mutual favours, at least in Europe, by ensuring that all EU citizens were treated as domestic residents, one of the first important elements would be removed from the worldwide system of mass surveillance. This would lend the EU a much stronger position in negotiations with the US Administration on dismantling the surveillance system. Unless this happens, the impacts of Big Data and Big Government will indeed be fatal.
LIST OF REFERENCES:
Fried, Ina. “Google’s Schmidt: Society not ready for technology.” CNET, 4 August 2010. http://www.cnet.com/news/googles-schmidt-society-not-ready-for-technology/
McChesney, Robert W. Digital Disconnect. New York: The New Press, 2013.
Sievers, Uwe. “Verfolgt vom eigenen Datenschatten.” Neues Deutschland, 22 April 2014. http://www.neues-deutschland.de/artikel/930683.verfolgt-vom-eigenen-datenschatten.html
RESPONSES PRIVATE SECTOR
Responses Private Sector NICK ASHTON-HART, COMPUTER & COMMUNICATIONS INDUSTRY ASSOCIATION, GENEVA SUSANNE DEHMEL, BITKOM GERMANY GEORGE SALAMA, SAMENA TELECOMMUNICATIONS COUNCIL, DUBAI
Credit: Le.Sanchez | https://flic.kr/p/asAHVk | CC BY-SA 2.0 | https://creativecommons.org/licenses/by-sa/2.0/
RESPONSES PRIVATE SECTOR
The Internet is not incompatible with data protection, but the debate we currently have about privacy largely is NICK ASHTON-HART, COMPUTER & COMMUNICATIONS INDUSTRY ASSOCIATION, GENEVA
SUMMARY
Peter Schaar’s article “The Internet and Big Data – Incompatible with Data Protection?” is an excellent tour d’horizon of the debate we currently have, especially in Europe, about data protection. It also shows how incomplete that debate is, and exposes the false assumptions at its core, in three substantial ways:

1. The use by governments of personally identifiable information (PII), especially for national security uses but also much more broadly, is fundamentally different from use by the private sector of that information. Right now, there is very little discussion of how fundamental privacy protections, even in the EU, do not apply to large swathes of governments’ activities – governments who are often the first to complain about economic actors’ uses of the very same information. More profoundly, the use (and really abuse) of the Internet as a tool for data acquisition and analysis for national security is not a data protection, nor an Internet, problem: countries see foreigners as ‘fair game’ for data collection with no legal inhibitions at all. Mutual legal assistance treaty (MLAT)1 reform (amongst other activities) is the venue for solving the real problem here, not data protection laws.

2. All economic use of data is not the same. So-called data brokers2 (and more generally business-to-business (B2B) business models) have a strong economic interest in weak privacy protections, as they need to aggregate the maximum amount of personal information with maximum freedom to exploit it. The opposite is true of business-to-consumer (B2C) services, who need users to trust them lest these users switch to another competing service with better policies. The current debate largely lumps everyone together – which, ironically, serves the interests of the unscrupulous more than it does the responsible, by demonising everyone.

3. Competition law – itself based upon 20th century, or even 19th century, concepts – is very poorly suited to solving 21st century data protection issues by creating consumer choice through its application. This reflects a fundamental misunderstanding of competition law in the Internet age, and of the conditions that create innovation in technology-based services. Even a cursory look at how competition law has been applied in tech sector services, especially in Europe, shows it has done little more than create a billing bonanza for specialist lawyers on the one hand, and an ability for government competition authorities to ‘look busy’ on the other.

1 For an understanding of MLATs and why reform is important, see Access’ MLAT website at www.mlat.info.
2 There are many good sources of information about the activities of data brokers. A good snapshot, if rather US-centric, of the many issues with them can be found on EPIC’s website at https://epic.org/privacy/choicepoint/.
THE DIFFERENCE BETWEEN GOVERNMENTAL AND ECONOMIC USE OF PERSONAL DATA: NIGHT AND DAY
Put bluntly, governments can lock people up and throw away the key when it comes to national security – including indefinite detention without trial. Companies cannot. Companies can monetize personal data and, in the case of consumer-facing free-to-the-user services, target advertising. Abuses can and have resulted in lost jobs and in the release of very personal information – sexual orientation, pregnancy, and many other situations – and these deserve real focus. These abuses should not be allowed to obscure the truth that there is a vast power disparity between companies and governments, and there are consequences as a result of abuse of personal information by each.

To make matters worse, while the public debate about reform of data protection laws focuses almost exclusively on reforms to protect users from the actions of the private sector, little to no debate is taking place on the very large carve-outs in existing data protection legislation that allow governments to use personal information for all sorts of purposes with little to no disclosure, or any obligation to ask citizens for their consent, for any of it. This is particularly striking given the fact that there are spectacular abuses of PII by governments, and not just in national security realms: large amounts of PII have been made public through inadequate systemic security, plain carelessness, and even greed3. As recently as this past February, the UK’s National Health Service sold the records of 47 million UK citizens4 to the insurance industry.

Many governments are deliberately fostering the conflation of these two entirely different realities. There is a deliberate campaign by Western countries, especially those who are part of the Five Eyes surveillance treaty system, to conflate them in international discussions in Geneva. Representatives of these countries routinely object to any attempt to differentiate between the two in various multilateral fora in Geneva; I have personally witnessed this, and when I have attempted to highlight the differences, these countries have intervened to object5. It could be argued that US President Obama’s speech on surveillance on 17th January 2014 in part implied a conflation of these two concepts6.

To say that efforts such as this plumb the depths of politically-convenient expediency is an understatement – particularly for countries that tout themselves as paragons of freedom and democracy. We deserve better from national leaders and governments than cynical attempts at deflection of responsibility, given the clear history of their abusive behaviour – and these same governments’ sustained, and successful, campaigns to give themselves much more freedom to use our PII than they give to anyone else. It is disappointing that some of the most active advocates for privacy protections in civil society fail to call governments on this asymmetry and simply demonize the private sector, letting governments largely off the hook. Every stakeholder that comes into contact with PII should be held responsible for their use of it, and the accountability expected of each should relate to their ability to do harm when they abuse it. Right now, the opposite is largely true.

3 For just a few examples, and in just one country, this Wikipedia article is salutary: https://en.wikipedia.org/wiki/List_of_UK_government_data_losses. Even more egregious is the case of the UK’s National Health Service selling the data of UK citizens.
4 For mainstream reporting of this episode and its aftermath see http://www.telegraph.co.uk/health/nhs/10659147/Patientrecords-should-not-have-been-sold-NHS-admits.html.
5 This happened most recently on 14–17 April at a meeting of the Multistakeholder Preparatory Process for the ITU’s High Level Event (to be held this June), during a discussion of how or whether to address mass surveillance in the documents being negotiated.
6 The speech text in full may be found at http://www.whitehouse. […] The paragraph in particular that relates to this point reads in part “Corporations of all shapes and sizes track what you buy, store and analyze our data, and use it for commercial purposes; that’s how those targeted ads pop up on your computer and your smartphone periodically.” The announcement in the same speech of a high-level panel to look into personal data could easily be seen as […] on 1st May, largely ignores government use of personal information.

ALL ECONOMIC USE OF DATA IS NOT THE SAME
Schaar’s article dwells at some length on the responsibility of economic actors to protect PII – without once making clear that one size does not fit all. Moreover, his article does not address a fundamental question which should be asked if we want to understand the economic use of PII: what motivates different sectors to acquire PII, and what motivations do they have in using it?
Unsurprisingly, different parts of industry that possess PII have starkly different motivations – in some cases, these motivations are diametrically opposite. A case in point: so-called data brokers, companies whose whole business model is to aggregate as much information about each person as they can find and sell it on to third parties as many times as possible7. Contrast that with the other end of the scale: companies who provide services for free at the point of use and make money through advertising. Here, the motivation is to foster consumer trust lest these consumers leave for other services.
COMPLAINTS ABOUT PRIVACY POLICIES MISS THE BIG PICTURE
Schaar dismisses privacy policies with the statement “Extensive privacy policies which are as long as a novel, and which nobody reads, provide pretended transparency only”. This is facile and shallow. Why?
• It dismisses the fact that privacy policies exist to provide the individual with guarantees from the company they have provided their data to, and in return to limit the liability of the company hosting the service if it follows the rules it has promised to follow.
• The more detailed the privacy policies are, the more terms there are to enforce when companies violate them – and the more data protection authorities have to work with in cases of breach.
• While some companies may be motivated to hide abuse through obfuscation (the ‘data broker phenomenon’), long policies are a consequence of the fact that services are global and privacy laws national. Creating user contracts that are capable of multinational application guarantees complexity.
7 For an overview of data broker practices in the USA, a 15-minute segment on the popular news magazine show 60 Minutes entitled “The Data Brokers: Selling Your Personal Information” is worth watching. Find it here: http://www.cbsnews.com/videos/the-data-brokers-selling-your-personal-information/.
His points on the inadequate penalties available to data protection officials are well made. With the different motivations that exist, bad actors do need to be deterred by remedies that will actually be commercially painful, and the caps on fines in most jurisdictions are unlikely to be painful except, perhaps, to the SME sector.

COMPETITION LAW IS NOT A PANACEA
When the pace of technological change is as rapid as it is, the idea that competition law is the solution to online problems – with its traditional multi-year timelines and vast costs to all parties – is hard to credit. Here are a few issues with Schaar’s arguments:
• Schaar suggests that social media services do not allow portability of data between services, when the most popular do and have for years.
• He states that “…monopolistic structures restrict[ing] the user’s autonomy in Web 2.0 services…” when there are dozens of search engines, social media platforms, and the like, all free at the point of use – and all competition cases we have seen so far have related not to restrictions on users but to the relative treatment of competing services. The argument seems to suggest we penalise popular services for being popular.
• He suggests that “In order to prevent discrimination, a ‘right to connectivity’ should be considered as well”. It is unclear what services he is addressing here, given that a business model that seeks to prevent users from accessing services would be hard to sustain.

On the other hand, competition between firms on privacy policies is another thing entirely, and it is unfortunate that Schaar does not mention this: when you have multiple services in the same space (social media, search, etc.), users can actually look at privacy policies to see which services have policies they like best.
RESPONDING TO SURVEILLANCE
Schaar rightly identifies that the current situation with respect to data gathering by governments for national security services is damaging to economic development and corrosive of the foundations of democratic values. Unfortunately, his proposed fixes, with one exception, are likely to make things worse rather than better.

Firstly, the idea that a global agreement on data protection standards is desirable is extremely problematic: privacy protection legislation is under review in many countries worldwide and these regimes are likely to remain in flux for some time. Creating binding international norms in such a situation is at best a very slow and politically fraught undertaking.

Secondly, the suggestion that “secure routing of data packets” is desirable is far from true. Internet technologies are designed to allow each data packet to find the most efficient route between two points, to maximise performance and optimise efficient utilization of the network. Deciding that routes are ‘secure’ means instructing the network to route packets not based upon efficiency, but upon perceptions of security. If many countries were to do this, it might well impact the stability of the network as a whole and would almost certainly result in a more expensive and slower network for everyone. If that were not bad enough, there is no ‘secure’ route vis-à-vis surveillance; security services globally are gathering traffic from far beyond their borders. The idea posited in public by many countries that ‘routing around’ certain countries protects their citizens is badly misinformed. (A toy illustration of the efficiency cost follows below.)

Schaar is correct that encryption will frustrate mass surveillance systems’ ability to directly access communications ‘on the wire’ without going through due process – and it can, depending upon the type of communication, frustrate access to the contents of communications demanded of online services through due process. All of that, however, is treating the symptom rather than the core problem.
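The efficiency cost of politically constrained routing can be illustrated with a toy shortest-path computation (my sketch, not Ashton-Hart’s; the countries and latencies are invented). Excluding a single transit node forces traffic onto a measurably slower route – the effect that, multiplied across real networks, makes ‘secure routing’ expensive for everyone.

```python
# Toy illustration: Dijkstra over invented inter-country links. Excluding a
# transit country ("secure routing") yields a longer, costlier path.
import heapq

links = {  # hypothetical link latencies in milliseconds
    "DE": {"GB": 15, "FR": 10},
    "GB": {"DE": 15, "US": 70},
    "FR": {"DE": 10, "US": 95},
    "US": {"GB": 70, "FR": 95},
}

def latency(src, dst, excluded=frozenset()):
    """Cheapest path latency from src to dst, skipping excluded transit nodes."""
    queue, seen = [(0, src)], set()
    while queue:
        cost, node = heapq.heappop(queue)
        if node == dst:
            return cost
        if node in seen or node in excluded:
            continue
        seen.add(node)
        for nxt, ms in links[node].items():
            heapq.heappush(queue, (cost + ms, nxt))
    return None

print(latency("DE", "US"))                          # 85 ms via GB (efficient)
print(latency("DE", "US", excluded={"GB"}))         # 105 ms via FR (slower)
```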
If you want to really resolve issues with data protection in a national security context, then reform of MLATs may provide the best option at a practical level, since that could allow a balancing of the interests of security, law enforcement, human rights, and transparency that would also restore trust. There is something of value in reform for those responsible for national security, too: if everything is encrypted, accessing the tiny proportion of electronic communications that really is necessary for them to do their job will become very difficult. Truly interoperable MLATs could allow socially acceptable access across multiple borders to the communications of real terrorists and criminals more quickly and efficiently than is presently possible.

The present concept of security services seems to lead to covering the earth with secret data centers, as each country willing to spend the fabulous sums involved tries, in vain, to store and then sift through all electronic communications by everyone – an Orwellian surveillance arms race. This is not sustainable from any perspective.
RESPONSES PRIVATE SECTOR
Credit: r2hox | https://flic.kr/p/gdMrKi | CC BY-SA 2.0 | https://creativecommons.org/licenses/by-sa/2.0/
Modernizing data protection along with data processing technologies SUSANNE DEHMEL, BITKOM GERMANY
In his article, Peter Schaar raises a great many questions that keep coming up with regard to the use of the Internet and big data, and generally in relation to the development of our digital society. The digitalization of our lives leads both to new dimensions in the amount of data processed and to new dimensions in our ability to process and access data. We need new methods of processing in order to cope with the existing and ever growing amounts of data we produce. This efficient way of processing data doesn’t only help us to keep our familiar analysis processes going, but, fascinatingly, also opens up completely new forms of analyses that enable us to implement formerly unthinkable scientific and economic applications. This means we have to re-balance interests. Therefore, we struggle with the question of whether we are able to incorporate new technologies into existing categories of data protection laws without losing too much of the scientific and economic potential of these technologies on the one hand and legal clarity on the other – or whether we need to rethink some data protection laws and principles in order to preserve or transfer the effective protection of our private sphere and our freedom of action in the future.
Not only does Schaar name a number of questions that arise; he also outlines some of the answers to them. A very important one is the setting of incentives for processing data in anonymized and pseudonymized form in order to keep the intensity of interference with basic rights and the risk of misuse as low as possible. The option of anonymized data processing is also significant for big data applications, as consent and purpose limitation can constitute barriers for applications that were not foreseeable at the moment of data collection or where there is no possibility of getting consent from the data subject. The note on the German Telemedia Act (Telemediengesetz) and its definitions of “anonymous” and “pseudonymous” is helpful with a view to the ongoing consultations on the EU data protection regulation. It seems that there is no common understanding of these notions between member states; yet it would be very helpful to agree on the definition of these terms, also with regard to third countries.

But if we ask for the increased use of anonymized and pseudonymized data as a means of risk limitation, we should be aware of the fact that the time and effort needed to achieve pseudonymization and anonymization must be feasible and adequate in relation to the risks. At the same time, it is also quite clear that with growing amounts of data about us available, it becomes significantly more difficult to anonymize data in a way that cannot be reversed by anyone anywhere, now and in the future. Incentives need to be offered for companies to set up safe surroundings – advanced anonymization technologies together with organizational measures. These incentives must be strong enough to motivate companies to undertake this effort. It must be possible to escape the limitations of data protection law once all links relating information to a person have been removed from it. It should also be possible to handle pseudonymized data flexibly when there is no indication of undue negative effects on the data subjects’ interests. Therefore, we definitely need an international concept with a relative definition of anonymity, as we already have in German law, and it should be enhanced with provisions for the privileged handling of pseudonymized data (a minimal sketch of the basic mechanism follows below).
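To make the pseudonymization incentive tangible, here is a minimal sketch (my illustration, not from Dehmel’s text; the key handling is hypothetical) of one common building block: a keyed hash replaces the direct identifier, so records remain linkable for analysis while re-identification requires a separately stored secret. Deleting that key is one practical step from pseudonymity towards the ‘relative’ anonymity discussed above.

```python
# Hypothetical keyed pseudonymization: the same person always maps to the
# same token, but the token reveals nothing without the secret key.
import hashlib
import hmac

SECRET_KEY = b"store-me-separately-and-rotate-me"  # invented key management

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier (e.g. an email address) with a stable token."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"email": "alice@example.org", "purchases": ["books", "tea"]}
record["email"] = pseudonymize(record["email"])
print(record)  # linkable across records, but not directly identifying
```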
Schaar also names a procedural instrument that should help to balance risks and interests of data processing within companies or other responsible organizations: privacy impact assessments (PIAs). From an industry perspective, this could become useful as we know similar instruments function well in other areas, such as compliance departments. In order to introduce, or rather extend, impact assessments concerning privacy, a legal obligation to use them might be helpful. But the law should not be too detailed on how these PIAs are to be conducted. There needs to be flexibility with regard to the detail and depth of a risk assessment, depending on the nature and context of the data processing process or product and the privacy risks that can be expected. Best practices should be collected. Industry could work together with stakeholders who represent the interests of data subjects to develop standards for characteristic procedures and contexts in a process of self-regulation or co-regulation.

Such risk assessments can help to realize the concept of “privacy by design” in all privacy-critical processes and products and combinations of them. This concept is important for an effective realization of data protection in an economic way. The goal is to keep data protection in mind from the start and also while developing new products or services. Thus it is possible to implement measures that keep privacy risks to a minimum and/or give consumers and companies the opportunity to consciously decide upon the use of their data with regard to a certain service or use of a product.

The concept of privacy by design is, in my view, more important and more helpful than the concept of privacy by default, as the latter only concentrates on the stage of delivering the product or service and automatically puts data protection at the top of the consumer’s priority list, thus tending to patronize him. Quite often, consumers have to decide between optimum user friendliness/convenience and optimum data protection, as both might not be achievable at the same time. Using social networks as an example, some users want to be found by as many matching contacts as possible because their business might depend on these contacts. But others only want to use the network in order to share information with people they already know. Both groups of users expect to be able to do so in a convenient way without much bother beforehand, but their preferred privacy settings will be different. Privacy by design should mean combining optimum convenience with optimum privacy, if possible – but if this is not possible, consumers should be given the choice of different options with different advantages and disadvantages. Plus, providers must explain to the customer what these advantages and disadvantages are.
The instrument of “data portability”, also mentioned by Schaar, only makes sense for services where users administrate and/or display a lot of their own data and where the inability to extract this data in a fairly convenient way would have a prohibitive effect on changing service providers. It is not a data protection issue but rather a competition issue and should be treated as such. An undue extension or application of a so-called right to data portability to other services, such as online shops, or to data controllers in general could unduly burden many of these data controllers and might lead to other competition problems while not bringing any advantages in terms of data protection.
Transparency is the keyword to which companies (and governments) should feel obliged and bound, as it is the basis for a fair relationship with customers and any other person whose data will be processed. Schaar is right in asking for policies that can be consumed in a reasonable time and manner. But to ask for this is still easier than to comply with legal provisions and, at the same time, reduce the information to the bits relevant for the consumer (who generally cares about data protection, but in many cases still cannot be bothered to go into the technical details of the services he is using). It is a complex task to set up global processes and products to ensure that they comply with all the diverging provisions in different states and with different purposes (security, fraud prevention legislation, civil law, data protection laws, etc.) and at the same time to have unified and transparent processes that can be easily explained to each customer. Nevertheless, companies need to earn and retain the trust of their customers, no matter if these are other companies or consumers. Trust is built up if you feel that your partner and his work are reliable and his actions are foreseeable. Therefore, companies need to work on their transparency in the form of fair communication to and with customers on the basic ways of handling their data. A predictable way of handling customer data might become one quality aspect of the products or services companies want to sell.
But all industry’s efforts to act as trustworthy data controllers might be thwarted if member states of the EU and third countries do not play according to the same rules that they impose on companies. The rules for intelligence services and other authorities that might want to access user data for some kind of security reasons should also be as clear and as transparent as possible. This is a difficult task for each government internally and in relation to foreign partners, as it might mean limiting its own power and information advantage. But a balance of security interests and an interest in the freedom of action of the individual has to be found. Otherwise, governments might not only face democratic problems but also economic deficits in the long run. Calls on the legislator to draw the exact boundaries between acceptable and unacceptable use of personal data are valid for the actions of private individuals as well as for state authorities. Just as companies need clear rules within which they can act, state authorities and intelligence services need such rules. In both cases, the existence of clear and transparent rules also allows effective enforcement through adequate sanctions.

Despite the challenges we are facing, I think Peter Schaar is saying yes, the Internet and big data are compatible with data protection. And again I agree with him. Combining both is feasible if governments and industry make an effort – in defining transparent and fair rules for companies and authorities on the one hand and for data subjects on the other, in developing new technical and organizational measures that fit new data processing techniques, and in keeping up a factual social discourse on how we want to live in our digital world.
Big Data: An Opportunity Combined With Privacy Concerns. A Regulatory Perspective
Azrieli Center Ksenia Smirnova | https://flic.kr/p/bcGW4x CC BY 2.0 | https://creativecommons.org/licenses/by/2.0/
GEORGE SALAMA, SAMENA TELECOMMUNICATIONS COUNCIL, DUBAI
“Big Data” is a term currently trending within the ICT industry and coming into sharper focus, especially due to privacy and security concerns. With the incredible growth of data produced and consumed by Internet users through the different types of social networks and mobile apps, the need for a strong but flexible legal and regulatory framework is becoming ever more pressing. Such regulatory measures should be developed in a smart manner that on the one hand gives incentives to Internet players and telecom operators to explore new revenue streams lying behind Big Data, and at the same time grants the basic Internet user a secure, private cyberspace.

The first aspect to be taken into account while formulating a Big Data policy framework is not to hinder investment: operators and Internet service providers should be able to easily explore and tap into new revenue streams, such as the ones resulting from Big Data. In the Arab region, operators are under pressure resulting from strong regulatory interventions, such as high licensing fees, new spectrum costs, hidden taxes, and royalty fees. This massive pressure in terms of CAPEX is accompanied by strict regulatory obligations to provide high quality services at affordable rates. Regulations on international roaming flat rates are a clear example of the extent to which telecom operators are facing political and regulatory pressures, which is negatively affecting their revenue growth. Therefore, the Big Data concept, which is described in the article as “new oil”, should be regulated while bearing in mind that it is a new revenue generation opportunity for telecom operators and service providers. If Big Data is the “new oil”, then broadband, which is provided by operators, is the vehicle through which this new oil is consumed. For operators to be able to provide such super-speed, reliable broadband, the question of their sustainability should be placed first. The role of governments and regulators is crucial in formulating and putting in place a clear set of industry-friendly policies; such policies are crucial in identifying the methodology and level of data utilization, data quality, access, and preservation.
Secondly, and most importantly, there is the question of privacy, security, and data protection when it comes to Big Data regulations. The point raised in the article that social networks’ privacy settings are switched off by default reflects a key concern that needs to be revisited when setting policies and regulations for Web 2.0 services. Privacy settings
should be developed and displayed in a very simple way that enables the basic user to adjust his privacy and security preferences in a straightforward manner. The traditional way of displaying the terms and conditions of any new Internet service subscription is another concern that requires simplification and redesign: instead of long pages full of text in a tiny font, a video tutorial could be made available in different languages, for example. Another interesting example is the management of the user’s online assets, personal data, email accounts, and social media profiles after death. Google has recently introduced tools that apply to all Google-run accounts, including Gmail, Google+, YouTube, Picasa and others. Users have the option to delete data after a certain period of time or pass their data on to specified people.
No one can deny that privacy is a cornerstone for setting up any Big Data policy framework, but it is also important not to let such concerns hinder innovation. The opportunity resulting from data mining and analysis across different sectors is creating a deep positive impact on the overall national economy. Government has an important role to play in encouraging Big Data use in fields including health care, education, road safety, weather forecasting, financial reporting, mapping, and macroeconomic forecasting. Collected once and used many times is the most efficient methodology in the adoption of the Big Data concept. This saves time, processing power, and cost; therefore, government and the private sector need to be aligned and synchronized when it comes to sharing data and information, while maintaining a certain level of privacy and transparency. At the same time, there should be a clear distinction between data collected and processed by government agencies or private sector entities on a national level and data collected as a result of international bulk data transfers. It is very well said in the article that: “It is unacceptable that governments and intelligence agencies are abusing the increasing international data transfer for bulk access to the transmitted data”. The question of “trust” when using a new Internet service is under major threat, and for this trust to be rebuilt, both technical and policy solutions need to be implemented. Data encryption, secure routing, and IPv6 adoption are among the technical solutions being considered. International agreements, regional cooperation, dispute resolution mechanisms, and commercial settlement processes are examples of public policy considerations.
RESPONSES CIVIL SOCIETY
Responses Civil Society
STEPHANIE PERRIN, NON-COMMERCIAL STAKEHOLDERS GROUP AT ICANN RAFIK DAMMAK, MEMBER OF THE STEERING COMMITTEE, INTERNET RIGHTS & PRINCIPLES COALITION, TOKYO LORENA JAUME-PALASÍ, INTERNET & SOCIETY COLLABORATORY, BERLIN
The Internet and big data – incompatible with data protection? We don’t think so! A civil society perspective STEPHANIE PERRIN, MEMBER OF THE NON-COMMERCIAL STAKEHOLDERS GROUP AT ICANN
Credit: BobMical | https://flic.kr/p/jgVkon | CC BY 2.0 | https://creativecommons.org/licenses/by/2.0/
The subject of “big data” can be a depressing one for privacy and human rights advocates, for many reasons. In the first place, it is not well understood by the public or by civil society. Secondly, the prominence of data analytics in the business plans of many of the biggest global IT players and their customers makes it difficult for civil society to fight these elements since analytics are promoted as driving the Internet economy. The value of big data should not be accepted as a given; analytics still have to prove their societal value. Google Flu Trends, the original big data poster child, was later shown to be a failure. It appears, though, that risk assessment and market sorting using big data is highly addictive. The ability to predict future behavior has entranced humanity since our earliest days and a shiny new tool that promises to reveal more about risk will be a difficult bauble to part with, for both the private and the public sector. We must, however, pay attention to the consequences. Oscar Gandy wrote about this in 1993, predicting “the panoptic sort is an antidemocratic system of control that cannot be transformed because it can serve no purpose other than that for which it was designed – the rationalization and control of human existence” (1993, p.227). Peter Schaar delivers a similar message, grounded in his struggles as a data commissioner, to deal with these new technologies using old legislation and feeble powers. His vision
of a “balkanized Internet” that enables nation states to ignore the promise of freedom that the Internet brings, and engage in state control and censorship, is a sobering one.
Schaar gives us many insights in this article as to how big data challenges traditional data protection techniques and principles. He focuses on the role of purpose in data protection, discussing how purpose is less applicable in data analytics, which looks for correlations in data that are unrelated by purpose. Purpose of collection is, nevertheless, still valid as a data protection principle – why would anyone fighting for human rights abandon specific, legitimate purposes for data collection and use? He argues that the privacy community must figure out how to move from the micro perspective, focusing on each individual data element, to a macro perspective, governing the conduct of those who have masses of data and want to punch it around and see what it tells them. In the past, some jurisdictions have relied on consent of the individual to address these issues. Does it still work? If
so, why do these different actors have so much data? Did we consent to all that collection? The answer could be yes. But was that collection proportionate? Was the purpose relevant? Was secondary use transparent to the data subject? If we don’t ask these questions, then we face the American Wild West, where anything goes.
Admittedly, much successful data mining is done with relatively non-sensitive data. That’s better, but it doesn’t eliminate all concerns. After all, if grocery purchases or Internet surfing can predict race, health conditions, or criminal tendencies, we could have the worst of both worlds. Gandy spent much of his career looking at racial discrimination through the lens of information practices. As he points out (2009), discrimination against a group can be accomplished without gathering identifiable information. Dixon and Gellman recently published a report (2014) on consumer scoring with a detailed analysis of what can be done with identifiable and non-identifiable data already held by data brokers and others. The United States government released a report on big data on April 30, 2014, which states: “A significant finding of this report is that big data analytics have the potential to eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education, and the marketplace. Americans’ relationship with data should expand, not diminish, their opportunities and potential” (iii, 51-53, 64-65).

This is one of the problems with the new paradigm of “big data”. Anonymity in the cloud of data does not stop the identification of factors useful for individual profiling, and it can mask invidious discrimination. The privacy of groups is something not well covered in current data protection regimes, and several innocuous data elements can now place an individual in a group that has an accurate predictive profile. This goes against the principle of informational self-determination that is at the core of privacy protection, and it may defeat antidiscrimination laws because it can select for race (or other factors) without ever specifying those factors overtly (the toy sketch below shows the mechanism).
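A toy example (mine, not Perrin’s; the data and the “score” are invented) shows how a system that never sees a protected attribute can still sort people by it, because an innocuous variable acts as a proxy:

```python
# Hypothetical proxy discrimination: a score computed only from innocuous
# features still splits cleanly along a protected attribute, because postal
# district correlates with group membership in this invented data.
from collections import defaultdict

# (postal_district, bought_product_x, protected_group)
records = [
    ("A", True, "g1"), ("A", True, "g1"), ("A", False, "g1"),
    ("B", False, "g2"), ("B", False, "g2"), ("B", True, "g2"),
]

def score(district, bought):
    # uses only "innocuous" inputs; the protected group is never consulted
    return (2 if district == "A" else 0) + (1 if bought else 0)

averages = defaultdict(list)
for district, bought, group in records:
    averages[group].append(score(district, bought))
for group, scores in sorted(averages.items()):
    print(group, sum(scores) / len(scores))  # g1 scores high, g2 scores low
```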
Gandy returned somewhat more optimistically to this issue in 2009, in Coming to Terms with Chance: Engaging Rational Discrimination and Cumulative Disadvantage. Like Schaar, Gandy is convinced that the time to unite against surveillance and discrimination is now. He has little faith in data protection as a solution; perhaps one can forgive an American scholar for reaching that conclusion. One of his proposals is to set up a body similar to the Environmental Protection Agency, to investigate harm that comes to groups or individuals from data mining and risk profiling. It’s a thought. Civil society as a rule, however, is not going to accept a model of data protection that abandons human rights in exchange for proving harm. Most consumers have no idea about the collection and use of their data. Asking them to prove how the data was used, let alone whether it produced harm, is not a practical or fair response.

What to do? Firstly, let me say that civil society in North America and indeed the world still looks to Europe and its data commissioners for leadership in privacy protection. We depend on the Article 29 Working Party to comment on developments and technology, and we look to Germany to defend the right of informational self-determination. Never give up! Civil society in Europe has been thoroughly engaged in the debate about the new European data protection regulation and will continue to press for strong rules. It may be, however, that the traditional approach is not enough. If big data is something new (and maybe it isn’t really new), then we need new privacy protections for it. Perhaps civil society can help think out of the box and come up with new concepts of how to control this technology. Those of us engaged in the privacy struggle over many years often reflect ruefully that we have been unsuccessful in stopping tracking via cookies and web beacons, in limiting video surveillance, in promoting privacy enhancing technologies (remember PETs?) and in making sure that legislation comes with teeth, resources, and fearless data protection commissioners.

At this inflection point in Internet governance debates, we have a real opportunity to push for stronger protections for human rights and privacy. If we care to take up the challenge that Schaar has issued – to join together to protect the core values of western democracies – here are a few possible initiatives:

1. Recognize that European civil society has much to discuss with its global partners, including technology activists such as the Electronic Frontier Foundation, TOR, and the Australian Privacy Foundation. We all need to redefine goals for 21st century data protection legislation. Workshops or webinars that get down to the details of what to look for – new approaches to the old principles – would be a start. Funding to talk about meeting the challenge of big data would be welcome.
2. Those engaged in the Internet governance debates need to demand that organizations such as ICANN, the IETF, IANA, and the ancillary registries meet strong levels of data protection, possibly through binding corporate rules. Technological infrastructures must facilitate privacy protection and anonymity and must not allow the Internet to be the source of unregulated information for undefined big data activities.

3. Support for the existing data protection law and for the Commissioners who struggle to enforce it remains fundamental. Civil society and the commissioners are logical partners and could find more ways to work together without compromising the integrity and independence that each must maintain.

4. Education. Citizens need to understand the technologies of surveillance and how to use different technological tools to protect themselves. Civil society can and does help with this, but it is time to step up the effort.
5. Maybe Gandy has a good idea, and maybe the Big Data Protection Agency needs to be a part of the global Internet governance structure? The Internet is global and we want to keep it that way. This is a challenge for data protection, particularly if the collection instruments for big data are distributed across applications and platforms. We must start thinking about how human rights could be enforced in data analytics, and abuses remedied. The promise of the Internet as a tool for individual selfdetermination, for growth and development, must not be snuffed out. Privacy is a fundamental human right, and protecting it in the face of technological advances requires optimism, determination, and teamwork. Big data can be corralled, and it will be when more people understand it and unite behind the effort. Peter Schaar’s article points to some of the places where we can make a start.
Credit: _Bunn_ | https://flic.kr/p/9i7ZK2 | CC BY 2.0 | https://creativecommons.org/licenses/by/2.0/
The need for versatility in data protection
RAFIK DAMMAK, MEMBER OF THE STEERING COMMITTEE, INTERNET RIGHTS & PRINCIPLES COALITION, TOKYO
Big data is an evolution that brings new opportunities for businesses, but without a clear benefit for users. It is an example of the evolution of the technological threat against privacy. However, it is still difficult to apprehend and to separate the hype from the reality of this trend. At the same time, we have to preserve and protect the right to privacy, regardless of any changes. There is a duty to do so even when things are still fuzzy and changing quickly. But the hype may distract or prevent us from enforcing effective regulation. We definitely need a different approach to respond to the challenges of big data, and of endless technological progress in general, while guaranteeing privacy. Firstly, recognizing the limitations of the tools we have now, as outlined by the author, is a prerequisite to finding the appropriate answers. Secondly, I would like to suggest a different perspective as a member of civil society and, perhaps more relevant here, as a software engineer concerned with building and designing sustainable systems. Therefore, I would like to use software engineering metaphors and principles. Big data is enabled by the progress of technologies such as the Internet of Things and cloud computing,
in addition to the current collection of data by different IT systems, regardless of their primary purpose or usage. However, existing data protection regulations and frameworks were designed to cope with “legacy”, traditional IT systems that process personal data for specific purposes. Big data made a breakthrough by using massive data from different sources, new technologies and platforms, and by adding more advanced algorithms and statistical and mathematical models.
An effective and enforceable regulation needs to keep pace and stay aligned with what is regulated. Let’s make a comparison between the regulation and the issue to be regulated. Big data or cloud computing, for example, are about scalability and planning for the continuing expansion of storage, computing, networking resources and data, while regulation tends to respond case by case, in an ad-hoc manner, to anything new (and usually when it is already too late). It is more a reactive than a proactive approach. What we can conclude is: existing laws fail to “scale out”, to be interpreted or adapted to cover new use cases, new technologies and applications adequately. An effective and enforceable regulation needs to keep pace and stay aligned with what is regulated. So, can data protection and privacy borrow the scalability principle in order to be able to handle the next technological threat to privacy? Can data protection regulations be conceived and designed to be effective many years ahead? Can data protection laws be built iteratively and be evolutionary? Can the data protection framework cope with evolving threats to privacy? Yes, it is possible: if we “design” laws to include new cases, just as a system can add a new component. It is possible if we make regulation iterative. A new law is already a “legacy” product when it starts to be applied. “Upgrading” data protection laws is a continuous effort; principles remain but responses change. The author advocates having a strong and flexible data protection legal framework, but that is far from being enough. Data protection authorities need to be able to predict changes and to plan for responses when needed. Using the software metaphor again, data protection must be updated regularly with new features to cope with new realities, and the designer – or the legislator in this case – needs to do so often. On the other hand, even if the laws, regulations and legal frameworks are in place, the question is about the capacity and readiness of data protection authorities. What about the skills needed in the staff who are supposed to comprehend, understand and enforce the rules on diverse applications or instances of big data? Even businesses are having a hard time finding data scientists and big data experts to build and use those oceans of data. Without adequate expertise, data protection agencies or commissioners will be unable to detect new infractions and irregularities. The author advocates privacy by design and giving control to users. Again, it is not enough. It is not just a problem of awareness or knowledge about privacy rules and practices among developers – it is about forgetting who matters: users. In agile software development, everything is about users. But when it comes to data usage and processing, companies and start-ups tend to focus on their business models and ignore the fact that they need to build user-centric and friendly systems first. Being user-centric must be systematic. When the focus is shifted to the right spot, privacy by design practices will be effective.
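To make the engineering metaphor tangible, here is a minimal, purely illustrative Python sketch. Every name in it (the registry, the rule contexts, the checks) is invented for this example and drawn from no real statute or system; it simply shows the “scale out” property described above: new processing contexts plug in without rewriting the rules that already exist, and contexts with no rule yet show up as explicit gaps.

```python
# Illustrative only: a rule "registry" that scales out the way the author
# describes - new processing contexts plug in without touching existing rules.
# All names (registry, contexts, checks) are invented for this sketch.
from typing import Callable, Dict

# Each rule maps a processing context to a verdict; the registry is the "law".
registry: Dict[str, Callable[[dict], bool]] = {}

def rule(context: str):
    """Register a compliance check for one processing context."""
    def decorator(check: Callable[[dict], bool]):
        registry[context] = check
        return check
    return decorator

@rule("legacy-database")
def purpose_bound(processing: dict) -> bool:
    # The classic principle: data may only be used for its stated purpose.
    return processing.get("purpose") == processing.get("stated_purpose")

@rule("big-data-analytics")
def aggregate_only(processing: dict) -> bool:
    # A later "upgrade": bulk analytics must not single out individuals.
    return processing.get("aggregated", False) and not processing.get("profiling")

def compliant(context: str, processing: dict) -> bool:
    # Contexts without a registered rule are flagged, not silently allowed -
    # the reactive gap the author criticizes becomes visible.
    check = registry.get(context)
    return check(processing) if check else False

print(compliant("big-data-analytics", {"aggregated": True, "profiling": False}))  # True
print(compliant("internet-of-things", {}))  # False: no rule yet, the gap is explicit
```

The point is the shape, not the code: the principles stay stable while responses are registered incrementally, and whatever is not yet covered is visibly a gap rather than silently permitted.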
Finally, this raises a missing question that we need to answer: how much innovation is permitted without undermining or threatening privacy? We must avoid framing privacy as antagonistic to innovation, or lowering the standards of data protection to match the innovation. In fact, innovation should be thought of as a way to improve data protection, to strengthen privacy as a right on the Internet. Any innovation failing to do so is only a regression of rights and a backward move. So, can data protection regulations make big data an innovation that benefits users and citizens?
Is data protection becoming incompatible with communication? LORENA JAUME-PALASÍ, INTERNET & SOCIETY COLLABORATORY, BERLIN
Big Data is a product of human thinking and its natural avidity for knowledge. The principle of data “abundance” is nothing new to mankind. Since the very beginnings of civilization, humans have collected and conserved culture, traditions and wisdom in libraries and archives. The impetus to store operated with different criteria and categories for classifying the data. People would try to collect things which seemed, at first sight, unstorable and figure out ways to make them collectable: songs would be stored by means of words and notes in manuscripts, historical moments would be collected in words and pictures, and so on. No information was irrelevant: from objects used in daily life to discussions held in the agora to sacral books, archives would welcome all pieces. Access to these collections was in ancient times restricted to the political elite: the library and state archives of the Epang Palace were reserved for the Chinese emperor, the library of Alexandria for the Pharaoh. In the course of time, human acquisitiveness towards information did not change, but access to it did – and this brought both political turmoil and changes on the path towards more democratic structures, even in those times in history when no right to privacy existed.
Human acquisitiveness towards information did not change, but the access to it did. The better and more automated the storage techniques, like the printing press, the more information was stored. The logic of the Internet follows exactly this impetus and, as
all new technologies that have come along throughout history have done, it gave this impetus to store a boost. Peter Schaar falls into the trap of technological mythologization when he characterizes Big Data as the “new oil”, with its novelty supposedly lying in storing for the sake of storing. Information was always power – this is why, in earlier, non-democratic times, access to it was restricted to elites. And storing information has always been innate to humankind – for the sake of knowledge. Storage was the first step towards experiment; next came data correlations, while possible explanations were a third step. This process has been one of the scientific methods by which polymaths and humanists have understood nature for centuries. Collect first, correlate afterwards, and subsequently search for possible explanations based on the correlation. Peter Schaar also stops short of carrying the history of the juridification of data protection through to the present day. Hence, I will continue with the description of the historical evolution of data protection law and then proceed with the problems that Peter Schaar’s demands would pose. With the evolution of the Internet, automated data processing became a practice not only for governments and big companies, but for all. With the use of smartphones, social media, emails, apps, etc., individuals also process data automatically every day, both deliberately and unknowingly. Every conversation or statement on Facebook is data. Every Foursquare check-in, with or without mentioning friends, is data. Pictures, videos, Likes, the sharing of a status or photograph: all of it is data.
What nowadays is understood as communication is also categorized data. This does not only mean that the original purpose of data protection – to protect citizens from the asymmetric power of data-gathering governments and big companies – has changed, but also that the concept of automated data processing itself has changed. Automated data processing is now a constitutive part of modern communication. Moreover, in the countries where data protection laws were drafted, there already existed laws regulating freedom of expression – and thus communication. So, suddenly, these countries have two different corpora of law regulating the same thing. The concept of data protection was, from a philosophy of law point of view, semantically misleading and technically problematic from the very start. The object of juridical protection was not clear. If the right to privacy was already contemplated as a distinct right, what then is the object of the juridical protection of data protection? Data? For what sake? What are the values that had to be strengthened and the harms that needed to be averted? Data (in Latin, “something given”) is per se a neutral product of human action and, hence, in principle neither good nor bad. I agree with Peter Schaar when he states that Big Data constitutes a big challenge for data protection: these new technological developments prove how inadequate those principles were and continue to be.
This is neither how the Internet functions, nor how human beings are. Law should not regulate on the basis of technology models or novelties, but on the basis of principles. However, technologies may be used as a litmus test to expose the (in)adequacy of a regulatory principle. This is precisely what new technologies like the Cloud, the Internet of Things and Big Data are doing: Peter Schaar is stuck in the old paradigm of automated data processing. Thus, he concentrates on constricting data preventively by demanding data minimization – instead of concentrating on the values resulting from data that are in need of juridical protection. Considering the history of human knowledge and information production, as well as the innate impetus to store, data minimization demands a change in the nature of interaction and communication. This is neither how the Internet functions, nor how human beings are. Data minimization generates silos of information and privileges by selecting some information and discarding other information, thereby creating contextual gaps that might entail higher risks of manipulation and misuse.
Data minimization inverts the dynamics proper to the nature of information and knowledge that gradually led to democratic development. The same applies to Peter Schaar’s appeal for informational self-determination and consent as the primary instruments for protection. They sound good at first, but have detrimental consequences for individuals upon closer inspection.
Whose data is the knowledge of my neighbor having a broken leg? Informational self-determination in data protection implies ownership of the data produced by an individual. The owner of the data determines what is done with it. However, ownership of data has always been difficult to determine, even outside the digital dimension: if I go to the bakery in front of my home every day, early in the morning, who owns this data? Me, the baker, the neighbors also present at the bakery, all of us? Should it be forbidden for the neighbors to tell someone else about this data? Much “sensitive” personal data cannot be hidden or kept behind closed doors: a pregnancy after a few months, a person with a broken leg wearing crutches, the religion of a woman wearing a headscarf. Whose data is the knowledge of my neighbor having a broken leg? His data, because it is his leg? My data, because I saw it with my own eyes? Am I allowed to talk about the broken leg with the baker or with a stranger? As explained above, “data” comes etymologically from Latin, meaning “something given”; there is a semantic connotation of detachment immanent in the word. The concept of ownership provokes more conflicts than orientation and clarity. It may even open the door to censorship. And, ultimately, it raises once more the question of the adequacy of this principle and the purpose, that is to say, the object of juridical protection of this corpus of law in general. Regarding Peter Schaar’s appeal for consent, it should be noted that consent – under circumstances of fairness – requires full knowledge and understanding. An individual cannot freely sell his or her personal freedom – even with consent, for no individual can surrender this fundamental right. Why should it be different with other fundamental rights? Consent in data protection reassigns the responsibility from the companies and states running the data-gathering and manipulation algorithms – which they usually have designed with the utmost opacity – to the
individuals. Individuals are thus expected to know as much as algorithm specialists and programmers do: once the code is made transparent and the terms and conditions have been explained clearly in a one-pager, citizens should be able to decide for themselves. Peter Schaar seems to ignore that Big Data and data protection are both complex issues. They cannot be explained in one page. A short explanation is a shortage of information. Furthermore, algorithm transparency remains opaque to non-technicians. Users know less about the algorithms of a company than the company itself. To recapitulate: information shortages and knowledge asymmetries are not the basis for fair consent. Turning the responsibility that governments and corporations should bear over to users does not protect these individuals: it merely perpetuates power asymmetries between individuals and states or companies while giving the illusion of autonomy. Big Data is not incompatible with privacy or anonymity protection, but it is incompatible with old paradigms of automated data processing and an outdated understanding of the Internet.
Peter Schaar does identify one of the most relevant challenges and risks of Big Data, however: predictive algorithms and analytics restrict individuals’ freedom of choice, since they preselect in advance a reduced number of options, which comes to erode autonomy in the long term. Predictive analysis filters the environment of the user; it tells individuals what they are and embeds them in anticipatory obedience to a setting where they remain what they are supposed to be. Consequently, predictive algorithms could perpetuate social inequality. Consent or informational self-determination would be of no help in this case, for they apply after, and not before, the pre-selection. Only rules on freedom from discrimination would prevent misuse. Big Data entails more risks than a mere erosion of privacy and, hence, data protection laws need to consider them in their entirety. Data protection must first ascertain which values it wants to safeguard and which principles may be harmed and are thus in most need of protection, so that it may subsequently draft not data- but principle-oriented regulations. Freedom from discrimination, the right to self-development, and freedom of choice are principles and values; data is not.

Credit: Lorena Jaume-Palasí | https://flic.kr/p/n6TcLi | CC BY 2.0 | https://creativecommons.org/licenses/by/2.0/
RESPONSES TECHNICAL & ACADEMIC COMMUNITY
Responses Technical & Academic Community
JONNE SOININEN, INTERNET ENGINEERING TASK FORCE MIKHAIL M. KOMAROV, NATIONAL RESEARCH UNIVERSITY HIGHER SCHOOL OF ECONOMICS, MOSCOW RICHARD HILL, AUTHOR, FORMER ICT MANAGER
The Current State of Internet Security From A Technical Perspective JONNE SOININEN, INTERNET ENGINEERING TASK FORCE
The recent revelations of pervasive monitoring by security agencies, including the NSA and GCHQ, sent shockwaves through the Internet technical community. Though it was hardly surprising that organizations whose main purpose is to monitor and analyze communication were actually performing their task, the scale and tactics of those activities did surprise the technical community. The revelations served as a wake-up call to the technical Internet community to focus more time and work on Internet security. As a reaction to the revelations on pervasive Internet monitoring, the Internet Engineering Task Force (IETF) held a session about pervasive monitoring in the technical plenary in Vancouver in November 2013. There was clear consensus among the participants that more could and had to be done in the Internet protocols to increase security. However, Internet security is by no means a new topic in the Internet technical community. Though it is often stated that the Internet was not originally designed with security in mind, the IETF has had a significant focus on security and privacy for decades. A testament to this is RFC 1984, published as early as 1996, in which the Internet Architecture Board (IAB) and the Internet Engineering Steering Group state that the IETF will work on securing its technologies with encryption regardless of government restrictions on encryption technologies. Over the years, the IETF has specified technologies, such as IPsec and Transport Layer Security (TLS), to secure communications over the Internet. These technologies are widely deployed and
used routinely on the Internet. In addition, the IETF and the IAB increased the focus on privacy in Internet protocols even before the Snowden revelations. The IAB published guidance on privacy considerations for Internet protocols in RFC 6973 in July 2013. Hence, the IETF did not start to work on Internet security and privacy only in the aftermath of the Snowden revelations. However, the focus on security was further increased.
Though extensive tools for Internet security have been available for a long time, many people have not been using them. An old proverb says you can lead a horse to water, but you cannot make it drink. The same is true of Internet security. Though extensive tools for Internet security have been available for a long time, many people have not been using them. Sometimes the privacy and security aspects have not been considered important enough in the trade-off between, for instance, increased security and increased computing requirements. However, over recent years major Internet service providers have started to use technologies such as TLS by default to secure their services. This development is very encouraging and there is hope that others will increasingly follow this trend as well. These available security mechanisms do effectively secure communication over the Internet between a service and its users.
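To make “TLS by default” concrete, here is a minimal sketch in Python using only the standard library; the host name is a placeholder, and the snippet illustrates the mechanism rather than anything specific from the article. A client negotiates an encrypted channel and can inspect what was agreed before any application data flows.

```python
# Minimal sketch: open a TLS-protected connection and inspect the result.
# Standard-library Python; "example.com" is a placeholder host.
import socket
import ssl

hostname = "example.com"
context = ssl.create_default_context()  # sane defaults: certificate validation, modern protocols

with socket.create_connection((hostname, 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=hostname) as tls_sock:
        print(tls_sock.version())   # e.g. 'TLSv1.3'
        print(tls_sock.cipher())    # negotiated cipher suite
        # From here, application data (HTTP, SMTP, ...) travels encrypted.
```

The deployment point is that when a service enables this by default, its users get the protection without doing anything themselves; the trade-off in extra computation that once discouraged adoption has evidently become acceptable to major providers.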
In light of the Snowden revelations and the Heartbleed bug, it might seem counterintuitive to state that the Internet is more secure than ever, and continues to become more secure as new technologies are developed and those technologies already developed are deployed. However, looking purely from an Internet communications angle, this statement is true. In addition, the new increased focus on security in the Internet will only strengthen this development.
The Internet is international in nature. In addition, in the discussion about pervasive surveillance, questions have arisen about the security of Internet routing and about what traffic flows through countries that perform pervasive monitoring of Internet traffic. There has been a general call for enhancing routing security, which Peter Schaar mentions in his article. There has even been a call from certain European political leaders for a European Internet, with greater security envisioned. Although in the early days of the commercial Internet much of the traffic did actually go via the US, today local traffic does stay local. The introduction of local Internet Exchange Points in individual countries and peering agreements between local operators have ensured this in Europe for over a decade already. The same trend is seen all over the world, including more recently in developing countries. Today, there is no technical reason why local Internet traffic from any European country should or would pass through any other country. As these statements about the technical state of Internet security and routing are both positive about the current situation and hopeful about the future, the question may arise as to how it was possible for the NSA, for instance, to perform extensive surveillance even on foreign citizens. The answer lies in the popular services we use. These services are provided mainly by US-based companies. These organizations fall under local legislation and have had to hand over information to local agencies. The Internet does not inherently leak this information; the information is obtained directly from the service provider. Hence, the issue is that we, the users, provide the information to these organizations by using their services. In his article, Peter Schaar also raises the issue of the significant market power of some of these Internet players. Commercially, these players are very strong and they may have significant market power, at least in the western hemisphere. However, the reasons these companies have become significant
are not rooted in Internet technology. As a matter of fact, Internet technology enables a level playing field for competition, and local alternatives exist and are widely used in many regions. Therefore, users can choose between services and service providers. We have to ask why users, regardless of privacy implications, continue to use the currently popular services and why there is a lack of viable alternatives in Europe. The Internet is international in nature. Packets pass over national borders as easily as they do within a country or a region. This also causes legal friction between countries and regions when services are provided outside a jurisdiction. Despite the issues, the international nature of the Internet is inherently a good thing. It is one of the key reasons the Internet has become the universal data network. As Peter Schaar states in his text, the reactions to pervasive surveillance may end up restricting the international nature of the Internet. In addition, there is a risk that the example of pervasive surveillance increases the interest of other nations in starting similar programs themselves. These are real risks to the Internet. The increase in computer processing power and storage capacity has revolutionized data processing over the last decade. Currently, there seems to be little technical restriction on storing data in an always-available format over extended periods and processing it almost in real time for different purposes. The data stored for a certain purpose can be used for a completely different purpose than originally intended as business models or political climates change. As Peter Schaar points out in his text, the right to privacy is an extremely important human right. This includes the right to privacy on the Internet. Taking the current technical capabilities into account, the right to privacy is perhaps more important today than ever before. Therefore, it is absolutely vital for the trustworthiness of the Internet that progress in developing new security technologies and deploying those already specified continues and even accelerates. In addition, users need to be aware of the implications of their actions for their privacy online. The Internet technical community continues to develop the technologies for increased security and improved privacy. In addition, the public focus on the Snowden revelations has created more awareness of privacy on the Internet. This technical progress and increased awareness can lead the horse to the water. We can only hope it will also drink.
Credit: http://www.nsa.gov/about/photo_gallery/gallery.shtml | http://goo.gl/70KGE2
Big Data leads to new international data processing policies MIKHAIL M. KOMAROV, NATIONAL RESEARCH UNIVERSITY HIGHER SCHOOL OF ECONOMICS, MOSCOW
I would like to agree wholeheartedly with Peter Schaar’s paper “The Internet and Big Data – Incompatible with Data Protection?” and particularly with his proposal of developing new data processing policies. We live in a world where technology develops fast. Unfortunately, we usually face big delays between the introduction of new technologies to the mass market and to people around the world and the evaluation of the influence of those technologies, whether on the environment or on the lives of human beings, including in terms of privacy or personal data protection. As a representative of the academic community, I find myself in a similar situation: a new technology is developed, but a new educational course that teaches others how to use that technology properly follows only after a considerable delay. Over the last twenty years, technological development has overtaken the policy-making process, and usually
we face a problem first and then try to solve it. I think we encountered such a problem when Web 2.0 was developed and implemented quickly in our lives. We realized it only after the Snowden leaks, and now we are trying to propose mechanisms to solve it. We understood that the information we were sharing with our friends and relatives via many Internet services was available not only to its target audience. We should also understand and accept the fact that information is now one of the most critical resources, on the same level as oil and gas. It is also necessary to remind ourselves of the difference between the terms “information” and “data”. Data is the source of information, and what kind of information and how much information we can get from the same amount of data depends on the algorithms, i.e. the mathematical methods we use, as well as on the skills of the people processing that data. That is why, when we are talking about privacy, we usually mean information about us. But in the policy-making process, we
are focused on processing data which contains some of our personal information. I think it is quite important for our understanding that Mr. Schaar mentioned that “most of the current data protection rules and regulations focus on the individual procedure used for data processing” and that “data protection regulations consider data processing from a micro perspective: single pieces of data, an individual algorithm, a specific purpose. Today, companies and public bodies see data processing more and more from a macro perspective”. This is where the line is drawn: on one side, we should process data in order to make efficient prognoses, improve processes and improve our quality of life; on the other side, while processing the data, we should be responsible for an individual’s personal information contained in that data (or someone’s personal data). That is why data processing and data analysis on the macro level should be standardized and regulated much more strongly than on the micro level (the level of individuals). Governments already collect lots of data about individuals for different purposes, and this is where we have to start improving personal data processing regulations. We do not have the individual at the heart of the system of data processing, but business or governmental concerns, and most of the regulations are comfortable enough for them. This situation should be changed, and I agree wholeheartedly with Mr. Schaar that parliaments should be introducing those changes.
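The distinction drawn above between data and information can be illustrated with a toy example (all numbers invented): the same small dataset yields quite different information depending on which algorithm, that is, which mathematical method, is applied to it.

```python
# Toy illustration: the same data yields different "information"
# depending on the algorithm applied to it. All numbers are invented.
from collections import Counter

# Purchase timestamps (hour of day) for one hypothetical customer:
data = [8, 9, 8, 22, 23, 8, 9, 22]

# Algorithm 1: a shop derives its busiest hour (aggregate, fairly harmless).
busiest_hour, _ = Counter(data).most_common(1)[0]
print(f"Busiest purchase hour: {busiest_hour}:00")

# Algorithm 2: a profiler derives an individual's daily routine (sensitive).
morning_share = sum(1 for hour in data if hour < 12) / len(data)
print(f"Share of purchases before noon: {morning_share:.0%}")
```

The first result is the kind of micro-level, purpose-bound processing existing rules anticipate; the second shows how the very same data, under a different algorithm, becomes personal information about one individual, which is the macro-level concern raised here.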
Technological development has overtaken the policy-making process. I would also like to go further with technological development – from Web 2.0 to Web 3.0. Today, there are plenty of ways to prevent the use of particular websites: by including them in databases, by requesting a password and by only allowing access to certain websites (such as by using parental controls). But there are ten times more ways to bypass all these safeguards. The important thing about Web 3.0 is that the resulting information may be counterfeit or misleading, since it depends on its popularity in society, which is sometimes not correct. It has been noted that, from a data protection perspective, one of the main aims of the Semantic Web and Web 3.0 was to make data easier to process and re-use. However, this leads to the question of what becomes of the protection of personal data in such an open, universally accessible web of interlinked data. This is particularly important because Web 3.0 applications are likely to be far more effective than even traditional search engines at piecing together personal data, thus increasing the risk of
identity theft. This leads to special requirements for safeguards to protect user data, as well as policies to ensure people understand how their information will be used. It must be said that we are almost unprotected on the Internet from inappropriate information; unfortunately, at best there might be a “one-look” rule – you see something once, and that information is retained only to build a policy for the future not to show it again – because we do not have dedicated governmental or international policies against placing such information on the Internet. In terms of Web 3.0, when our things will use information from the Internet, or generate information and send it to the Internet, we should specify a policy and a special agreement for connecting things to the Internet; there should be a clear identification field which would point to the particular person to whom the thing belongs, otherwise we will have lots of uncontrolled information generators – bots – which will influence the dissemination of information.
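What such an “identification field” might look like can be sketched as follows. This is a hypothetical illustration only: the field names are invented for the example rather than taken from any standard, and a real deployment would also need authentication and privacy safeguards so that the field itself does not become a tracking vector.

```python
# Sketch of an identification field attached to everything a "thing" emits.
# Field names are invented for illustration, not taken from any standard.
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class ThingIdentity:
    device_id: str      # stable identifier of the thing itself
    owner_id: str       # points to the accountable person or organization
    registered_by: str  # registry that can resolve owner_id (with due process)

@dataclass
class Message:
    identity: ThingIdentity
    payload: dict
    timestamp: float

msg = Message(
    identity=ThingIdentity("sensor-0042", "owner-1234", "example-registry"),
    payload={"temperature_c": 21.5},
    timestamp=time.time(),
)
# A receiving platform could reject unattributed messages outright,
# starving the anonymous bots the author warns about.
print(json.dumps(asdict(msg), indent=2))
```

The sketch also makes the tension in the proposal visible: the same field that makes bots attributable ties every message to a person, which is why it is framed here as a matter of policy and agreement rather than a purely technical fix.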
We should specify a policy and a special agreement for connecting things to the Internet. We should think about “privacy by design” issues and probably about special certification for systems dealing with personal data. I, too, would like to support the initiative of “open interfaces to enable communications between members and non-members”, and I think there is a good example explaining how this works with regard to terms and conditions and our privacy: the movie “Terms and Conditions May Apply” by Cullen Hoback. We should not fear the development of the Big Data concept and the implementation of new technologies in our lives, but we should allow individuals to be excluded from all the analytical and statistical processes at any time. Due to fast growth in the technological field and in the amount and type of data on the Internet, the reaction from the legal side has been slow, resulting in a lack of laws and policies to protect our privacy. It is the task of the international community to jointly update the current laws regulating data and information dissemination policy (including on the Internet). How long would it take to arrange joint international action?
Schaar is both prophetic and mainstream RICHARD HILL, AUTHOR, FORMER ICT MANAGER
Peter Schaar’s excellent and well-thought-out paper is at once prophetic and mainstream. It is mainstream because it reminds us of fundamental values that were clearly enunciated during the Age of Enlightenment, and it is prophetic because those very same values have recently been reaffirmed by political leaders in United Nations resolutions and in a judgment of the European Court of Justice. Schaar reminds us why so much data is being collected and analysed for private purposes: “… mass data is seen as an asset, the ‘new oil’ of the information society.” Indeed, consumers obtain “free services” if they agree to allow the service provider to use their data. The data are used to target advertising. But the revenue derived from the advertising is far greater than the cost of providing the “free” service, so in fact users are paying for the service (albeit not with money) and are receiving in return far less than the value of the information they provide.1 To some extent, this situation is a consequence of the funding model for Internet traffic flows, where the receiver pays and there are no pervasive “pay-per-use” mechanisms. So providers of services other than access have developed advertising as their main revenue stream. As Schaar correctly notes, network effects and economies of scale lead to concentration, so there is often a
dominant provider of a particular service (e.g. a social network) and users have no choice but to accept the terms and conditions of that dominant provider. I join Schaar in calling for such service providers to be brought under effective competition control, for example, by enforcing a “right to portability” and by envisaging measures to avoid the abusive use of data collected by the smartphone software known as apps.
I would thus call for the elimination of restrictions on encryption. As Schaar correctly points out, mandatory minimum privacy standards are, in this context, analogous to the mandatory safety standards that we take for granted. Schaar reminds us that informed consent is a condition for the processing of data. The service providers referred to above do obtain the consent of their users, but this is done by a contract of adhesion whose terms and conditions are often very long and are often not read in detail by users. It seems legitimate to wonder whether there really has been informed consent for the use of the consumer’s data. Schaar reminds us that data can be used in unexpected ways, and I would add that no database is entirely immune from theft: an insider can copy a large amount
of data and sell it, thus violating the terms and conditions under which the user provided the data. Schaar rightly notes that “it is unacceptable that governments and intelligence agencies are abusing the increasing international data transfer for bulk access to the transmitted data”.2 He calls for greater use of encryption, but unfortunately certain types of strong encryption are restricted by laws or regulations. I would thus call for the elimination of restrictions on encryption. Indeed, the very design of the Internet was based on the assumption that there would be end-to-end security3, so its pervasive implementation would seem to be a necessity. As Schaar notes, “there is a need to establish binding international data protection standards guaranteeing the protection of private life, as laid out in Art. 12 of the United Nations Charter of Human Rights”. I agree with Schaar that a binding instrument is needed; that is, a treaty. Of course, a treaty can only be agreed by member states, and those who favor discussing Internet-related issues in the less formal and more open process usually referred to as the “multi-stakeholder” model4 should accept that those discussions can precede, but cannot replace, formal intergovernmental processes. For example, an attempt was made recently, at the April 2014 NETmundial meeting, to tackle the issue of mass surveillance, but all that was agreed was to restate text that had been previously agreed at the United Nations. This meeting had been convened largely to discuss the matter of mass surveillance, so its failure to propose steps to curtail mass surveillance was considered disappointing.5 However, that is not an entirely fair assessment; while the UN resolution was agreed only by states, the NETmundial text was agreed by a broad coalition of governments, civil society, the private sector, academia, and technical experts. So it has broad support and should influence more formal discussions.
1 A summary discussion of this situation, with references to more detailed work, is found in section 6 of Hill, Richard, 2014. “The Future of Internet Governance: Dystopia, Utopia, or Realpolitik?”, in Pupillo, Lorenzo (ed.), The Global Internet Governance in Transition, Springer (forthcoming).
2 For a more general critique of certain current Internet practices, see Hill, Richard, 2013. “Internet Governance: The Last Gasp of Colonialism, or Imperialism by Other Means?”, in Weber, R. H., Radu, R., and Chenou, J.-M. (eds), The Evolution of Global Internet Policy: New Principles and Forms of Governance in the Making?, Schulthess/Springer.
3 See section 4.2.5.2 of Hill, Richard, 2014. “The Internet, its governance, and the multi-stakeholder model”, Info, Vol. 16, No. 2.
4 See section 5 of Hill (2014).
5 O’Brien, Danny, 2014. “Human Rights Are Not Negotiable: Looking Back At Brazil’s NETmundial”, Electronic Frontier Foundation, 25 April 2014; GIP team, “Why NETmundial mattered and what was achieved”, Geneva Internet Platform, 24 April 2014, http://giplatform.
Following up on NETmundial, I would propose that the matter be taken up in the ITU, whose Constitution has always had a provision on the secrecy of telecommunications (Article 37). That provision is too weak, but it could be strengthened as follows:

1. Member States agree to take all possible measures, compatible with the system of telecommunication used, with a view to ensuring the secrecy of international correspondence.

2. Nevertheless, they reserve the right to communicate such correspondence to the competent authorities in order to ensure the application of their national laws or the execution of international conventions to which they are parties. However, any such communication shall take place only if it is held to be necessary and proportionate by an independent and impartial judge.

3. Member States shall respect the secrecy of telecommunications in accordance with both their own laws and the laws of the state of the originator of such correspondence.

As Schaar so rightly says, “the pursuits of liberty and prosperity, free discussion and inclusion, closely linked to the information society, are at stake. There is a need for a broad coalition to defend these values.” Indeed, I find it surprising that we seem to have forgotten fundamental principles that were formalized more than 200 years ago and repeatedly reaffirmed since then.

The Fourth Amendment of the US Constitution, drafted in 1789 and approved in 1791, states: “The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.”

Article 12 of the Universal Declaration of Human Rights (UDHR) and Article 17 of the International Covenant on Civil and Political Rights state (in pertinent part): “No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence; …”

In accordance with Article 29 UDHR: “In the exercise of his rights and freedoms, everyone shall be subject only to such limitations as are determined by law solely for the purpose of securing due recognition and respect
for the rights and freedoms of others and of meeting the just requirements of morality, public order and the general welfare in a democratic society.”

Dilma Rousseff, President of Brazil, in her 24 September 2013 speech at the United Nations, stated: “In the absence of the right to privacy, there can be no true freedom of expression and opinion, and therefore no effective democracy.”

United Nations General Assembly Resolution A/RES/68/167 of 18 December 2013 on the right to privacy in the digital age reaffirms the right to privacy; emphasizes that unlawful or arbitrary surveillance violates that right; expresses concern about mass surveillance; and calls on states to respect privacy and to review their procedures, practices and legislation, including with respect to mass surveillance.

On 8 April 2014, the European Court of Justice struck down the European Data Retention Directive on the grounds that the data retention was not limited to what is strictly necessary and that it exceeded the limits imposed by compliance with the principle of proportionality.6

On 10 April 2014, the European Union’s Article 29 Working Party adopted Opinion 04/2014 on
surveillance of electronic communications for intelligence and national security purposes.7 That Opinion states: “From its analysis, the Working Party concludes that secret, massive and indiscriminate surveillance programs are incompatible with our fundamental laws and cannot be justified by the fight against terrorism or other important threats to national security. Restrictions to the fundamental rights of all citizens could only be accepted if the measure is strictly necessary and proportionate in a democratic society.”

So Schaar’s call for action is not isolated. As citizens, we must insist that our parliaments take action, both to stop mass surveillance by governments and to curtail the power of dominant service providers to obtain data from customers and use it as they see fit to generate large profits. And we must insist that the Necessary and Proportionate principles8, supported by a large number of organizations and scholars, be implemented, despite the resistance shown at NETmundial by the US and its allies, such as Sweden, to calls for the implementation of those principles. Paraphrasing what former Spanish judge Baltasar Garzón said in a speech in Geneva recently: it is time to stop debating the legality of what is manifestly illegal.
6 http://curia.europa.eu/juris/document/document.jsf?doclang=EN&text=&pageIndex=1&part=1&mode=req
8 https://en.necessaryandproportionate.org/text

Credit: Defense Advanced Research Projects Agency (DARPA) | http://goo.gl/OwF8fq
Authors
1. GEORG APENES
Georg Apenes is a Norwegian jurist and was appointed director of the Norwegian Data Inspectorate in 1989. There he made his mark in the political debate as a defender of privacy. Commenting on Internet privacy, Apenes deplored the indifference with which people disseminate personally identifiable information. He stepped down in April 2010. Georg Apenes has authored several books, with topics spanning from monographs on political themes and analyses of political parties to festschrifts and amateur history. In addition, he has written columns in the newspapers Fredriksstad Blad, Stavanger Aftenblad, Dagens Næringsliv and A-Magasinet. He served three terms (elected 1977, 1981 and 1985) in the Parliament of Norway. In 2010 Apenes was appointed Knight, First Class of the Order of St. Olav, for his work.
2. NICK ASHTON-HART
Nick Ashton-Hart is the senior permanent representative of
the for-profit technology sector to the UN, its member states, and the international organisations resident in Geneva. He has been an active part of multilateral policy development starting with the sustainable development agenda for the world’s cities (HABITAT II) in 1992, has been an active part of the Geneva community for 14 years and a resident of it for the past eight. He worked in the music industry as a manager to artists such as James Brown. In the tech sector he went from a systems administrator post to CIO/CTO in five years and has broad hands-on technology experience, from running a small local area network to designing multi-country wide area networks. He is currently Executive Director of the Internet & Digital Ecosystem Alliance (IDEA). Prior to that he was Geneva Representative of the Computer & Communications Industry Association (CCIA), Director for At-Large and Senior Director for Participation and Engagement with the Internet Corporation for Assigned Names
and Numbers, Inc. (ICANN) and Executive Director of the International Music Managers Forum (IMMF).
3. RAFIK DAMMAK
Rafik Dammak is an engineer working and living in Japan. He is a member of the steering committee of the Dynamic Coalition on Internet Rights and Principles as well as of 1net, representing civil society. He has been involved in the ICANN community as an individual user member of the NCUC (Non-Commercial Users Constituency). He is also a former elected GNSO Councillor for the Non-Commercial Stakeholder Group and a member of ICANN’s Nominating Committee (NomCom), which is responsible for selecting eight members of the Board of Directors and other key positions within ICANN’s structure. In addition, he has participated in several ICANN working groups, such as the one on new generic top-level domain (gTLD) applicant support. He was elected as chair of the Non-Commercial Stakeholders Group (NCSG),
which represents, through its elected representatives and its constituencies, the interests and concerns of noncommercial registrants and noncommercial Internet users of generic top-level domains (gTLDs). He is working on improving awareness about Internet governance in Tunisia and the MENA region in general.

4. SUSANNE DEHMEL

Susanne Dehmel has been Head of the Data Protection Department at the Federal Association for Information Technology, Telecommunications and New Media (BITKOM) since 2010. She is a lawyer and studied in Passau, Freiburg and Cardiff. Before taking over the Data Protection Department, she was responsible for copyright law and intellectual property issues from 2002 to 2009. Encouraging the development of a modern and practicable data protection law for the information society is an important part of her current work.

5. RICHARD HILL

Dr Richard Hill is an independent consultant and was the Secretary for the various ITU groups that discussed the revision of the International Telecommunication Regulations (ITRs). He was the head of the secretariat team dealing with the substantive issues at the World Conference on International Telecommunications (WCIT). He was part of the secretariat team for the World Summit on the Information Society (WSIS) and has been involved in Internet governance matters since the mid-1990s. Prior to joining the ITU in 2001, Richard was Department Head, IT Infrastructure Delivery and Support, at Orange Communications (a GSM operator). He also worked at Hewlett-Packard’s European Headquarters in Geneva, Switzerland. Richard holds a Ph.D. in Statistics from Harvard University and a B.S. in Mathematics from M.I.T. Prior to his studies in the U.S.A., he obtained the Maturità from the Liceo Scientifico A. Righi in Rome, Italy. He has published papers on Internet governance, mediation, arbitration, and computer-related legal and intellectual property issues, as well as the standard reference book on X.435.
6. LORENA JAUME-PALASÍ
Lorena Jaume-Palasí is a lecturer and PhD candidate at the Department for Practical Philosophy at Ludwig-Maximilians-University Munich. Her main focus is on moral conflicts in international relations and new technologies in governance structures, as well as on the strategies of collective actors and collective rationality. She has co-organized the German Youth IGF, the New Media Summer School and the 2014 German IGF. She has been coordinating global Internet governance working groups at Berlin’s Internet & Society Collaboratory, where she has also led projects and partnerships since June 2014. She occasionally writes for online magazines such as irights.info.
7. WOLFGANG KLEINWÄCHTER
Wolfgang Kleinwächter has been involved in Internet governance for decades and has participated, in various capacities, in ICANN, the Internet Governance Forum (IGF) and the WSIS (World Summit on the Information Society). In the WSIS process he was a member of the Civil Society Bureau, co-chaired the Internet Governance Caucus (IGC) and was appointed by UN Secretary-General Kofi Annan to the UN Working Group on Internet Governance (WGIG). He is a co-founder of the European Dialogue on Internet Governance (EURODIG), the Global Internet Governance Academic Network (GIGANET) and the Summer School on Internet Governance (SSIG), and chair of the ICANN Studienkreis. Kleinwächter is also Professor for International Communication Policy and Regulation at the Department for Media and Information Sciences of the University of Aarhus in Denmark, where he has taught “Internet
Policy and Regulation” since 1998. He is a founding member of the Collaboratory and was appointed to the ICANN Board of Directors in 2013.
8. MIKHAIL M. KOMAROV

Mikhail M. Komarov is Associate Professor at the National Research University Higher School of Economics in Moscow and Deputy Dean for international relations at the Faculty of Business Informatics. He completed a PhD at the Moscow State Institute of Electronics and Mathematics in 2012. Komarov is also a specialist in wireless sensor networks and ICT and has been awarded grants for the best scientific projects, as well as medals for the best scientific projects and for the integration of IT into the educational system. He is a member of the IEEE Technical Committee on Business Informatics and Systems and was a speaker at workshops of the Internet Governance Forum in 2012 and 2013. He is also Head of the Interuniversity Laboratory for innovative projects “Wireless Interactive SystEms and NETworks” (www.wisenetlab.ru) and co-founder of the All-Russian Public Organization “Young Innovative Russia” (www.i-innomir.ru).

9. JAN MALINOWSKI

Jan Malinowski is a lawyer, qualified in Spain and in England. Following eight years of professional practice in Barcelona and London, he joined the Council of Europe, where he worked for eleven years with the anti-torture watchdog. Since 2005, he has been responsible for Council of Europe work on media policy, freedom of expression and Internet governance. This work has resulted in the adoption by the Organisation’s 47 member states of a number of ground-breaking human rights-based normative texts, including a new notion of media, a commitment to do “no harm” to the Internet and the acknowledgement of the states’ shared responsibility for preserving the integrity and ongoing functioning of the Internet. As Head of the Information Society
Department, he is now also responsible for work related to two unique Council of Europe conventions, on data protection and cybercrime.
10. STEPHANIE PERRIN
Stephanie Perrin spent 30 years in the Canadian federal government, working on information policy and privacy issues. She was the Director of Privacy Policy responsible for developing private sector privacy legislation (PIPEDA), leaving in 2000 to work for Zero Knowledge Systems and promote technology for anonymity on the Internet. She is a PhD candidate at the University of Toronto Faculty of Information, with research interests focusing on why privacy is not implemented in Internet standards and functions. She is a member of the Expert Working Group at ICANN tasked with revamping the Whois directory, and her research examines why privacy has developed into such an intractable problem at ICANN. This research examines concepts of identity online and the inadequacy of current privacy norms.
11. GEORGE SALAMA
George Salama is Senior Manager for Public Policy at the SAMENA Telecommunications Council and is responsible for setting up and executing the Council’s public policy plan, which includes broadband development, ICT policy, spectrum management, digitization, and Internet governance. Salama spent over six years in the International Technical Coordination Department of Egypt’s National Telecom Regulatory Authority (NTRA), where he was in charge of Internet public policy issues at the national, regional and international levels. He was also part of the Egyptian government delegation to the Internet Governance Forum from 2007 to 2010. After completing his Bachelor of Science with a major in Computer Science and a minor in Electronics at the American University in Cairo (AUC), Salama completed his Master of Science degree in Business Information Technology at Middlesex University, UK, in 2008. Salama is currently a part-time PhD researcher
with Tampere University, Finland, and is pursuing his thesis on “Internet Governance – Intergovernmental Model vs. Multi-Stakeholder Approach.”
12. PETER SCHAAR

Peter Schaar is Chairman of the European Academy for Freedom of Information and Data Protection (Berlin) and former German Federal Commissioner for Data Protection and Freedom of Information. Holding a diploma in economics, he worked from 1980 to 1983 with the Senate’s office for administrative services in Hamburg. From 1986 to 1994 Schaar worked as head of section with the Hamburg Data Protection Commissioner, where he was deputy from 1994 to 2002. In 2001 and 2002 he was a dedicated member of the commission set up to accompany the modernization of the Data Protection Law. On November 1, 2002 he founded a consulting company for data protection. His further engagements cover the Gesellschaft für Informatik (Society for Informatics), the International Working Group on Data Protection in Telecommunications (IWGDPT), the Hamburger Datenschutzgesellschaft (HDG, Hamburg Society of Data Protection) and the Humanistische Union (Humanistic Union). Peter Schaar is the laureate of the eco Internet AWARD 2008.
13. PETRA SITTE
Petra Sitte has been Chief Whip of the parliamentary group DIE LINKE in the German Bundestag since 2013. She has been a Member of the Bundestag since 2005 and was a member of the Bundestag’s Commission of Inquiry on Internet and Digital Society (2010-2013). For more than two decades her areas of public policy have been science, technology and innovation. She holds a doctorate in political economy.
14. JONNE SOININEN
Jonne Soininen is Associate Technical Director at Broadcom, based in Helsinki, Finland. Prior to Broadcom, he worked in different positions with Nokia, Nokia Siemens Networks and Renesas Mobile, and has been active in the technical community for over 15 years. During these years Jonne has been active in both technical and policy organizations, including the Internet Engineering Task Force (IETF), the Internet Corporation for Assigned Names and Numbers (ICANN), the Internet Society (ISOC) and the Internet Governance Forum (IGF). Currently, Jonne Soininen serves as the non-voting technical liaison to the ICANN Board, appointed by the IETF.
* All author pictures reprinted with permission. Except Peter Schaar: Alexander Klink | http://commons.wikimedia.org/wiki/File:Peter_Schaar_%282013%29.jpg#filelinks | https://creativecommons. org/licenses/by/3.0/de/ | CC-BY-3.0
About the Internet & Society Collaboratory
The Collaboratory is a non-partisan laboratory for the digital society. As a multistakeholder platform we facilitate projects and debates about the challenges of digitization for our society. Following an open and transparent approach, the Collaboratory tries to find answers in the areas of governance and regulation, the transformation of privacy and publicness, copyright and innovation, the transformation of work and industry, cultural heritage and education, and globalization and security. As a community of practice, the Collaboratory is open to anyone wishing to contribute constructively to the public discourse. Over 350 experts from various sectors are active in the Collaboratory’s network. We constantly develop new formats and projects with various partners to enrich the debate and provide solutions to the community. The Collaboratory has been a registered not-for-profit organization based in Berlin since 2012. A small team heads the ongoing projects under the auspices of a steering group and an advisory board. Membership is not required for participation in our activities. The platform is funded through donations and partnerships. Under our far-reaching transparency policies, you can find all information on financial resources, people, supporters and results on our website. The Collaboratory has its roots in the project initiated in 2010 by Google Germany, which remains a key supporter of our platform.
Photo: Tobias Schwarz. CC BY
As an open platform we welcome the participation of experts from all areas and the support of businesses, associations, foundations and academic institutions for our work. We are constantly open to new partnerships and actively seek funding. Contact us if you wish to support the Collaboratory. Your direct contact: Sebastian Haselbeck, managing director, sebastian@collaboratory.de
MIND needs your support
This Discussion Paper Series is one of the Collaboratory's most successful projects and is internationally renowned. MIND reaches the entire internet governance community and has established itself as a valued contribution to the discourse. It was an experiment when we first released it. Today, its editor is an ICANN director, and stakeholders from around the world read, contribute to, and ask to participate in this interdisciplinary platform.
Talk to us if your organization wants to contribute responsibly to the discourse. We look forward to discussing new partnerships. Provided sufficient funding, MIND #8 is scheduled for this September's IGF in Istanbul.
Internet & Gesellschaft Collaboratory e.V.
Bank: GLS Bank Berlin
IBAN: DE79 4306 0967 1141 7119 00
BIC: GENODEM1GLS
MIND and the Collaboratory need your support. To make upcoming issues a reality, financial support for our non-profit organization is essential. Talk to us if your company or organization would like to support this publication. Donations are tax-deductible; sponsorships or ad placements are also an option.
Questions? Contact Lorena Jaume-Palasí at lorena@collaboratory.de
The time for action is now. The future of the internet is at stake, and few other platforms bring together experts of such caliber to make the debate about its governance accessible to the wider community. It is the Collaboratory's stated mission to provide a platform for constructive ideas on how our societies can best cope with the digitization of our world. Support this open platform, an essential, non-partisan component of the German internet policy ecosystem. There are many ways to contribute to the realization of this project:
• become a cooperation partner or official supporter of the Collaboratory
• support MIND with a donation or by covering some of its cost
• support MIND through sponsorship or by buying ad space
• become a distribution partner
• host the magazine's launch or press event
Previous Issues and Authors of MIND

[Cover images of the three previous issues: MIND #6 (Berlin – Bali, October 2013), MIND #5 (Berlin, June 2013), MIND #4 (Berlin – Baku, November 2012)]
MIND #6 – INTERNET AND SECURITY
Cybersecurity is as important as the openness and freedom of the Internet. An insecure cyberspace undermines individual human rights, blocks online business and hinders the free exchange of information. But there is still no globally accepted definition of what Internet security – or, more broadly, cybersecurity – means in detail. Different stakeholders have different ideas. This issue contributes to that debate.
Proposition
• Toomas Hendrik Ilves - President of the Republic of Estonia
• Bruce Schneier - Author of Liars and Outliers: Enabling the Trust Society Needs to Thrive
Government & Parliament
• Thorbjørn Jagland - Secretary General, Council of Europe
• Olga Cavalli - GAC, Foreign Ministry of Argentina
Private Sector
• Ram Mohan - CTO, Afilias
• Rajesh Chharia - Internet Service Providers Association of India
Civil Society
• Avri Doria - Non-Commercial Users Constituency (NCUC), ICANN
• Carlos Afonso - NUPEF, Brazil
Technical & Academic Community
• Alexander Klimburg - Austrian Institute for Foreign Affairs
• Leonid Todorov - Coordination Center for .ru
• Xu Peixi - Communication University of China

MIND #5 – INTERNET UND DEMOKRATIE (Internet and Democracy)
The internet has evolved from the technical playground of a few scientists into the operating system of our global society. It is now the platform on which major value creation takes place. This raises several questions: Does the internet improve our democracy? Is access to the internet a basic right? What does that mean for politics? MIND #5 deals with these questions.
Proposition
• Julian Nida-Rümelin - Staatsminister a.D., Ludwig-Maximilians-Universität München
Government & Parliament
• Sabine Leutheusser-Schnarrenberger - Federal Minister of Justice, Germany
• Thomas Schneider - Federal Office of Communications (OFCOM), Switzerland
Private Sector
• Wolf Osthaus - Unitymedia KabelBW
• Philipp Grabensee - Afilias
Civil Society
• Klaus-Dieter Stoll - ICANN Not-for-Profit Organizations Constituency (NPOC)
• Karola Wille - Central German Broadcasting (MDR)
• Andreas Krisch - Working Group on Data Retention, Vienna
Technical & Academic Community
• Dirk Krischenowski - Internet Society (ISOC) Germany
• Erich Schweighofer - University of Vienna
• Erika Mann - Internet Corporation for Assigned Names and Numbers (ICANN), Facebook

MIND #4 – HUMAN RIGHTS AND INTERNET GOVERNANCE
This volume focuses on the struggle for freedom of speech and human rights on the internet, an area which – especially since the Arab Spring – lies at the foundation of today's discourse. More and more actors are seizing the opportunity to shape the global network according to their respective interests and value systems.
Proposition
• Shirin Ebadi - 2003 Nobel Peace Prize, Iran
Government & Parliament
• Carl Bildt - Minister of Foreign Affairs, Sweden
• Marietje Schaake - European Parliament
• Alice Munyua - Government of Kenya
Private Sector
• Jermyn Brooks - Global Network Initiative
• Ronald Koven - World Press Freedom Committee
• Zahid Jamil - ICC Pakistan
Civil Society
• Joy Liddicoat - APC New Zealand
• Jeremy Malcolm - Consumers International, Malaysia
• Graciela Selaimen - NUPEF, Brazil
Technical & Academic Community
• Raúl Echeberría - LACNIC, Uruguay
• Markus Kummer & Nicolas Seidler - ISOC
• Cees J. Hamelink - University of Amsterdam
MIND stands for Multistakeholder Internet Dialogue. This discussion paper series is a platform for modern polemics in the field of internet governance. Each issue is structured around a central argument in the form of a proposition by a well-known author, which is then commented on by stakeholders from academia and the technical community, the private sector, civil society and government in the form of replies. Each volume is licensed under Creative Commons BY (attribution) and freely available to anyone.
[Cover images of MIND #3, #2 and #1]
MIND #3 – GRENZEN DER INTERNETFREIHEIT (The Limits of Internet Freedom)
Who decides what information harms national security and what doesn't? In which ways do expressions of opinion disturb public order? How can individual communication result in a catastrophe for the system that protects intellectual property? Who is in charge of judging this: a government, a company, a party, a lobby group or internet users? These questions remain controversial, especially from the perspective of different states.
Proposition
• Bernd Holznagel & Pascal Schumacher - University of Münster
Government & Parliament
• Angela Kolb - Minister of Justice, Saxony-Anhalt
• Thomas Jarzombek - Member of the German Bundestag
Private Sector
• Christian Stöcker - Spiegel Online
• Michael Rotert - Association of the German Internet Industry (eco)
Civil Society
• Alvar Freude - Internet Enquete Commission of the German Bundestag; AK Zensur (working group on censorship)
• Sandra Hoferichter - ICANN / ALAC
Technical & Academic Community
• Hans Peter Dittler - ISOC Germany
• Wolfgang Benedek - University of Graz

MIND #2 – INTERNET POLICY MAKING
With this edition, we want to jump-start the wider debate on multistakeholder governance. This seemingly technical issue has important ramifications for the future of our societies and our planet. Only when we find modes of governance that allow us to address the technical and philosophical challenges of our complex and interdependent online and offline lives will we be able to secure the future of humanity.
Proposition
• Bertrand de La Chapelle - Program Director, International Diplomatic Academy, Paris
Government & Parliament
• Fiona Alexander - U.S. Department of Commerce
• Catherine Trautmann - Member of the European Parliament
• Everton Lucero - Ministry of Foreign Affairs, Brazil
Private Sector
• Theresa Swinehart - Verizon Communications
• Peter Hellmonds - Nokia Siemens Networks
• Waudo Siganga - Computer Society of Kenya
Civil Society
• Anriette Esterhuysen - Association for Progressive Communications
• Olivier M. J. Crépin-Leblond - ICANN / ALAC
• Annette Mühlberg - ver.di
Technical & Academic Community
• William Drake - University of Zurich
• Vint Cerf - Google
• Sivasubramanian Muthusamy - ISOC India, Chennai

MIND #1 – GRUNDRECHT INTERNETFREIHEIT (Internet Freedom as a Fundamental Right)
The internet is a technology of freedom. It is a liberating medium. Never in human history have individuals been able to move as freely as they can on the internet, where time and space disappear. But how are these liberties to be guaranteed when individuals and companies invoke universal human rights?
Proposition
• Rolf H. Weber - University of Zurich
Government & Parliament
• Matthias Traimer - Federal Chancellery of the Republic of Austria, Vienna
• Sabine Verheyen - Member of the European Parliament
• Jimmy Schulz - Member of the German Bundestag
Private Sector
• Christoph Steck - Telefónica Europe
• Oliver Süme - eco e.V.
Civil Society
• Wolf Ludwig - ICANN
• Christian Bahls - MOGIS e.V.
• Kenneth Roth - Human Rights Watch
Technical & Academic Community
• Ingolf Pernice - Humboldt University of Berlin
• Tobias Mahler - University of Oslo
• Matthias C. Kettemann - University of Graz
Imprint
MIND #7 – Privacy and Internet Governance
Editor: Wolfgang Kleinwächter
Layout & Design: Jan Illmann
Production: Janina Gera, Sebastian Haselbeck
Original design concept of the series: Jessica Louis & Sabine Grosser, www.louisgrosser.com
Editorial Board:
Prof. Wolfgang Kleinwächter, Department for Media and Information Studies, University of Aarhus (Chair)
Prof. Wolfgang Benedek, Institute for International Law and International Relations, Karl-Franzens-Universität Graz
Prof. Rafael Capurro, International Center for Information Ethics (ICIE), Karlsruhe
Dr. William J. Drake, Institute of Mass Communication and Media Research, University of Zurich
Prof. Dr. Jeanette Hofmann, Social Science Research Center Berlin (WZB), Alexander von Humboldt Institute for Internet and Society (HIIG), Berlin
Prof. Bernd Holznagel, Institute for Telecommunication and Media Law, University of Münster
Prof. Divina Meigs, Université Sorbonne Nouvelle, Paris
Prof. Milton Mueller, Institute for International Studies, Syracuse University, N.Y.
Dr. Philipp S. Müller, Center for Public Management and Governance, SMBS, Paris-Lodron University Salzburg
Prof. Michael Rotert, Institute for Informatics, Karlsruhe University of Applied Sciences
Prof. Rolf Weber, Law Faculty, University of Zurich
Printed by Oktoberdruck, Berlin
Unless stated otherwise, all texts are published under a Creative Commons Attribution 4.0 International (CC BY 4.0) license. You are free to share, copy and redistribute the material in any medium or format, and to adapt, remix, transform, and build upon the material for any purpose, even commercially, under the condition of attribution.
Cover Image: teachandlearn | https://flic.kr/p/5qK9PG | CC BY-NC-SA 2.0 | https://creativecommons.org/licenses/by-nc-sa/2.0/

Contact the Collaboratory or its board: Dr. Michael Littger, Martin G. Löhe, Lena-Sophie Müller, Dr. Philipp S. Müller, Dr. Marianne Wulff, kontakt@collaboratory.de

More information about the organization, the people, projects, current partners and financial structure is on our website. Our platform relies on third-party funding; please consider supporting the Collaboratory with a donation. We would love to talk to you about a possible cooperation. Visit us at www.collaboratory.de
More information: creativecommons.org/licenses/by/4.0/legalcode

Kleinwächter, Wolfgang (ed.). "Privacy and Internet Governance". MIND Multistakeholder Internet Dialog #7, Collaboratory Discussion Paper Series No. 1, Internet & Society Collaboratory, www.collaboratory.de – Berlin: June 2014. ISBN 978-3-00-046186-6
Internet & Society Collaboratory
MIND is a multistakeholder debate magazine on the interdisciplinary challenges of internet governance. It is edited by Wolfgang Kleinwächter and published twice per year, coinciding with the international or a regional Internet Governance Forum.
Future issues depend on your support! We are looking for funding and distribution partners to make the next issue a reality. Position your company or organization as an enabler of this essential discourse, reach international decision makers, and support the Discussion Paper Series with a donation or sponsorship.
The Internet & Society Collaboratory is an open think tank and internet policy deliberation platform dedicated to enabling the interdisciplinary work of specialists from civil society, academia, and the public and private sectors on solutions to tomorrow's socio-political opportunities and challenges posed by the digital transformation at the intersection of the internet and society. Contact us via international@collaboratory.de if you would like more information, if you require additional print copies of this or past issues, or if you are interested in supporting or participating in our projects.

The Collaboratory was initiated in 2010 by Google Germany; since 2012 it has been an independent, non-profit organization based in Berlin. For more information on the Collaboratory, our projects and activities, our funding and participating experts, please visit collaboratory.de
Visit the Internet & Society Collaboratory at: http://en.collaboratory.de
ISBN 978-3-00-046186-6