The mutual shaping of law, technology, norms, and society
The example of robotics
Prof. Bert-Jaap Koops
TILT - Tilburg Institute for Law, Technology, and Society
e.j.koops@uvt.nl | http://www.robolaw.eu
10 Dimensions of Technology Regulation
source: Koops, B.J. (2010), ‘Ten dimensions of technology regulation. Finding your bearings in the research space of an emerging discipline’, in: M.E.A. Goodwin et al. (eds), Dimensions of Technology Regulation, Nijmegen: WLP, p. 309-324, http://ssrn.com/abstract=1633985
5. Regulation
• type (tool-box): law, social norms, market, architecture
• appearance: constitutional rule, statutory rule (civil law), precedent (common law), contractual term, General Terms & Conditions, code of practice/conduct
• bindingness: fundamental rights, statutory legislation, case-law, soft law, non-binding rule
• regulatory pitch: command-and-control, soft sister, pragmatic/rational
• regulatory range: negative (stick), positive (carrot), neutral
• legal area: constitutional law, public international law, criminal law, administrative law, environmental law, contract law, tort law, intellectual-property law, private international law, labour law, disability law, health law, etc.
• legal tradition: common law, civil law
• actor: legislator, public executive body (quango), NGO, standardisation body, business, consumer organisation, patient organisation, public-private partnership
6. Normative Outlook
• ethical school: utilitarianism, Kantianism, communitarianism, ethics of the good life
• fundamental values: autonomy, liberty, accountability/responsibility, privacy, human dignity, equality, etc.
• human rights: privacy, non-discrimination, freedom of thought, right to health, etc.
• fundamental concepts: property, personhood, integrity, etc.
• risk attitude: risk-averse, risk-tolerant, uncertainty tolerance
9. Problem
• regulation purpose: protection of vulnerable people, ordering socio-economic relations, providing legal certainty, incentivising innovation, distributing responsibilities
• research aim: enhancing understanding, solving a practical problem, solving a theoretical problem, feeding policy, guiding policy
• problem definition: descriptive, normative, exploratory, hypothesis testing
• research methods: desk research, normative legal analysis, comparative legal research, case-law survey, interviews, case studies, survey
Co-evolution of technology and society
[Figure: Technology and Society shaping each other]
The role of the law
[Figure: Law, Technology, and Society and their interrelations]
The TILT mutual shaping perspective
[Figure: the interplay between regulation, technology developments, and normative outlooks; from: TILT Research Programme 2009-2013, v. 1.01]
The TILTed mutual shaping perspective
[Figure: the interplay between regulation, technology developments, and normative outlooks; adapted from: TILT Research Programme 2009-2013, v. 1.01]
The TILT(s) mutual shaping perspective
[Figure: the interplay between regulation, technology developments, and normative outlooks, situated within society; adapted from: TILT Research Programme 2009-2013, v. 1.01]
How to apply this TILT triangle?
An example: robotics and human rights
[Figure: the triangle applied to robotics: Robotics, Human-rights law, and Human-rights theory (normative assumptions on humans and rights)]
1. Application level
[Figure 1. The application level of human-rights implications of robotics: Robotics, Human-rights law, and Human-rights theory (normative assumptions on humans and rights)]
2. Assessment level
[Figure 2. The assessment level of human-rights implications of robotics: Robotics, Human-rights law, and Human-rights theory, with assessment arrows 2a and 2b]
3. Reflection level
[Figure 3. The reflection level of human-rights implications of robotics: Robotics, Human-rights law, and Human-rights theory (normative assumptions on humans and rights)]
Mutual shaping in robolaw
[Figure 4. Three levels of human-rights implications of robotics: Robotics, Human-rights law, and Human-rights theory, with assessment arrows 2a and 2b]
(1) Examples of questions at the application level
The questions are organised in a matrix of robotics examples (surveillance drone, visual implant, robotics in general) against human rights (privacy, non-discrimination, human rights in general):
Privacy
• Surveillance drone: Is use of a surveillance drone compatible with the right to privacy?
• Visual implant: Do visual implants make recordings of signals they receive, and if so, is there a legitimate basis for this?
• Robotics in general: Can robots be designed in a privacy-compliant way?
Non-discrimination
• Surveillance drone: Are surveillance drones used in a discriminatory way, e.g., only monitoring areas in which mostly migrants live?
• Visual implant: Can a visually impaired person be rejected for a job if she refuses to take a visual implant?
• Robotics in general: Is equal access to new robotics applications ensured?
Human rights in general
• Surveillance drone: What are the human-rights implications of surveillance drones under the ECHR?
• Visual implant: What are the human-rights implications of visual implants used for non-medical purposes?
• Robotics in general: How are robotics regulated under the Italian Constitution?
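The grid above is essentially two-dimensional: human rights as rows, robotics examples as columns, and example questions in the cells. As a purely illustrative aid (not part of the original framework; all names are invented for illustration), that structure can be sketched as a nested mapping:

```python
# Illustrative sketch only (not from the original slides): the application-level
# question grid as a nested mapping from human right -> robotics example -> question.
application_level = {
    "privacy": {
        "surveillance drone": "Is use of a surveillance drone compatible with the right to privacy?",
        "visual implant": "Do visual implants make recordings of signals they receive, "
                          "and if so, is there a legitimate basis for this?",
        "robotics in general": "Can robots be designed in a privacy-compliant way?",
    },
    "non-discrimination": {
        "surveillance drone": "Are surveillance drones used in a discriminatory way?",
        "visual implant": "Can a visually impaired person be rejected for a job "
                          "if she refuses to take a visual implant?",
        "robotics in general": "Is equal access to new robotics applications ensured?",
    },
    "human rights in general": {
        "surveillance drone": "What are the human-rights implications of surveillance drones under the ECHR?",
        "visual implant": "What are the human-rights implications of visual implants "
                          "used for non-medical purposes?",
        "robotics in general": "How are robotics regulated under the Italian Constitution?",
    },
}

def questions_for(right: str) -> list[str]:
    """Return all application-level questions listed for a given human right."""
    return list(application_level.get(right, {}).values())

if __name__ == "__main__":
    for question in questions_for("privacy"):
        print("-", question)
```

The same nested shape extends to the assessment and reflection grids below by swapping in the relevant row and column sets.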
[Reminder: 2. Assessment level (see Figure 2 above)]
(2a) Examples of questions at the level of assessment of robotics in light of human-rights law & theory
Privacy
• Surveillance drone: How should surveillance drones be designed to make them privacy-compliant (“privacy by design”)?
• Robotics in general: Do robot developers have a duty to build in safeguards in robots to prevent privacy violations?
• New robots: Can physical robots or softbots be developed to stop the gradual erosion of privacy in physical and digital environments?
Non-discrimination
• Surveillance drone: Should use of surveillance drones by employers to monitor employees on factory premises be prohibited or otherwise regulated?
• Robotics in general: Will the development of carebots lead to discrimination of elderly people who cannot afford to buy human care?
• New robots: Can robots be developed for use at customs control that make ‘colour-blind’ risk assessments about which passengers should be investigated?
Human rights in general
• Surveillance drone: Under which conditions should governments be allowed to use surveillance drones, in light of human rights?
• Robotics in general: What are the major threats that robotics pose for the protection of human rights?
• New robots: Are there opportunities in robotics to better protect human rights?
(2b) Examples of questions at the level of assessment of human rights in light of robotics
Privacy
• Surveillance drone: Is existing law on making photographs in public places adequate in light of developments in drones?
• Robotics in general: Is the reasonable expectation of privacy affected by developments in robotics, and if so, how should we evaluate that?
Non-discrimination
• Surveillance drone: Should we adapt our interpretation of fair treatment in case surveillance drones allow us to discover new distinctions between groups of people?
• Robotics in general: Should we worry about non-enhanced humans being discriminated against in favour of enhanced humans?
Human rights in general
• Surveillance drone: Do criminal-procedure law and administrative-procedure law provide sufficient safeguards for human-rights protection if the government starts using surveillance drones on a large scale?
• Robotics in general: Is the EU Charter of Fundamental Rights up to date in light of developments in robotics?
New human rights
• Surveillance drone: Do we need a constitutional ‘right to be forgotten’ if surveillance drones would allow the ubiquitous tracking and publishing online of individuals’ movements?
• Robotics in general: Should we introduce a right to non-enhancement, or a right to imperfection, if robotics put pressure on people to enhance themselves?
(3a) Examples of questions at the level of reflection on robotics
Normative framework
• Surveillance drone: Does a human-rights analysis of surveillance drones lead to different regulatory outcomes than a utilitarian analysis?
• Robotics in general: Which robotics inventions fit in an ethics of the good life, and which regulatory implications does this have?
Human-rights theory
• Surveillance drone: Which positive obligations does the state have to regulate use of surveillance drones in horizontal relationships?
• Robotics in general: Can a duty to embrace ‘value-sensitive design’ in robotics be grounded in human-rights arguments?
Fundamental values (autonomy, human dignity, ...)
• Surveillance drone: Do surveillance drones have a panoptic effect, and if so, does this threaten the autonomy of citizens?
• Robotics in general: Which guidelines can be given for the development and use of robots so that they are compatible with, and where possible enhance, the value of human dignity?
(3b) Examples of questions at the level of reflection on human rights
Normative framework
• Surveillance drone: Is the “security versus liberty” frame fruitful to assess surveillance drones, or should a different normative frame be used?
• Robotics in general: Can we write a white paper on robotics regulation in Europe solely based on the normative framework of human rights, or should we also embrace utilitarian and communitarian perspectives on robotics?
Human-rights theory
• Surveillance drone: Can the horizontal protection of privacy still be adequately regulated through the notion of positive state obligations, if anyone can buy surveillance drones off the shelf to spy on neighbours?
• Robotics in general: Can robots at some point become a bearer of human (or fundamental) rights, and which criteria should we apply to make such an assessment?
Fundamental values (autonomy, human dignity, ...)
• Surveillance drone: Should we cling to 20th-century interpretations of autonomy if 21st-century society effectively becomes a Panopticon through, inter alia, surveillance drones?
• Robotics in general: How should we interpret the notion of “integrity of the person” if human-computer interfaces connect human brains to the Internet?
The challenge of mutual shaping (1)
• if law, technology, and normative outlooks co-evolve, how can we discuss regulation of emerging technologies?
  – how do we know how technology is going to evolve? (cf. the Collingridge dilemma)
  – when should we adapt technology to the law, and when adapt the law to technology?
  – how can we make normative judgements for future generations, who may have different values and outlooks?
• we can start with current technologies and short-term futures
  – assume normative outlooks are stable
  – focus on the mutual shaping of law and technology
  – balance technology neutrality and legal certainty
  – use Technology Assessment and Privacy/Regulatory Impact Assessment tools
  – experiment with technology and/or law
cf. Mark Coeckelbergh, Human Being @ Risk: Enhancement, Technology, and the Evaluation of Vulnerability Transformations (Springer, 2013)
The challenge of mutual shaping (2): how do we deal with the longer-term future?
• consider which normative outlooks we desire to be stable
  – which fundamental values? which human rights? which interpretation of these?
• use our imagination
  – design socio-technical scenarios
  – discuss the normative ‘look and feel’ of these scenarios
  – (science) fiction is an important tool to trigger our imagination
• ‘smart regulation’ (a schematic sketch of this cycle follows below)
  – flexible and self-learning
  – anticipate, monitor, evaluate, adapt
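As a purely schematic illustration of the anticipate-monitor-evaluate-adapt cycle mentioned above (not a method proposed in the slides; all class names, function names, and criteria are invented), ‘smart regulation’ can be read as a regulatory feedback loop:

```python
# Schematic illustration only: the 'smart regulation' cycle from the slide
# (anticipate, monitor, evaluate, adapt) read as a regulatory feedback loop.
# All names and criteria are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Rule:
    text: str
    version: int = 1

@dataclass
class RegulatoryCycle:
    rule: Rule
    observations: list = field(default_factory=list)

    def anticipate(self) -> None:
        # Sketch socio-technical scenarios before the technology matures.
        self.observations.append("scenario: wide consumer uptake of surveillance drones")

    def monitor(self) -> None:
        # Gather evidence on how the technology is actually being used.
        self.observations.append("observation: off-the-shelf drones used to spy on neighbours")

    def evaluate(self) -> bool:
        # Placeholder criterion: does practice diverge enough to revisit the rule?
        return len(self.observations) >= 2

    def adapt(self) -> None:
        # Revise the rule and restart the cycle with a fresh set of observations.
        self.rule = Rule(text=self.rule.text + " (revised)", version=self.rule.version + 1)
        self.observations.clear()

if __name__ == "__main__":
    cycle = RegulatoryCycle(Rule("Drone operators must respect private spaces."))
    cycle.anticipate()
    cycle.monitor()
    if cycle.evaluate():
        cycle.adapt()
    print(cycle.rule)
```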
Thank you for your attention!
e.j.koops@uvt.nl