Human Rights Defender Volume 27: Issue 3

Page 1

HUMAN RIGHTS DEFENDER

Australian Human Rights Institute

#RethinkRecycleRevive

VON WONG

Reflections on the UN Consultation on Big Data-Open Data

Putting human values into the machine

JOSEPH CANNATACI

EDWARD SANTOW

SPECIAL ISSUE: HUMAN RIGHTS AND TECHNOLOGY


PAGE 02

AUSTRALIAN HUMAN RIGHTS INSTITUTE
Website: www.humanrights.unsw.edu.au
Email: humanrights@unsw.edu.au
Twitter: @humanrightsUNSW
LinkedIn: Australian Human Rights Institute

MANAGING EDITORS:

DR CLAIRE HIGGINS is a Senior Research Associate at the Andrew and Renata Kaldor Centre for International Refugee Law at UNSW Sydney. She is the author of ‘Asylum by Boat: origins of Australia’s refugee policy’ (NewSouth, 2017) and was a Fulbright Postdoctoral Scholar at Georgetown University, Washington DC, in 2018. Claire is the Editor-in-Chief of the Human Rights Defender.

DR NOAM PELEG is a Lecturer at UNSW Law, where he teaches international children’s rights law and family law. Noam is the book review editor and a member of the editorial board of the International Journal of Children’s Rights.

DR PICHAMON YEOPHANTONG is an Australian Research Council DECRA Fellow and Senior Lecturer in International Relations and Development in the School of Humanities and Social Sciences, UNSW Canberra at the Australian Defence Force Academy.

GUEST AND STUDENT EDITORS:

GABRIELLE DUNLEVY is the Communications Officer at the Australian Human Rights Institute at UNSW Sydney. Before working at UNSW Sydney, Gabrielle reported on politics, justice, foreign affairs and trade as a foreign correspondent based in Jakarta. Her journalism has been published in many Australian media outlets as well as publications in Southeast Asia.

LAURA MELROSE is currently completing a dual Bachelor of Laws and International Relations at UNSW Sydney. As well as editing Human Rights Defender, she is engaged in a volunteer research project with the UNSW Human Rights Clinic on the housing rights of international students. She aspires to a career in human rights law and policy, with particular interest in the rights of Indigenous Australians and refugees and asylum seekers.

PHOTOS
Cover and Contents image by ©Von Wong: www.vonwong.com
The Human Rights Defender thanks Von Wong, Richard Eves, Sarah Holmlund, Marco Saroldi, tlorna and Shutterstock for their photographic contributions to this edition, and Stephanie Kay at On The Farm Creative Services for graphic design of the publication this year.

CORRECTION

The previous issue of Human Rights Defender, “Refugee Voices,” featured the wrong numeration. It should have read Volume 27, Issue 2. We apologise for the production error.

© 2018 Human Rights Defender. The views expressed herein are those of the authors. The Australian Human Rights Institute accepts no liability for any comments or errors of fact. Copyright of articles is reserved by the Human Rights Defender. ISSN 1039-2637. CRICOS Provider Code: 00098G.


PAGE 03

CONTENTS

04  Editorial – Gabrielle Dunlevy and Laura Melrose

06  E-waste: Enough is e-nough – Laura Melrose

08  #RethinkRecycleRevive – Von Wong

10  Reflections on the UN Consultation on ‘Big Data – Open Data’ – Joseph Cannataci and Elizabeth Coombs

13  Putting human values into the machine – Edward Santow

15  High-tech heat: Law enforcement in the Fourth Industrial Revolution – Laura Melrose

18  Access all areas? Telecommunications and human rights in Papua New Guinea – Sarah Logan and Miranda Forsyth

21  Understanding Aadhaar: The Unique Identification Authority of India and its challenges – Smriti Singh

25  Human rights drowning in the data pool: identity-matching and automated decision-making in Australia – Kerry Weste and Tamsin Clarke


PAGE 04

EDITORIAL EDITORIAL BY GABRIELLE DUNLEVY AND LAURA MELROSE

From the nimble fingers of young women who work on production lines until physical exhaustion,1 to the calloused hands of boys scavenging in rubbish heaps for discarded value, our technological devices pass from hand to hand numerous times over their lifespan. As our relationship with technology becomes more intimate, we need to step back and examine its effects on human rights.

In recognition of these growing challenges, Sydney hosted two major consultations this year on human rights and technology. In July, the UN Special Rapporteur on the Right to Privacy, Professor Joe Cannataci, visited Australia to consult ahead of his report into Big Data and Open Data. Participants discussed the vast stores of information that exist about each of us, which can be mined for insights and potentially released in the name of open government. In this edition of the Defender, Prof Cannataci writes that stronger privacy protections must stem from a human rights motivation.

Human Rights Commissioner Ed Santow similarly calls for ‘Putting human values into the machine’ in his article for this issue, recognising the incredible opportunities that technological innovation presents but also warning about the consequences of removing human decision-making and accountability. This year, the Australian Human Rights Commission held a major consultation2 not only on privacy but also on other issues, including how technology can be harnessed to help people with disabilities.


SO WHY THE SPOTLIGHT ON RIGHTS AND TECH THIS YEAR?

In the area of privacy, one motivator has been the EU legislation known as the GDPR. Remember the flood of emails you received back in June, in which companies asked you to confirm your spot on their mailing lists? The General Data Protection Regulation3 is a landmark framework governing how corporations responsibly gather and manage customer data.

Undoubtedly the biggest story in tech this year was Cambridge Analytica,4 the data analytics firm that harvested data from more than 50 million unknowing Facebook users in order to target US voters with personalised political advertising. It collected its data through an app called thisisyourdigitallife, a personality test. Facebook had known about this monumental data breach since 2015.

Another privacy-related question mark still lingers over the extent to which Russian-based groups managed to influence the 2016 US election. The lack of regulation in online advertising created the space for targeted Facebook ads – many of which could be traced back to the Russian propaganda group, the Internet Research Agency5 – to divide the population and push Trump’s candidacy.

Facebook is an important outlet for anti-corruption groups and other activists in Papua New Guinea, as Sarah Logan and Miranda Forsyth write in this issue. There, the government threatened to cut access to Facebook, but was unable to enforce the ban through the telco Digicel. The experience raises questions about the enjoyment of human rights where corporations dominate the digital landscape of developing markets. As this issue goes to print, Facebook is advertising for a human rights consultant6 to join its team, “to coordinate our company-wide effort to address human rights abuses, including by both state and non-state actors”.


PAGE 05

OUR RIGHTS ARE NOT ONLY BEING AFFECTED IN THE ONLINE SPACE.

Kerry Weste and Dr Tamsin Clarke write on two bills before Australian lawmakers that raise concerns about the potential for authorities to use identity-matching and automated decision-making in ways that amount to substantial privacy infringements. The collection of biodata by governments has also caused great alarm to civil libertarians in India. In this issue, Smriti Singh writes about Aadhaar, the universal identification scheme that promised to ease access to basic human rights like education and health for millions of Indians, but at the same time raises fears about an unprecedented ability to monitor citizens.

As new technologies develop, so too do the laws that exist to govern them – in theory. In an article examining the use of technologies in law enforcement, Laura Melrose explores some of the ways in which police and governments are using robots and artificial intelligence in their work, and the human rights implications of non-human governance.

Finally, this issue asks us to consider what happens when technology no longer serves us. Our spectacular cover image comes from the photo artist Von Wong, who is concerned with our diminishing natural environment. Electronic waste is the fastest growing category of household waste, and Von Wong wants us to treat recycling with the same enthusiasm that tech devotees display for new releases of their favourite smartphones. The cost to human health and safety from irresponsible disposal of gadgets is severe, as Laura Melrose writes in the piece alongside Von Wong’s images.

Like us, many of our readers work and study in universities. Research institutions play a critical role in developing responses to human rights threats and challenges – whether they be legal developments or material innovations. A future edition of the Defender will focus on new solutions emerging from labs and leading minds around the world.

It’s fitting that this tech issue of Human Rights Defender is the first to be available online. The potential to tell human rights stories in new ways, with greater interactivity, is exciting. Readers can visit the Australian Human Rights Institute website and subscribe to be notified when new issues are available.

1. China Labor Watch (2018) ‘Amazon profits from secretly oppressing its supplier’s workers: An investigative report on Hengyang Foxconn’, report, 10 June 2018, <http://www.chinalaborwatch.org/report/132>
2. Australian Human Rights Commission (2018) ‘Human Rights and Technology’, <https://tech.humanrights.gov.au/>
3. EU GDPR.ORG (2018) <https://eugdpr.org/>
4. C Cadwalladr and E Graham-Harrison (2018) ‘Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach’, The Guardian, 18 March 2018, <https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election>
5. B Barrett (2018) ‘For Russia, unraveling US democracy was just another day job’, Wired, 17 February 2018, <https://www.wired.com/story/mueller-indictment-internet-research-agency/>
6. Facebook (2018) <https://www.facebook.com/careers/v2/jobs/325820461297007/>


PAGE 06

E-WASTE: ENOUGH IS E-NOUGH LAURA MELROSE Laura Melrose is currently completing a dual Bachelor of Laws and International Relations at UNSW Sydney. As well as editing Human Rights Defender, she is engaged in a volunteer research project with the UNSW Human Rights Clinic on the housing rights of international students. She aspires to a career in human rights law and policy, with particular interest in the rights of Indigenous Australians and refugees and asylum seekers.

Consumer demand for personal electronics such as phones, tablets and laptops is at an all-time high. With this thirst for new innovations comes a high degree of obsolescence for ‘old’ models, resulting in a worldwide increase in e-waste. In 2016, the world generated 44.7 million metric tonnes of electronic waste, or e-waste, and less than 20 per cent of it was properly recycled.1

E-waste comprises anything that is powered by a battery or an electrical plug. It is often rich in valuable materials like gold, copper and palladium, and in 2016 the UN estimated the value of the world’s e-waste at US$55 billion – more than the GDP of most countries.2 In addition to valuable metals, e-waste is also high in toxic chemicals such as arsenic, mercury and lead. When these materials aren’t disposed of properly, they can create a host of problems for surrounding communities and environments, such as toxins leaching into soil and pollution of air and water.3

In large e-waste graveyards, discarded devices are disposed of as landfill and left for scavengers to pick over, despite the many health risks. Ghana is home to the largest e-waste scrapyard in the world, nicknamed Agbogbloshie. Around 700 people, some as young as 12 years old, make a living by smashing devices and burning away useless rubber and plastics to get to the valuable deposits inside.4
Injuries are common, including burns, wounds and back problems, and many people suffer from respiratory problems, skin infections, chronic nausea, anorexia and migraines. Most die of cancer before they reach 30.5

The human rights implications of these e-waste graveyards are vast. They threaten peoples’ rights to life, health, food and a safe environment. The methods of recycling materials are dangerous and contravene international standards on workers’ rights and child labour.6 In 1989, the Basel Convention on the Control of Transboundary Movements of Hazardous Wastes and their Disposal was adopted to govern the proper processes for dealing with toxic waste. The Basel Action Network (BAN) is a US-based NGO that engages in advocacy and advisory work surrounding this issue, and investigates claims of improper recycling and breaches of the convention.

Domestically, investigations into Australian e-waste recycling programs have highlighted a need for stricter regulation and scrutiny. Recently, a GPS tracker hidden in an old computer marked for recycling by stationery retail chain Officeworks turned up in a junkyard in Thailand, and BAN says that 14% of trackers in similar schemes that promise recycling have wound up in landfill.7 Instead of exporting e-waste to developing countries, the governments of Australia and other advanced economies need to implement more comprehensive laws surrounding the proper disposal of electronic waste. Much of the current policy is riddled with holes: a mixed bag of voluntary industry standards, national laws, and inconsistent regulations from state and local authorities.

©Von Wong www.vonwong.com

From a practical perspective, Australia is already leading the way in technological development to transform e-waste into useful new products. This year, UNSW Sydney’s Centre for Sustainable Materials Research and Technology (SMaRT Centre) launched the world’s first e-waste microfactory. In deconstructing old devices like smartphones and laptops, elements are separated at the micro-level and transformed into useable and valuable materials – circuit boards become copper and tin, and glass and plastics can be converted into the micromaterials used in industrial ceramics and 3D printing.8

As policy and practices evolve, change is also needed at an individual level. With innovation on an exponential rise, we need to not only recycle our old devices, but also take a hard look at the global impacts of our levels of technological consumption.

1. Baldé, C. P. et al (2017) ‘The Global E-waste Monitor 2017’, United Nations University, International Telecommunication Union & International Solid Waste Association, <https://www.itu.int/en/ITU-D/Climate-Change/Documents/GEM%202017/Global-E-waste%20Monitor%202017%20.pdf>
2. UN News (2017) ‘Electronic waste poses ‘growing risk’ to environment, human health, UN report warns’, 13 December 2017, <https://news.un.org/en/story/2017/12/639312-electronic-waste-poses-growing-risk-environment-human-health-un-report-warns>
3. J Heimbuch (2011) ‘E-waste harms human health; new research details how’, Treehugger, 3 June 2011, <https://www.treehugger.com/clean-technology/e-waste-harms-human-health-new-research-details-how.html>
4. M Hardy (2018) ‘The hellish e-waste graveyards where computers are mined for metal’, Wired, 1 August 2018, <https://www.wired.com/story/international-electronic-waste-photographs/>

5. K McElvaney, photographer (2014) ‘Agbogbloshie: the world’s largest e-waste dump – in pictures’, The Guardian, 28 February 2014, <https://www.theguardian.com/environment/gallery/2014/feb/27/agbogbloshie-worlds-largest-e-waste-dump-in-pictures>
6. Centre for International Environmental Law (2015) ‘Human rights impacts of e-waste’, report, <https://www.ciel.org/wp-content/uploads/2015/10/HR_EWaste.pdf>
7. A Salleh (2018) ‘E-waste exports highlight need for tighter controls on ‘unethical and irresponsible’ trade’, ABC News, 16 August 2018, <http://www.abc.net.au/news/science/2018-08-16/australian-e-waste-exports-to-developing-countries-unethical/10119000>
8. UNSW Media (2018) ‘World-first e-waste microfactory launched at UNSW’, UNSW Sydney, 4 April 2018, <http://newsroom.unsw.edu.au/news/science-tech/world-first-e-waste-microfactory-launched-unsw>


PAGE 08

©Von Wong www.vonwong.com

#RETHINKRECYCLEREVIVE

Technology is indispensable to photo artist Von Wong, whose ambitious works wield the power of social media to capture the mass imagination. If his images go viral, that’s only half of Von Wong’s job done. The other powerful motivator for the Canadian-born artist is to raise awareness of pressing social and environmental issues.

“I want to start creating work that supports the efforts of individuals and organisations around the world trying to protect the world we live in - whether it’s animals, people or places,” he says. “I want to help tell their stories in an epic and beautiful way.”



PAGE 09

Earlier this year, he turned his lens on tech, to start a conversation about the mass of waste we are creating with our obsolete gadgets. Working with computer giant Dell, Von Wong used 1,860kg of e-waste to build the futuristic sets. Incredibly, this is the approximate amount of e-waste an American might generate over their lifetime.1

Fifty volunteers summoned on social media spent 10 days sorting through the mountains of e-waste. They built a wall of peripherals out of the mess of mice, adapters and wires they untangled. A sand artist helped with the concentric circle design of the laptops, and a body painter and make-up artist transformed model Clara Cloutier into a cyborg over eight hours. This issue’s cover features one of the dramatic images from the shoot.

The elaborate sets lasted only one day before being returned to Dell’s recycling plant – but Von Wong hopes the images will live on, and continue to be shared and discussed. Now that electronics are so quickly considered outdated and replaced, recycling programs that divert these products from landfill and recycle them responsibly, without negative effects on communities, are more important than ever.

VON WONG To find out more follow Von Wong on Instagram @VonWong and on Twitter @thevonwong

1. S Leahy (2017) ‘Each US Family Trashes 400 iPhones’ Worth of E-Waste a Year’, National Geographic, 13 December 2017, <https://news.nationalgeographic.com/2017/12/e-waste-monitor-report-glut/>


PAGE 10

REFLECTIONS ON THE UN CONSULTATION ON ‘BIG DATA – OPEN DATA’

JOSEPH CANNATACI

Professor Joe Cannataci was appointed UN Special Rapporteur on the Right to Privacy in July 2015. He is the Head of the Department of Information Policy & Governance at the Faculty of Media & Knowledge Sciences of the University of Malta. He also holds the Chair of European Information Policy & Technology Law within the Faculty of Law at the University of Groningen, where he co-founded the STeP Research Group. An Adjunct Professor at the Security Research Institute and the School of Computer and Security Science at Edith Cowan University, Australia, Joe dedicates a considerable amount of his time to collaborative research. He has written books and articles on data protection law, liability for expert systems, legal aspects of medical informatics, and copyright in computer software, and has co-authored various papers and textbook chapters on self-regulation and the Internet, the EU Constitution and data protection, online dispute resolution, data retention and police data. His latest book, ‘The Individual & Privacy’, was published by Ashgate in March 2015. In 2002 he was decorated by the Republic of France and elevated to Officier dans l’ordre des palmes académiques. He has held or currently holds research grants from the British Academy, the Council of Europe, COST, UNESCO and the European Commission.

ELIZABETH COOMBS

Dr Elizabeth Coombs is chair of the UN SRP’s Taskforce on ‘Privacy and Personality’. Elizabeth is the immediate past NSW Privacy Commissioner – a position she held for six years. She has had an extensive public service career, holding CEO and senior positions in the NSW public sector, and independent statutory roles such as Commissioner for Fair Trading and Commissioner, Local Government Grants Commission. Her chief executive experience includes the Department of Fair Trading, Juvenile Justice and the Department for Women. Elizabeth is now based at the University of Malta, addressing matters related to the UN SRP mandate. Her doctorate involved cross-disciplinary research in political economy, classical economics and industrial relations.



PAGE 11

Allow me to declare my bias. I love Australia and regret that I do not spend as much time there as I would like, and as is warranted. The protection of the right to privacy in Australia is high on my agenda. Australia has the potential to be an important regional leader in protecting the right to privacy but it appears to be heading in a different direction.

In 2016, a ten per cent sample of thirty years’ worth of Australians’ Medicare and Pharmaceutical Benefits Scheme data, supposedly de-identified, was publicly released by the Australian Department of Health. Before it was taken offline, the dataset was reportedly downloaded 1,500 times. Senior Australian officials have told me that any re-identification was only theoretical. But let me dispel this notion – the names of people who, with very high degrees of probability, were potentially able to be re-identified were provided to the Government by a research team that advocates for ‘responsible re-identification in the public interest’ of datasets put in the public domain by the Government.1 I understand that the names of Australians who are potentially re-identifiable have been provided on more than one occasion.

Other developments, such as the proposed Telecommunications and Other Legislation Amendment (Assistance and Access) Bill 2018, coupled with Australia’s limited human rights and privacy protections – there is no constitutional protection for privacy, no Bill of Rights that enshrines privacy, and no tort of privacy – are of major concern to me. Unlike its neighbour New Zealand, Australia has failed the European adequacy assessments for data protection. This means that Australia does not have preferred trading status.

THE UNITED NATIONS PRIVACY MANDATE

In 2015, I was honoured to be appointed as the UN’s inaugural Special Rapporteur on the Right to Privacy, and in 2018, I was further honoured when my mandate was extended to 2021. In my first term I established five thematic action stream taskforces. The first was on Big Data – Open Data, chaired by Professor David Watts, former Victorian Privacy Commissioner. This taskforce considered the human right to privacy and the advent of modern data technologies enabling the collection, storage and analysis of huge quantities of data – commonly known as ‘Big Data’. The Taskforce also considered the trend by governments to release into the public domain datasets held by public sectors – known as ‘Open Data’.2

In October 2017, I presented to the General Assembly the interim report of the Big Data – Open Data taskforce and advised I would conduct an international consultation in Australia before finalising the report in 2018.3 I’m pleased to say that the consultation event held in Sydney in July of this year was extremely successful. It addressed best practice in deploying Big Data – Open Data, and raised for me a number of important issues: the need for greater rigour in releases of ‘Open Data’, Indigenous Data Sovereignty, the limitations of the Australian Consumer Data Right vis-à-vis international data protection reform, and the need for a remedy for serious invasions of privacy.

INDIGENOUS DATA SOVEREIGNTY

I have studied the privacy culture of Indigenous Australians for many years. Few people realise that theirs is one of the most sophisticated, lived expressions of privacy, involving individual, familial and group privacy implemented through behaviours, rites and practices surrounding private and communal spaces.4 I was extremely pleased that the consultation explored, albeit in a small manner, Indigenous Data Sovereignty – a global movement concerned with the collective rights of Indigenous peoples to govern the collection, access, analysis, management, dissemination and re-use of data both about them and collected from them. Indigenous Data Sovereignty is supported by Indigenous peoples’ inherent rights of self-determination and governance as described in the UN Declaration on the Rights of Indigenous Peoples.5

CONSUMER DATA RIGHT

The consultation discussed the introduction of the Australian Consumer Data Right (CDR). As the Chairman of the ACCC has stated, the CDR is a data portability right; it is not a privacy measure or a data protection mechanism.6 It will give consumers greater ease in moving their information from one institution to another. The discussion at the consultation saw this ‘right’ as potentially enabling greater accessing of personal data by third parties under the mantle of enhancing consumer rights and increasing competition in the marketplace.7


PAGE 12

PERSONAL INFORMATION AND ‘OPEN DATA’

Let me be clear: Big Data can provide significant benefits, but these benefits need not come at the expense of the right to privacy. What is of concern is the release as Open Data of information that can identify, or lead to the identification of, individuals and communities. A key challenge for releasing data publicly as Open Data is the absence of a way to unambiguously determine whether there is personal information in supposedly de-identified datasets or aggregated data. In my report just provided to the UN General Assembly, I recommended that detailed unit-record level data (identifiable data) should not be disclosed or published online without the data subject’s consent.8
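The mechanics of re-identification are worth making concrete. The fragment below is an editorial sketch only – the dataset, field names and values are invented, and it is not drawn from the Department of Health release discussed above – but it shows how records stripped of direct identifiers can be re-identified by joining them to auxiliary data on shared quasi-identifiers:

```python
# A minimal linkage-attack sketch. All data here is invented for illustration.
import pandas as pd

# "De-identified" unit-record data released as Open Data: names removed,
# but quasi-identifiers (birth year, postcode, sex) retained.
released = pd.DataFrame({
    "birth_year": [1958, 1983, 1983, 1971],
    "postcode":   ["2031", "2031", "2170", "3056"],
    "sex":        ["F", "M", "M", "F"],
    "diagnosis":  ["diabetes", "asthma", "hypertension", "depression"],
})

# Auxiliary data an attacker can plausibly obtain (an electoral roll,
# social media profiles, a data broker) that does carry names.
auxiliary = pd.DataFrame({
    "name":       ["A. Citizen", "B. Resident"],
    "birth_year": [1958, 1983],
    "postcode":   ["2031", "2170"],
    "sex":        ["F", "M"],
})

# Join on the quasi-identifiers: any released record matching exactly one
# known person is re-identified, along with its sensitive attribute.
linked = auxiliary.merge(released, on=["birth_year", "postcode", "sex"])
print(linked[["name", "diagnosis"]])
```

Once enough quasi-identifiers line up, the ‘theoretical’ risk becomes a name attached to a diagnosis – which is why the recommendation above is directed at unit-record level data in particular.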

Without the ability to seek remedy for serious breaches of privacy, such breaches are rendered even more grievous. Privacy is not an esoteric notion but a critical mechanism in a digital society, for example, to protect women fleeing domestic violence.

A robust framework for the application of big data analytics also requires the ability to seek remedy, the importance of which has been recommended by Law Reform Commissions and bodies within Australia.9 The right of access to remedy is laid down in international instruments that Australia has signed and ratified.10 Australia is a party to the seven core international human rights treaties, including the International Covenant on Civil and Political Rights, in which, at Article 17, the right to privacy is articulated.

Strengthening the structural arrangements for regulating privacy protection and compliance in Australia requires starting from a human rights perspective and using approaches that meet international acceptance and standards.

I wish to thank UNSW Sydney, the Optus Macquarie University Cyber Security Hub and the University of Technology Sydney for supporting me to conduct the consultation in Sydney, as well as government representatives, civil society organisations and individuals.11 I welcome ongoing dialogue and collaboration on the report, and other issues concerning the promotion and protection of the right to privacy.

1. V Teague, C Culnane and B Rubinstein (2017) ‘The simple process of re-identifying patients in public health records’, Pursuit, University of Melbourne, 18 December 2017, <https://pursuit.unimelb.edu.au/articles/the-simple-process-of-re-identifying-patients-in-public-health-records>
2. ‘Report of the Special Rapporteur on the Right to Privacy’ (advance unedited version), UN General Assembly, A/72/43103, seventy-second session, 19 October 2017, <http://www.ohchr.org/Documents/Issues/Privacy/A-72-43103_EN.docx>. The report was submitted after the deadline to reflect the most recent developments.
3. Written submissions raised matters concerning vulnerable groups; principles for data analytics; the EU General Data Protection Regulation; the sensitive nature of health information; the approaches of different countries; the complexities of big data used in legal proceedings; and the possible impact of free trade proposals.
4. J Cannataci (ed) (2015) ‘The Individual and Privacy’, The Library of Essays on Law and Privacy, Vol 1(1), Ashgate; J Cannataci (2016) ‘Public lecture by UN Privacy Rapporteur, Joe Cannataci’, lecture, ANU College of Law, Finkel Lecture Theatre, 17 May 2016.
5. T Kukutai and J Taylor (eds) (2016) ‘Data Sovereignty for Indigenous Peoples: Current practice and future needs’, pp 1-24 in Indigenous Data Sovereignty: Towards an Agenda, CAEPR Research Monograph 2016/34 (Canberra: ANU Press); M Snipp (2016) ‘What does data sovereignty imply: what does it look like?’, pp 39-56 in the same volume.
6. Australian Government (2018) ‘Consumers’ right to their own data is on its way’, Australian Competition and Consumer Commission, press release, 16 July 2018, <https://www.accc.gov.au/media-release/consumers-right-to-their-own-data-is-on-its-way>
7. J Cannataci, UN Special Rapporteur on the Right to Privacy (2018) ‘Session 8: Critical Elements of Best Practice for Big Data – Open Data’, UN Consultation: Big Data – Open Data, Sydney, Australia, 27 July 2018.
8. ‘Report of the Special Rapporteur on the Right to Privacy’ (advance unedited version), UN General Assembly, A/73/45712, seventy-third session, 17 October 2018, <https://www.ohchr.org/Documents/Issues/Privacy/SR_Privacy/A_73_45712.docx>. The report was submitted after the deadline to reflect the most recent developments.
9. NSW Law Reform Commission, April 2009; Victorian Law Reform Commission, May 2010; Australian Law Reform Commission, September 2014; South Australian Law Reform Institute, March 2016; and the NSW Parliament Standing Committee on Law and Justice, March 2016.
10. Information on Australian implementation of the ICCPR can be found at: Department of Foreign Affairs and Trade (DFAT), ‘International Covenant on Civil and Political Rights’, Australian Government, updated 30 August 2010, <http://www.info.dfat.gov.au/Info/Treaties/Treaties.nsf/AllDocIDs/8B8C6AF11AFB4971CA256B6E0075FE1E>
11. Support was also provided by the UN Office of the High Commissioner for Human Rights; the Allens Hub, UNSW Sydney; L-Università ta’ Malta; the University of Groningen, The Netherlands; Grand Challenges, UNSW Sydney; the Australian Human Rights Institute at UNSW Sydney; and the Schools of Mathematics, Social Sciences, Built Environment, Computing and Engineering Sciences, and Law.


PAGE 13

PUTTING HUMAN VALUES INTO THE MACHINE EDWARD SANTOW Edward Santow has been Human Rights Commissioner at the Australian Human Rights Commission since August 2016. Ed leads the Commission’s work on detention and implementing the Optional Protocol to the Convention Against Torture (OPCAT); refugees and migration; human rights issues affecting LGBTI people; counter-terrorism and national security; technology and human rights; freedom of expression; and freedom of religion. Ed’s areas of expertise include human rights, public law and discrimination law. He is a Senior Visiting Fellow at the University of New South Wales (UNSW), and serves on a number of boards and committees, including the Australia Pro Bono Centre. In 2009, Ed was presented with an Australian Leadership Award, and in 2017, he was recognised as a Young Global Leader by the World Economic Forum. From 2010-2016, Ed was chief executive of the Public Interest Advocacy Centre, a leading non-profit organisation that promotes human rights through strategic litigation, policy development and education. Ed was previously a Senior Lecturer at UNSW Law School, a research director at the Gilbert + Tobin Centre of Public Law and a solicitor in private practice.

Human rights must be at the centre of the new technology revolution, writes Human Rights Commissioner Edward Santow.

Technology is moving faster than at any time in human history. This brings vast potential for economic, social and other benefits. But there are also risks. Many are only just coming into focus: that our news is vulnerable to manipulation; that good workers can lose their jobs because of a poorly-written algorithm; that artificial intelligence can create new and hidden forms of discrimination.

To date, human rights have not been at the centre of the new technology revolution. This must change. As new technology reshapes our world, we should pursue innovation that reflects our values: equality, fairness and liberal democracy. And we must also combat the threat that new technology presents of worsening inequality and disadvantage.

On 24 July 2018, the Australian Human Rights Commission launched the Human Rights & Technology Project with a major conference in Sydney. About 450 people from civil society, government and industry gathered to discuss the far-ranging implications of new technologies for the promotion and protection of human rights. It was clear that Australians expect our rights to be protected and promoted in the design, development and implementation of new technologies.

We have seen how social media can be a tool for inclusion and accountability; a means to amplify silenced voices and help atomised individuals form more caring communities. But those same platforms have also fuelled the emergence of new avenues for hate, discrimination and the erosion of privacy.

In our work on human rights and technology, the Commission is particularly interested in the phenomenon of algorithmic bias. An algorithm is simply a set of rules used to make a calculation or prediction, or to solve a problem. Algorithmic bias arises where an algorithm trawls through a source of aggregated personal information and produces results that exhibit bias against a particular group. Data isn’t neutral. Where an algorithm relies on historical data, historical prejudice can be imported into the algorithm’s process.
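To see how easily this happens, consider a toy sketch – invented data, and an off-the-shelf classifier used purely for illustration – in which a hiring model is trained on historically skewed outcomes:

```python
# A toy sketch of algorithmic bias: a model trained on historically skewed
# hiring decisions learns to penalise the historically excluded group.
# All data is invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical records: [years_of_experience, is_woman]; outcome 1 = hired.
# In this invented history, women were rarely hired regardless of
# experience - a prejudice of the past, not a fact about ability.
X = np.array([
    [5, 0], [7, 0], [3, 0], [9, 0], [6, 0],   # men: mostly hired
    [5, 1], [7, 1], [3, 1], [9, 1], [6, 1],   # women: mostly rejected
])
y = np.array([1, 1, 0, 1, 1,
              0, 0, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Two candidates identical in every respect except gender:
candidates = np.array([[7, 0], [7, 1]])
print(model.predict_proba(candidates)[:, 1])
```

Nothing in the code mentions prejudice, yet the second candidate receives a markedly lower predicted probability of being hired: the historical pattern has been imported wholesale into the model.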


PAGE 14

For example, if an algorithm is applied to historical data on employment in the construction industry, the algorithm would be more likely to yield results that diminish the role of women, because women historically have been under-represented in this industry. So it is, then, that the groups affected by algorithmic bias tend to match those who have historically suffered discrimination – particular ethnic groups, people with disability, women and others.

So far, the spotlight has fallen mostly on examples of algorithmic bias overseas, such as a ProPublica investigation that found an AI-powered tool used by some US justice systems to help make decisions about bail and sentencing was twice as likely to classify African American defendants as medium or high risk compared to similar Caucasian defendants.1 The tool, known as the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), was instructed not to consider race as a factor. The problem almost certainly lay with the historical data relied on by COMPAS. We know African Americans have faced more police scrutiny, especially in the past. They have been more likely to receive heavier sentences and more likely to be convicted of crimes associated with poverty. To the extent the algorithm relies on such historical data, there is a significant risk that those historical distortions are picked up and entrenched by the AI-powered assessment tool.

In her 2018 book, Made by Humans: The AI Condition, Ellen Broad considered how this phenomenon could be present in Australia. She described the development of a model in NSW to predict recidivism rates in respect of domestic violence. The model used the NSW Bureau of Crime Statistics and Research (BOCSAR) Reoffending Database, taking data on age, gender, postcode, Indigenous status, prior criminality and/or history of domestic violence. The data showed reoffenders were more likely to be young, male, Indigenous and living in a socioeconomically disadvantaged postcode. The model was then trained to predict the likelihood of domestic violence recidivism. It worked relatively well in some ways. However, it over-estimated the likelihood of young people and Aboriginal people committing domestic violence, and under-estimated the risk among people over 45. It is reasonable to suspect that the model was making these errors because it assigned higher risk to groups of people who have historically been subjected to closer and harsher scrutiny by the criminal justice system.

Another Australian example that warrants attention is the algorithm used by the NSW Police to decide who is placed on the Suspect Target Management Plan (STMP). Individuals on the STMP are subjected to close monitoring by the police. Although less than 3 per cent of the state’s population is Indigenous, it was revealed in 2017 that more than half of those on the STMP list were Aboriginal or Torres Strait Islander.

The Commission is working to identify when these sorts of problems of algorithmic bias might arise and how we can address them. But these threats to our rights shouldn’t inhibit exciting efforts to use new technology as a tool to foster inclusion and accessibility. Speaking at our conference, Emma Bennison from Blind Citizens Australia said she was brought to tears by an app that connects people who are blind or have low vision to a trained professional agent who gives hands-free assistance.

“Not since the time that I got my first personal computer in 1988 have I experienced such a strong emotional response to a piece of technology as I have to this, because it really does spell liberation for me,” Bennison said.

“I was able to … come out of an unfamiliar building, decide I wanted to go and have a coffee, call up AIRA and go: ‘I’d like to find a coffee shop nearby’ and they were able to direct me to the coffee shop, help me read the menu and find a table.

“All of that might sound really simple to all of you but for me that was just incredibly liberating.”

The Human Rights & Technology Project aims to create a roadmap for responsible and inclusive innovation in Australia. We know the only way to do this is by working collaboratively with industry, government, academia and civil society. The Commission has released an issues paper to assist people to participate in this process and is taking written submissions and hosting consultations. Human rights practitioners are encouraged to get involved. A discussion paper is due to be published in the first half of 2019, and a final report and recommendations are expected by early 2020.

To learn more or make a submission visit tech.humanrights.gov.au

1. Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner (2016) ‘Machine Bias’, ProPublica, 23 May 2016, <https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing>



PAGE 15

HIGH-TECH HEAT: LAW ENFORCEMENT IN THE FOURTH INDUSTRIAL REVOLUTION

LAURA MELROSE

By nature, the law is reactive – it cannot enforce and protect what does not yet exist. With technological innovation moving at such rapid speed, there is often a delay between when technologies are put into practice and when the regulations surrounding their use are comprehensive enough to manage the moral and legal issues they present. This can have serious consequences for civil liberties, privacy and the application and protection of human rights. The gradual broadening of police powers, the diminishing of privacy rights, algorithms that predict criminal behaviour, and the use of robots in policing are just some of the modern phenomena that pose new challenges to the traditional legal system.

NEW SOUTH WALES, AUSTRALIA

New South Wales Police (NSW Police) use a predictive policing system under which people are classified as being at ‘high risk’ of offending and subsequently placed on a watch list. Having one’s name on this list gives the police extra powers. They can stop and search without reasonable cause or a warrant, keep tabs on a person’s movements, and even perform unannounced home visits at all hours of the day or night. The watch list forms part of the Suspect Target Management Plan, or STMP. It comprises approximately 1800 people, over 55% of whom are Indigenous1 and approximately 400 of whom are children.2

NSW Police has repeatedly refused to disclose the methods by which people are placed on the list, which raises issues of accountability. In a 2017 report, the Youth Justice Coalition (YJC) surmised that the selection process may be some sort of algorithm developed to calculate a person’s risk of offending.3 The risk assessment tools are not available to either the listed individual or the public; however, the over-representation of Indigenous people on the suspect list indicates a racial skew. Algorithms are written by people, and are therefore susceptible to algorithmic bias.4

The STMP represents a dangerous shift in the way police work is conducted. Traditionally, policing is about investigating a crime that has already been committed and determining who is responsible. The STMP operates on a model of algorithmic guesswork, making predictions about who may be about to commit a crime and attempting to prevent that crime from occurring. In doing so it curtails the liberties of citizens who have not actually broken the law. As of the 2016 Census, Aboriginal and Torres Strait Islander peoples represent 2.8 per cent of the Australian population.5 For Indigenous people to then make up more than half of the secret STMP watch list points toward racial profiling on a staggering scale.
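The scale of that over-representation can be made concrete with a back-of-envelope calculation. The figures are those cited above; the arithmetic itself is an editorial illustration:

```python
# Rough over-representation arithmetic using the figures cited above.
stmp_size = 1800                        # approximate number of people on the STMP
indigenous_share_of_stmp = 0.55         # over 55% of the list is Indigenous
indigenous_share_of_population = 0.028  # 2.8% of Australians (2016 Census)

# Representation on the list relative to representation in the population.
ratio = indigenous_share_of_stmp / indigenous_share_of_population
print(f"Over-represented by a factor of about {ratio:.0f}")  # ~20x

# Implied number of Indigenous people on the list.
print(f"Roughly {stmp_size * indigenous_share_of_stmp:.0f} people")  # ~990
```

On those figures, Indigenous people appear on the list at roughly twenty times the rate their share of the national population would suggest – of the same order as the ‘19 times’ figure quoted below.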


PAGE 16

A media release by Greens MP David Shoebridge asserts that Indigenous young people are about 19 times more likely than non-Indigenous youth to be stopped by police under the STMP; for those under the age of 15, that figure increases to 31 times.6

The effect of this style of policing on Indigenous communities is incredibly harmful. It creates distrust, anger and division where police should be attempting to foster cooperation and conciliatory practices, and creates a risk that tense relations will continue as these young people grow up. ‘This type of heavy-handed proactive policing is very damaging to the relationship between young people and the police,’ said co-author of the YJC report Dr Vicki Sentas, Senior Lecturer at UNSW Law and the Redfern Legal Centre Police Powers Clinic.7

NSW Police responded to media reports on the STMP by describing the program as a ‘crime prevention strategy’ which is overseen by a senior police officer in every case. It said people identified in the STMP are treated with ‘respect and tolerance’.8 David Shoebridge argues, ‘There is no legislative basis for the STMP and there is no expert evidence that says it reduces offending or makes the community any safer.’ Further, the YJC report asserts that the STMP undermines key rehabilitation objectives of the justice system. In a country that has seen 136 Indigenous deaths in custody in the past decade, including 16 in NSW, police attention should be on developing practices that keep Indigenous people out of custody, not on increasing their likelihood of being incarcerated.9

NEW ZEALAND

Across the ditch, customs officers in New Zealand can now require the digital passwords of travellers who are entering the country. The ‘digital strip search’ law applies to foreign visitors and citizens alike, and people who refuse face a fine of NZ$5,000 and may have their device confiscated.10 In order to demand passwords to laptops, smartphones or other devices, customs officers must have a ‘reasonable cause to suspect’ criminal activity, but once a device is unlocked, they are permitted to search, download and copy any data that is saved on the device itself. This does not include cloud-based data, internet search histories or documents.

Critics of the new policy have called it a blatant breach of privacy. ‘Smartphones have become an extension of our very selves,’ says University of Wollongong Professor Katina Michael.11 They contain vast amounts of sensitive personal information, as well as intelligence, intellectual property, and competitive or industry data. NZ Council for Civil Liberties chairman Thomas Beagle described the laws as a grave invasion of privacy not only for the owner of the device, but for every person within their circle of communication.12

Australia is likely to follow New Zealand’s example in the near future. Currently, Australian Border Force (ABF) officers are empowered to examine devices, hold them for up to two weeks and/or copy the data from them, but cannot request passwords. Home Affairs Minister Peter Dutton is seeking to change that, and has recently tabled prospective legislation that, if passed, will enable ABF officers to compel the unlocking of devices, and increase the maximum penalty for non-compliance from two to five years.13

DUBAI, UNITED ARAB EMIRATES

One technological step further and we are confronted with the futuristic but very current reality of robot police officers. In 2017, Dubai rolled out the world’s first real RoboCop, an adapted REEM humanoid robot that sports a computer touch-screen ‘chest’ for crime reporting and photographic ‘eyes’ to identify suspects and transmit live images to human police. While the intention is mainly to deploy this prototype at tourist hubs as a novelty alongside Lamborghini and Ferrari patrol vehicles, that may be only the beginning.

Dubai aims to have a 25% robot police force by 2030, including machines that can pursue, biometrically scan and identify offenders.14 Brigadier Khalid Nasser Al Razooqi declared that by the same year, Dubai would see its first ‘smart police station’ operating without the need for human employees. Recent years have seen heavy investment in artificial intelligence technologies to help predict crime and accidents and to alleviate traffic congestion. Facial recognition is being integrated into CCTV, and drones, robots and automated patrol vehicles will be deployed to manage simpler tasks like issuing fines and monitoring parking and traffic offences.

The use of artificial intelligence and robotics in policing raises considerable ethical issues. When human beings make mistakes, they can be held to account via established mechanisms of retribution. Machines, however, cannot be punished or sanctioned in the same way. Vicarious responsibility is significantly more complicated – who is to blame: the manufacturer, programmer, owner, operator, or someone else entirely? This raises further questions. What if a robot arrests someone, and in doing so, injures that person or an innocent bystander? Will the robots be armed like regular police officers?


PAGE 17

RUSSIA

While Dubai has ruled out giving its robot police officer a gun, Russia is developing a combat-ready android called FEDOR (Final Experimental Demonstration Object Research) that can apparently lift weights, operate motor vehicles, and fire weapons. Deputy Prime Minister Dmitry Rogozin advocates for weapons training as a way to build the machine’s artificial intelligence, helping it learn to prioritise and make instant decisions.15

Illustration: ©Sarah Holmlund/Shutterstock


The issues with FEDOR are many, but two of the most obvious are algorithmic bias – as discussed with regard to the STMP – and accountability. Artificial intelligence technology is simply not advanced enough for this to be safe, and generalised AI, or ‘learning from one domain of knowledge and then transferring it to another’ like a human does, is still in the distant future.16

Integrating new technologies into police work requires careful consideration. On one hand, it has the potential to significantly reduce harm to citizens and human police, and it facilitates efficient information sharing; on the other, it poses threats to privacy rights, ethical justice principles and civil liberties. One thing is clear – as each new innovation is introduced, individual human rights protection will continue to vie for attention against security and defence interests.

1. M McGowan (2017) ‘More than 50% of those on secretive NSW police blacklist are Aboriginal’, The Guardian, 11 November 2017, <https://www.theguardian.com/australia-news/2017/nov/11/more-than-50-of-those-on-secretive-nsw-police-blacklist-are-aboriginal>
2. D Shoebridge (2018) ‘NSW Police secret blacklist is nothing more than racist policing’, The Greens, media release, 15 March 2018, <https://davidshoebridge.org.au/2018/03/15/media-release-nsw-police-secret-blacklist-is-nothing-more-than-racist-policing/>
3. V Sentas and C Pandolfini (2017) Policing Young People in NSW: A Study of the Suspect Targeting Management Plan, report (Sydney: Youth Justice Coalition NSW), <https://www.piac.asn.au/wp-content/uploads/2017/10/17.10.25-YJC-STMP-Report.pdf>
4. For more on algorithmic bias, see Edward Santow’s article ‘Putting human values into the machine’ in this issue of the Human Rights Defender.
5. Australian Bureau of Statistics (2017) ‘Census: Aboriginal and Torres Strait Islander population’, Australian Government, media release, 27 June 2017, <http://www.abs.gov.au/ausstats/abs@.nsf/MediaRealesesByCatalogue/02D50FAA9987D6B7CA25814800087E03?OpenDocument>
6. D Shoebridge (2018) op. cit.
7. Quoted in ‘Secret NSW Police policy harmful to young people’, Public Interest Advocacy Centre, media release, 25 October 2017, <https://www.piac.asn.au/2017/10/25/secret-nsw-police-policy-harmful-to-young-people/>
8. A Lavoipierre (2017) ‘Children as young as 10 ‘on secret NSW Police blacklist’; suspects ‘routinely harassed’’, ABC News, 25 October 2017, <https://www.abc.net.au/news/2017-10-25/children-as-young-as-10-on-police-blacklist-routinely-harassed/9083108>
9. n.a. (2018) ‘Deaths inside: Indigenous Australian deaths in custody’, The Guardian, 31 August 2018, <https://www.theguardian.com/australia-news/ng-interactive/2018/aug/28/deaths-inside-indigenous-australian-deaths-in-custody>
10. S Hickey and U Nedim (2018) ‘New Zealand customs can now require your passwords’, Sydney Criminal Lawyers, 8 October 2018, <https://www.sydneycriminallawyers.com.au/blog/new-zealand-customs-can-now-require-your-passwords/>
11. R Curley (2018) ‘New Zealand customs demands your mobile device passwords’, Business Traveller, 6 October 2018, <https://www.businesstraveller.com/business-travel/2018/10/06/new-zealand-customs-demands-your-mobile-device-passwords/>
12. S Hickey and U Nedim (2018) above n 10.
13. S Hickey and U Nedim (2018) ‘ABF and police powers to be further expanded’, Sydney Criminal Lawyers, 22 September 2018, <https://www.sydneycriminallawyers.com.au/blog/abf-and-police-powers-to-be-further-expanded/>
14. A Al Shouk (2018) ‘Dubai police to deploy robotic patrols’, Australasian Policing, Vol 10(1): 56.
15. T Page (2017) ‘The inevitable rise of the robocops’, CNN Business, 22 May 2017, <https://edition.cnn.com/2017/05/22/tech/robot-police-officer-future-dubai/index.html>
16. B Goertzel (2014) ‘Artificial general intelligence: concept, state of the art, and future prospects’, Journal of Artificial General Intelligence, Vol 5(1): 1, 41; P Asaro (2016) ‘“Hands up, don’t shoot!”: HRI and the automation of police use of force’, Journal of Human-Robot Interaction, Vol 5(3) (December 2016): 55.


PAGE 18

ACCESS ALL AREAS? TELECOMMUNICATIONS AND HUMAN RIGHTS IN PAPUA NEW GUINEA

SARAH LOGAN

MIRANDA FORSYTH

Sarah Logan is a Postdoctoral Research Fellow at UNSW Law and a member of the Allens Hub for Technology, Law and Innovation. She completed her PhD in International Relations at the Australian National University. Her current research focuses on the geopolitics of technology, especially the relationship between states and telecommunications companies.

Miranda Forsyth is Associate Professor in the School of Regulation and Global Governance at The Australian National University. She formerly lectured in criminal law at the University of the South Pacific. Her research focuses on the possibilities and challenges of the interoperation of state and non-state justice and regulatory systems, particularly in Melanesia.

A 2016 resolution by the UN Human Rights Council makes it clear that access to the internet, and the way it is used, can impact a wide range of human rights.1 Papua New Guinea’s (PNG) rapid embrace of mobile phone technology and the internet has made these multiple links between human rights and technology increasingly relevant and important.

PNG’s telecommunications landscape has shifted dramatically in recent years. Mobile penetration has increased from approximately 2% coverage in 2006 to over 50% today, and the share of the population accessing the internet has grown from only 2% to almost 10%.2 For a time, PNG had the fastest growing Facebook population in the Asia-Pacific region, and today almost 80% of the country’s internet-using population uses Facebook.3

INTERNET ACCESS ADVANCES ENJOYMENT OF MANY HUMAN RIGHTS

Access to the internet and mobile telephony (in PNG, as in many developing countries, mobile phones are the internet) is having profound consequences across a wide range of development-related fields. Innovative, small-scale development projects involving mobile phones are having real impact in diverse areas such as maternal health, microfinance and teacher education. These initiatives have taken place almost entirely outside the government, and are funded by donors, corporate actors and, in many cases, by Digicel, a telecommunications company specialising in developing markets. Digicel controls almost 90% of the mobile market in PNG and has taken on the mantle of expanding access, building mobile towers and developing other telecommunications infrastructure.

New technology is also an engine of civil rights expression and political participation. Citizens have been able to take matters of transparency and corruption into their own hands on a new scale. Anti-corruption blogs and Facebook groups are now an important part of PNG’s political culture for breaking stories and sharing news, and are holding government to account in important ways.

LIMITATIONS ON EXERCISE OF HUMAN RIGHTS CAN BE ENABLED THROUGH CONTROLLING ACCESS TO THE INTERNET

In recent years, the PNG government has actively sought to limit or shape citizens’ access to the internet and mobile phones, thus potentially interfering with the right to freedom of expression protected under Article 19 of the International Covenant on Civil and Political Rights. Article 19 specifies that it applies to “any media”, and the 2016 Human Rights Council resolution explicitly affirmed that this includes the internet. In 2016, the government introduced real-name SIM registration in an attempt to crack down on ‘anti-social behaviour’ and cybercrime. That same year, the government enacted the country’s first cybercrime law.


PAGE 19

The Cybercrime Code Act 2018 legislates against standard cybercrime practices but also covers content offences like harassment and defamation, leading to concerns that it could be used to quash dissent.4 Indeed, PNG bloggers have already been charged with defamation as a result of their posting about corruption.5 Furthermore, in May 2018, the PNG government threatened to cut off access to Facebook, which dominates most Papua New Guineans’ experience of the internet.6

Perhaps predictably for a state with limited capacity, PNG’s government has been unable to effectively enforce real-name SIM registration or to ban Facebook. In both cases the government would have had to enlist the support of Digicel, a company with which it has long been at odds. Enjoying an effective monopoly with its own infrastructure, Digicel is well placed to resist co-operating with the government’s demands to enforce real-name registration or to ban Facebook. The company has long operated largely outside effective government control – even initially operating despite having its licence revoked by the PNG government, and failing to respond fully to complaints of predatory pricing.7

DIGICEL’S business model involves targeting remote, underserved markets and investing heavily in infrastructure and community engagement, almost always competing against entrenched government interests in telecommunications markets. A country like PNG does not have the capacity to penalise a licence breach by a company like Digicel, especially when the company has engaged a previously underserved population. In PNG, for example, Digicel conducted its own census (by helicopter) to determine local populations, as there is no reliable census data in PNG, and then built its networks using helicopters in remote, inaccessible areas not served by government. The company secured 20,000 customers in its first three days of business, and attempts by the government to limit Digicel’s activities caused community uproar.

Photo: ©Richard Eves



CORPORATIONS AND THE PROTECTION OF HUMAN RIGHTS IN THE DIGITAL CONTEXT

In relation to the proposed Facebook ban and SIM registration, the monopoly exercised by Digicel worked to protect citizens’ rights to freedom of expression. However, the government’s limited control over Digicel raises real questions about what obstacles to freedom of expression – and other human rights impacted by the internet – may be generated by Digicel and other corporate providers in the future. In a country like PNG, where the state has neither the capacity nor the will to address infrastructure deficiencies, corporate interests have been able to insert themselves into citizens’ lives, for good or bad, introducing great change but also precariousness. Digicel has invested heavily in infrastructure, education facilities, sporting facilities and other valuable initiatives through the Digicel Foundation, but it ultimately has few considerations outside its own commercial interests. In the Caribbean, for example – another market dominated by Digicel – the company has come under fire for blocking Voice over Internet Protocol (VoIP) services, such as Skype, thereby limiting customers’ access to voice calls cheaper than its own.8 The company has also pulled Facebook’s Free Basics platform from its service in PNG and a range of other countries – likely not out of fear of Facebook’s influence in a developing state, but because the platform introduces customers in emerging markets to an ecosystem of apps outside Digicel’s control.9

FREE BASICS is a Facebook-developed mobile app that gives users access to a small selection of data-light websites and services. The websites are stripped of photos and videos and can be browsed without paying for mobile data.10

For all the improvements wrought by new technology, then, Papua New Guineans are at the mercy of a corporation over which their own government appears to have limited control. PNG’s experience raises important questions about how to ensure the enjoyment of human rights in the era of the internet, when corporations have so much autonomy in developing markets. The UN human rights framework binds states rather than corporate entities. This issue was noted by the former UN Special Rapporteur on Freedom of Expression, Frank La Rue, who highlighted the critical role of corporations in internet access and human rights, where relationships between states, citizens and the internet are underpinned by corporate intermediaries.11 His 2011 report noted, ‘While States are the duty-bearers for human rights, private actors and business enterprises also have a responsibility to respect human rights’. Sometimes, it seems, access to the internet is not enough in itself to guarantee a furthering of human rights. In PNG, this is certainly the case: the importance and impact of access is undeniable, but the sustainability of that access currently depends on a corporation rather than the state.

1. The resolution affirms, in paragraph 1, that ‘the same rights that people have offline must also be protected online, in particular freedom of expression, which is applicable regardless of frontiers and through any media of one’s choice, in accordance with articles 19 of the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights’: A/HRC/32/L.20.
2. World Bank, ‘Mobile cellular subscriptions (per 100 people)’, <https://data.worldbank.org/indicator/IT.CEL.SETS.P2?locations=PG>
3. StatCounter, ‘Social media stats Papua New Guinea’, <http://gs.statcounter.com/social-media-stats/all/papua-new-guinea>
4. PNG is not alone in these endeavours. Real-name SIM registration policies exist in over 90 countries globally, and cybercrime laws have long been part of government responses to legitimate problems of online-enabled crime. See <https://www.lowyinstitute.org/the-interpreter/developing-png-s-cybercrime-policy-local-contexts-global-best-practice> for a discussion of PNG’s cybercrime legislation in a global context, and Global System for Mobile Communications (2016) Mandatory Registration of SIM Cards, <https://www.gsma.com/publicpolicy/wp-content/uploads/2016/04/Mandatory-SIM-Registration.pdf>
5. S Armbruster (2017) ‘PNG election boss Gamato gags name-calling anti-corruption activist’, SBS News, 13 July 2017, <https://www.sbs.com.au/news/png-election-boss-gamato-gags-name-calling-anti-corruption-activist>
6. L Matsakis (2018) ‘PNG Wants to Ban Facebook. It Shouldn’t’, Wired, 30 May 2018, <https://www.wired.com/story/papua-new-guinea-facebook-ban/>
7. S Marchall (2007) ‘PNG mobile company to continue operation’, ABC News, 25 July 2007, <http://www.abc.net.au/news/2007-07-25/png-mobile-company-to-continue-operation/2513228>
8. G Best (2014) ‘Industry experts agree: regulators must protect Caribbean mobile subscribers’, Sightline (blog), 13 July 2014, <https://www.gerardbest.com/sightline/2014/07/13/industry-experts-agree-regulators-must-protect-caribbean-mobile-subscribers>
9. J Oriel (2018) ‘What happened to Facebook’s grand plan to wire the world?’, Wired, 17 May 2018, <https://www.wired.com/story/what-happened-to-facebooks-grand-plan-to-wire-the-world/>
10. S West & E R Biddle (2017) ‘Facebook’s Free Basics doesn’t connect you to the global internet – but it does collect your data’, Global Voices Advox, 27 July 2017, <https://advox.globalvoices.org/2017/07/27/facebooks-free-basics-doesnt-connect-you-to-the-global-internet-but-it-does-collect-your-data/>
11. Report of the Special Rapporteur on the Right to Freedom of Opinion and Expression, A/HRC/17/27.



UNDERSTANDING AADHAAR: THE UNIQUE IDENTIFICATION AUTHORITY OF INDIA AND ITS CHALLENGES

SMRITI SINGH Smriti Singh works as the Manager – Media and Advocacy at Amnesty International India. Her major work has been in the areas of policy research, policy advocacy, women’s rights, and media communications. Before joining the development sector, Smriti worked as a legal journalist for India’s leading national dailies for eight years. A Commonwealth Scholar, Smriti holds an LLM in International Development Law and Human Rights from the University of Warwick in the United Kingdom. She completed her LLB degree and BA (Hons.) in Journalism at Delhi University.

In March this year, Twitter was abuzz with memes about the Aadhaar Card in India. At one point #AadhaarMemes was trending on Indian Twitter. While some drew parallels with the government’s decision on demonetisation, others called it ‘Schrodinger’s Aadhaar’ – being both optional and mandatory at the same time. The reason was a government decision making the Aadhaar Card a mandatory identification document for banking, phone and passport services. As March also marked the end of the financial year, panic and confusion were rife. It was in the midst of this confusion that the Supreme Court stepped in and stayed the government’s order. The case before the Supreme Court challenged the constitutional validity of the Aadhaar scheme on the grounds that it would violate the right to privacy. The Supreme Court stated that until it delivers a final verdict in the case, Aadhaar Cards would not be mandatory for banking, phone and passport services. Since its inception, the Indian government’s biometric identification project, also known as the Aadhaar scheme, has been marred by controversy. Beyond the right to privacy, concerns have also been raised about the impact on personal security and how to enforce government accountability in cases of breach.

However, in order to understand the issue, it is pertinent to take a few steps back and understand what Aadhaar is all about.

AADHAAR: HOW IT ALL BEGAN

Aadhaar is a unique 12-digit number issued to each resident by the Unique Identification Authority of India (UIDAI), based on biometric details such as iris scans and fingerprints, and demographic information such as date of birth and address. At present, Aadhaar’s database holds the records of over 1.12 billion registered users and is rapidly becoming the government’s primary database for public welfare and citizen services schemes. The idea of a unique identification was first conceived following the Kargil War in 1999, when an inquiry commission known as the Kargil Review Committee was set up by the then-National Democratic Alliance-led Indian government to analyse perceived Indian intelligence failures during the war. The Kargil Review Committee report strongly recommended the use of a national identity



Image: ©tlorna/Shutterstock

card, particularly for those living in the border areas. The report was endorsed by a group of ministers in 2001, which led to an announcement in 2003 that a “Multipurpose National Identity Card” would be issued to each citizen. The main motive for expanding the scheme beyond border areas was to ensure the welfare of citizens by facilitating access to various government schemes via a single identification document.1 Six years later, in 2009, under the United Progressive Alliance government, the UIDAI was formally launched and tasked with documenting every Indian resident. After years of debates and deliberations, the Aadhaar Act finally came into effect on 11 March 2016.
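The record structure described at the start of this section is, at its core, a keyed biometric file. As a rough illustration only, the sketch below models the fields the article lists; the class name, field names and validation rule are our own assumptions, not UIDAI’s actual schema.

```python
from dataclasses import dataclass

@dataclass
class AadhaarRecord:
    """Illustrative sketch of the record described above: a unique
    12-digit number tied to biometric and demographic data.
    Field names are assumptions, not UIDAI's actual schema."""
    number: str            # the unique 12-digit identifier
    fingerprints: bytes    # biometric template (placeholder)
    iris_scans: bytes      # biometric template (placeholder)
    date_of_birth: str     # demographic information
    address: str

    def __post_init__(self):
        # The article describes Aadhaar numbers as 12-digit numbers.
        if len(self.number) != 12 or not self.number.isdigit():
            raise ValueError("Aadhaar number must be 12 digits")
```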

In recent years, the debate around the Aadhaar scheme has intensified as issues like surveillance, data leaks, and violations of privacy and civil liberties come to the fore. As the Supreme Court inches closer to delivering a final verdict in the case, it is pertinent to delve further into the question of whether the government is well equipped to handle something like the Aadhaar database.


SHOULD AADHAAR BE MANDATORY?

Despite being touted as a tool to enhance the delivery of welfare benefits and services and as ‘the missing link’ in the inclusive economy, Aadhaar has done little to make access to essential services easier for marginalised communities. There have been reports that shops charged with providing subsidised food grains to people living in poverty, as part of the government’s public distribution system, have denied supplies to eligible families: either because they did not have an Aadhaar number, because they had not linked it to their ration cards to confirm their eligibility, or because authentication of their biometrics, such as fingerprints, failed.2 Human rights groups and media have reported cases where people starved to death as a result.3 Poor internet connectivity, machine malfunction, and the worn-out fingerprints of older people and manual labourers illustrate some of the problems with biometric authentication. In some cases, people living with HIV/AIDS have decided to stop seeking medical treatment or medication because they fear their identities will be disclosed when they are forced to submit Aadhaar numbers to obtain healthcare benefits.4
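The exclusion problem described above is straightforward to see in code. Below is a minimal sketch of our own construction (a hypothetical ration-shop check; none of these names come from a real system): every failure mode the article lists (missing number, unlinked ration card, poor connectivity, an unreadable fingerprint) collapses into the same outcome, denial, with no human fallback.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Beneficiary:
    aadhaar_number: Optional[str]   # None if never enrolled
    linked_to_ration_card: bool     # number seeded to the ration card?
    fingerprint_matches: bool       # worn prints often fail to match

def dispense_rations(person: Beneficiary, network_up: bool) -> bool:
    """Hypothetical strict biometric gate: any failure denies supplies."""
    if not network_up:                     # poor internet at the shop
        return False
    if person.aadhaar_number is None:      # no Aadhaar number at all
        return False
    if not person.linked_to_ration_card:   # card not linked to the number
        return False
    # Older people and manual labourers with worn fingerprints fail
    # here, and the gate provides no manual override.
    return person.fingerprint_matches

# An eligible family refused because a fingerprint will not scan:
print(dispense_rations(Beneficiary("123412341234", True, False), True))  # False
```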



Many persons with disabilities have also been denied benefits because they were unable to obtain Aadhaar numbers.5 Making an Aadhaar Card a prerequisite to accessing essential services and benefits can obstruct several constitutional rights, including the rights to food, health care, education and social security.

RIGHT TO PRIVACY

On 3 January 2018, The Tribune, an English daily, published an investigative report exposing the vulnerability of the Aadhaar database to security breaches. The report suggested that anonymous sellers were offering a service over WhatsApp providing unrestricted access to details of the more than one billion Aadhaar numbers registered in India at the time.6 A month later, India’s Ministry of Finance acknowledged in the upper house of Parliament that state-owned banks had reported incidents of money being fraudulently withdrawn from bank accounts using customers’ Aadhaar numbers.7 This was followed by another incident in which the government of Andhra Pradesh published over 130,000 Aadhaar numbers, along with demographic and some financial details. This was the result of confusion over Aadhaar protocol and an attempt to increase governmental transparency by disclosing the information of public servants. The details were removed only after reports in the local media.8

People applying to get the Aadhaar Card in Auroville, India, September 2017. Photo: ©Marco Saroldi/ Shutterstock

In August 2017, the Supreme Court stated that the right to privacy was part of the constitutional rights to life and personal liberty, in response to government arguments in Aadhaar-related petitions that privacy was not a fundamental right.9 The right to privacy is also protected under the International Covenant on Civil and Political Rights (ICCPR), to which India is party. The government claims to have issued over one billion Aadhaar numbers to residents of India, not only citizens, making it one of the biggest biometric databases in the world. The government’s push for mandatory enrolment and its efforts to link the Aadhaar number to a wide range of services raise grave concerns that it could disproportionately interfere with the right to privacy of millions of people. Aadhaar could be used as a mass surveillance technology, and it has great potential for abuse if the unique number is linked to data sets that previously existed in individual silos. Unified in this way, a person’s information can reveal their food habits, language, health, bank details, political affiliation and more. This prompts fears of increased state surveillance, with the convergence of various databases making it easier for the government to track all information about specific individuals and to target dissent. These fears are heightened by the absence of privacy and data protection laws in India, and by the lack of adequate judicial or parliamentary oversight of the activities of intelligence agencies.
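The ‘convergence’ the author warns about is, technically, just a database join: once every silo carries the same unique key, assembling a dossier takes a few lines of code. A minimal sketch with invented records follows; the silo names and fields are ours, purely for illustration.

```python
# Three previously separate silos, all keyed by the same number.
ration_db = {"123412341234": {"food_purchases": ["rice", "wheat"]}}
health_db = {"123412341234": {"conditions": ["HIV treatment"]}}
bank_db   = {"123412341234": {"balance_inr": 15_000}}

def profile(aadhaar_number: str) -> dict:
    """Join every silo on the shared identifier into one dossier."""
    merged: dict = {}
    for silo in (ration_db, health_db, bank_db):
        merged.update(silo.get(aadhaar_number, {}))
    return merged

print(profile("123412341234"))
# {'food_purchases': ['rice', 'wheat'], 'conditions': ['HIV treatment'],
#  'balance_inr': 15000}
```

Without a shared key, linking such records would require error-prone matching on names and addresses; with one, the join is trivial, which is precisely why linkage across services magnifies the surveillance risk.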



TRANSPARENCY AND ACCOUNTABILITY

Certain provisions of the Aadhaar Act and subsequent regulations also raise concerns about transparency and accountability. The legislation prevents anyone other than the UIDAI from approaching the courts in the case of a breach or violation of the law. It also fails to establish an adequate or effective grievance system, contrary to the ICCPR requirement that countries ensure that anyone whose rights or freedoms are violated has access to an effective remedy. Aadhaar regulations allow the government to deactivate an Aadhaar number for various reasons, including “any other case requiring deactivation as deemed appropriate” by the UIDAI, leaving this broad wording open to misuse.10 In addition, the government is not required to give any prior notice before deactivating an Aadhaar number, which could violate principles of natural justice and put access to essential services at risk. Between 2010 and 2016, the government deactivated 8.5 million Aadhaar numbers, citing only the grounds provided under the law.11 Aadhaar does not allow anyone enrolled to opt out or withdraw. Nor do Aadhaar regulations require the authorities to inform an Aadhaar number holder if their information has been shared or used without their knowledge or consent.

MOVING FORWARD

There is no denying that the creation of Aadhaar, a central identity database, will have repercussions for the traditional relationship between the state and its subjects. While the impending Supreme Court verdict will decisively impact the

future of governance, it remains to be seen whether the government will make the changes required to lessen the adverse impacts of a technology whose potential is yet to be tapped. As discussed earlier, there have been documented lapses in the protection of Aadhaar data. Moreover, the threat of mass surveillance looms large. In the light of a citizen’s fundamental right to privacy, how will the government ensure every person’s right to bodily integrity? Civil society in India rejoiced when, in 2017, the Supreme Court in a landmark judgment called privacy a fundamental right. But how this right will be read into the Aadhaar judgment is yet to be seen. Activists in India have fought relentlessly against the “intrusive” nature of the Aadhaar law and have exposed many loopholes that wrongly exclude people from beneficiary lists.12 If the government really wants Aadhaar to be the future, it must first acknowledge that there are flaws in the functioning of the Aadhaar scheme that open the door to misuse and rights violations. At the very least, the government should order independent investigations into the concerns raised, in an effort to address them.

Editor’s note: On 26 September 2018, the Supreme Court of India upheld the constitutional validity of Aadhaar.13 According to the verdict, it is no longer mandatory for people to link their bank accounts and mobile numbers with Aadhaar; however, it remains mandatory for filing income tax returns. The Court also observed that no person’s rights can be denied on the grounds that they do not possess an Aadhaar card.

1. K Roy (2017) ‘Analysing Aadhaar through the prism of national security’, Institute for Defence Studies and Analyses, 22 June 2017, <https://idsa.in/idsacomments/analysing-aadhaar-through-the-prism-of-national-security_kroy_220617>
2. J Drèze (2018) ‘Following the grain trail: on India’s public distribution system’, The Hindu, 17 January 2018, <https://www.thehindu.com/opinion/lead/following-the-grain-trail/article22451645.ece>
3. A Johari (2017) ‘Denied food because she did not have an Aadhaar-linked ration card, Jharkhand girl dies of starvation’, Scroll.in, 16 October 2017, <https://scroll.in/article/854225/denied-food-because-she-did-not-have-aadhaar-linked-ration-card-jharkhand-girl-dies-of-starvation>
4. S Tomar (2017) ‘Linking benefits for AIDS patients to Aadhaar triggers privacy concerns’, Hindustan Times, 3 April 2017, <https://www.hindustantimes.com/bhopal/linking-benefits-for-aids-patients-to-aadhaar-triggers-privacy-concerns/story-iR6HB8RmqPDaNwkX2Oj5EJ.html>
5. S Azad (2017) ‘53,000 lose pension for lack of Aadhaar’, The Times of India, 17 December 2017, <https://timesofindia.indiatimes.com/india/53000-lose-pension-for-lack-of-aadhaar/articleshow/62101189.cms>
6. R Khaira (2018) ‘Rs 500, 10 minutes, and you have access to billion Aadhaar details’, The Tribune, 3 January 2018, <https://www.tribuneindia.com/news/nation/rs-500-10-minutes-and-you-have-access-to-billion-aadhaar-details/523361.html>
7. U Trivedi (2018) ‘Aadhaar: world’s largest ID database exposed by India government errors’, The Economic Times, 15 May 2018, <https://economictimes.indiatimes.com/news/economy/policy/worlds-largest-id-database-exposed-by-india-government-errors/articleshow/64169884.cms>
8. N Bhaskaran (2018) ‘1.3 lakh Aadhaar numbers leaked from Andhra govt website, linked to personal details’, The News Minute, 25 April 2018, <https://www.thenewsminute.com/article/13-lakh-aadhaar-numbers-leaked-andhra-govt-website-linked-personal-details-80178>
9. M Ganguly (2017) ‘India’s Supreme Court upholds right to privacy’, Human Rights Watch, 24 August 2017, <https://www.hrw.org/news/2017/08/24/indias-supreme-court-upholds-right-privacy>
10. Aadhaar (Enrolment and Update) Regulations, 2016 (No. 2 of 2016), s 28(1)(f), accessed via <https://uidai.gov.in/images/regulation_1_to_5_15092016.pdf>, p. 26.
11. ‘Over 81 lakh Aadhaar numbers deactivated: here’s how to check if yours is still active’, The Indian Express, 11 September 2017, <https://indianexpress.com/article/how-to/how-to-check-if-your-aadhaar-card-is-valid-4801239/>
12. C Chauhan (2017) ‘It is victory of a common citizen, says civil society on SC’s privacy ruling’, Hindustan Times, 24 August 2017, <https://www.hindustantimes.com/india-news/it-is-victory-of-a-common-citizen-says-civil-society-on-sc-s-privacy-ruling/story-yRBFxpmn1zeTmW44HW5FRM.html>
13. Z Saberin (2018) ‘India’s top court upholds constitution (sic) validity of Aadhaar card’, Al Jazeera, 26 September 2018, <https://www.aljazeera.com/news/2018/09/india-top-court-upholds-constitution-validity-aadhaar-card-180926054727305.html>



HUMAN RIGHTS DROWNING IN THE DATA POOL: IDENTITY-MATCHING AND AUTOMATED DECISION-MAKING IN AUSTRALIA

KERRY WESTE AND TAMSIN CLARKE Kerry Weste and Dr Tamsin Clarke are respectively President and Freedoms Committee Chair of Australian Lawyers for Human Rights (ALHR).

Technologies that use biometric data, such as facial or fingerprint recognition, have worked their way into our everyday lives quite seamlessly. Theoretically, it is safer to unlock your iPhone with your fingerprint than with a password, which is always vulnerable. Identity-matching systems that gather up not only images from our passports and drivers’ licences but also biographic details such as our names, dates of birth and gender appear to have become both an innocuous norm and, according to government, a necessity in combating threats such as immigration breaches, identity crime and terrorism.1 The argument runs that, unless we are ourselves criminals, we need not worry about the privacy and data security implications of this technology. However, there is no getting away from the fact that the generalised monitoring increasingly proposed by the Federal Government will have an unprecedented impact upon Australians’ civil liberties, and that biometric technologies raise very real human rights concerns.2 As Hildebrandt says, ‘advanced profiling technologies answer questions we did not raise. They generate knowledge we did not anticipate, but are eager to apply. As knowledge is power, profiling changes the power relationships between the profilers and the profiled.’3

How can we protect our right to privacy in the face of government control of aggregated data drawn from an expanding pool of sources that reveal our personal attributes, behaviour, location and contacts? How do we avoid discriminatory outcomes where governments take that data and use opaque automated decision-making technology in ways that may significantly affect someone’s life? Are we becoming a police state with identity-matching facilities in every public toilet – as seems to be the trend in China4 – no doubt at enormous public cost? The human right to privacy is enshrined in Article 17 of the International Covenant on Civil and Political Rights (ICCPR), which specifies that ‘no one shall be subjected to arbitrary or unlawful interference with his (sic) privacy, family, home or correspondence.’ This right may be limited only by measures that are necessary and proportionate to a legitimate aim and that are accompanied by safeguards and oversight. But there are two problems in Australia: firstly, the human right to privacy in the ICCPR is not fully enshrined in law; and secondly, in our view, the current legislative approach does not impose only proportionate restrictions on that right.



WHAT IS HAPPENING IN AUSTRALIA?

Earlier this year the Federal Government introduced two bills with potentially far-reaching human rights implications for all Australians. The Identity-Matching Services Bill 2018 (IMS Bill) and the Australian Passports Amendment (Identity-Matching Services) Bill 2018 (AP (IMS) Bill) are aimed at providing identity-matching services to government agencies through the new ‘interoperability hub’ managed by the mega-Home Affairs Department.
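Architecturally, a hub of this kind is a broker: one agency submits a query, and the hub fans it out across other agencies’ databases and pools the results, so no single centralised database is needed for matching to occur. The sketch below is our own speculative rendering of that general pattern, not the actual Home Affairs design; every name in it is invented.

```python
from typing import Callable, Dict, List, Tuple

class InteroperabilityHub:
    """Speculative sketch of a query broker between agency databases.
    Nothing here reflects the real Home Affairs system."""

    def __init__(self) -> None:
        self.sources: Dict[str, Callable] = {}   # agency -> lookup function

    def register(self, agency: str, lookup: Callable) -> None:
        self.sources[agency] = lookup

    def match(self, face_template: str) -> List[Tuple[str, str]]:
        """Fan one query out to every registered database, pool the hits."""
        hits = []
        for agency, lookup in self.sources.items():
            hits += [(agency, record) for record in lookup(face_template)]
        return hits

hub = InteroperabilityHub()
hub.register("passports", lambda face: ["P-001"] if face == "f1" else [])
hub.register("licences", lambda face: ["L-077"] if face == "f1" else [])
print(hub.match("f1"))  # [('passports', 'P-001'), ('licences', 'L-077')]
```

The design choice matters for rights analysis: a broker quietly links databases that were collected for different purposes, which is exactly the repurposing concern raised below.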

Despite references in the IMS Bill to the application of the Australian Privacy Principles, the Bill raises grave misgivings about the purposes for which identity-matching can be used, who can access the information, how agencies will keep the identity information they hold about individuals secure, and whether ‘consent’ from the individuals involved will effectively be obtained by coercion. Individuals should know the reason their personal information is collected, and the information should be used only for that purpose. This basic protection, as laid out in the Australian Privacy Principles, is not honoured by the Bills: the IMS Bill specifically allows data obtained for one purpose to be used for other, unspecified purposes. The information may also be shared with other countries. This ability to repurpose data represents a complete failure of transparency in the data-matching process. The AP (IMS) Bill’s own explanatory memorandum acknowledges potential breaches of several of Australia’s obligations under the ICCPR,5 including the right to privacy in Article 17, the right to liberty and security of the person in Article 9, and the right to freedom of expression in Article 19. Australians’ rights to peaceful assembly and equality before the law are also likely to be affected. These Bills give rise to serious concerns that individuals’ privacy rights are being given away by government in the name of security. At the same time, a door is being left open for those same privacy infringements to be ‘monetised’ for commercial purposes. The inconsistent objectives of these Bills – national security and commercial profit – call into question the intentions of the Bills’ supporters and the constitutional basis of the legislation. If the Bills are intended as a proportionate and reasonable response to national security concerns (as to which the Commonwealth clearly has constitutional power), how is it that the Bills also contemplate the use of data for commercial purposes by third parties?6 Surely this inconsistency of purpose undermines the purported constitutional basis of the Bills?

AUTOMATED DECISION-MAKING

In Australia, the ability of Ministers to delegate decisions to computer programmes is already well entrenched in legislation.7 There has been little public scrutiny of this development.


The AP (IMS) Bill allows Ministerial decisions to be made by computer programmes for any purpose for which the Minister may or must make a decision, exercise any power, comply with any obligation, or do anything else related to making a decision, exercising a power or complying with an obligation under the Australian Passports Act 2005.8 Such programmes can use the profiles obtained through identity-matching systems as the basis of automated decision-making. In the European context, the organisation Privacy International has questioned whether decisions informed by identity-matching technology can be non-discriminatory, fair and legal, and has asked how we are to know these things if the process itself is opaque.9 In Australia, best-practice principles are set out in the Commonwealth’s Automated Assistance in Administrative Decision-making: Better Practice Guide. The guide states that automated systems that decide – as opposed to helping a decision-maker make a decision – would generally be suitable only for decisions involving non-discretionary elements. It cautions that automated systems should not automate the exercise of discretion; where they are used as an administrative tool to assist an officer in exercising his or her discretion, they should be designed so that they do not fetter the decision-maker by recommending or guiding him or her to a particular outcome.10 It appears that the AP (IMS) Bill empowers the Minister not just to use a computer programme to assist in making a decision, but also to leave a decision entirely to the programme. The latter arguably amounts to an unlawful attempt to delegate a discretion: surely a discretion can be delegated only to a human being who is capable of giving real and genuine consideration to an issue. Changing the way in which we are governed through the automation of governmental decisions raises fundamental human rights concerns, for example where computer programmes are used to assist in decisions not just on day-to-day matters but on crucial issues such as risk assessments of refugee claims.
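The Better Practice Guide’s distinction is easy to render in code. In this sketch of our own (illustrative rules and figures only, not drawn from the Act), the first function automates a purely non-discretionary rule, which the Guide treats as generally suitable; the second hands a discretionary judgment entirely to the machine, the very delegation the authors argue may be unlawful.

```python
def passport_fee(pages: int) -> int:
    """Non-discretionary rule: a fixed fee schedule involves no
    judgment, so full automation is uncontroversial.
    (Illustrative figures only.)"""
    return 300 if pages <= 34 else 450

def refuse_passport(risk_score: float) -> bool:
    """A discretionary refusal left wholly to the machine: a threshold
    chosen by a programmer replaces the 'real and genuine
    consideration' a human decision-maker owes the applicant."""
    return risk_score > 0.7
```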



Is a computer capable of adding extrinsic facts to moderate the information it receives? Can it take account of human rights, community values and expectations, considerations of fairness, or common sense? The reality is that computer programmes and algorithms are not necessarily neutral and will reflect the intrinsic social biases of their programmers. ‘Algorithmic bias,’ it is noted, ‘is now a widely studied problem that refers to how human biases creep into the decisions made by computers. The problem has led to gendered language translations, biased criminal sentencing recommendations, and racially skewed facial recognition systems.’11 Studies in the US have already found evidence of bias in online advertising, recruiting, facial recognition, bail and sentencing decisions, and law enforcement decision-making, all driven by purportedly neutral algorithms.12 Edwards and Veale observe that algorithmic systems trained on past biased data, which introduce correlations based on race, religion, gender, sexuality or disability without careful consideration, are inherently likely to recreate or even exacerbate the discrimination seen in past decision-making.13 The risk of incorrect, unfair or arbitrary decisions is therefore very real. Computer programmes are not generally coded by people who understand how to interpret the relevant laws, and are unlikely to incorporate the common law presumptions, case law and statutory interpretation principles that underlie the correct application of

legislation. But even if the programmers had the expertise of judges and were required to create a ‘paper trail’ showing how all these elements are taken into account in their programme, such a paper trail might still not be comprehensible, nor the programme perfect. Given that ‘not even the people who write algorithms really know how they work’, it is effectively impossible to render an algorithmic process completely transparent.14

WHERE DOES THAT LEAVE US?

It is clear that what the Australian Government is doing with identity-matching, facial recognition and automated decision-making does not stack up against the tests of necessity, proportionality, adequate safeguards and oversight – with consequent dangers for us all. It is also doubtful whether the move towards general surveillance will be effective in terms of national security, prevention of identity fraud and the other benefits claimed, not least because general surveillance necessarily increases the size of the ‘haystack’ of data to be analysed. For Australians, in stark contrast to citizens in comparable countries across the Western world, these dangers are further amplified by the glaring absence of a federal Human Rights Act and our consequent inability to enforce our human rights domestically.

This article draws upon recent submissions by ALHR on the Bills referred to.

1. Identity-matching Services Bill 2018, Explanatory Memorandum, pars 8, 17, 50 and ff, 78 and 79, 122.
2. K Middleton (2018) ‘New domestic intelligence powers’, The Saturday Paper, 22-28 September 2018, <https://www.thesaturdaypaper.com.au/news/law-crime/2018/09/22/new-domestic-intelligence-powers/15375384006887>
3. M Hildebrandt (2008) ‘Profiling and the rule of law’, Identity in the Information Society, Vol 1, Issue 1, pp. 55-70, <https://link.springer.com/article/10.1007/s12394-008-0003-1>
4. Agence France-Presse in Beijing (2017) ‘From ale to jail: facial recognition catches criminals at China beer festival’, The Guardian Online, 1 September 2017, <https://www.theguardian.com/world/2017/sep/01/facial-recognition-china-beer-festival>
5. International Covenant on Civil and Political Rights, 999 UNTS 171 (entered into force 23 March 1976), <http://www.ohchr.org/EN/ProfessionalInterest/Pages/CCPR.aspx>
6. IMS Bill, s 10; E Thomas (2017) ‘Coalition could allow firms to buy access to facial recognition data’, The Guardian Online, 26 November 2017, <https://www.theguardian.com/technology/2017/nov/26/government-could-allow-firms-to-buy-access-to-facial-recognition-data>
7. The Honourable Justice Melissa Perry (2014) ‘iDecide: the Legal Implications of Automated Decision-making’, Cambridge Centre for Public Law Conference 2014: Process and Substance in Public Law, 15-17 September 2014, <http://www.fedcourt.gov.au/digital-law-library/judges-speeches/justice-perry/perry-j-20140915>
8. AP (IMS) Bill, s 56A.
9. Privacy International (2018) ‘Data is Power: Profiling and Automated Decision-Making in GDPR’, 9 April 2018, p. 2, <https://privacyinternational.org/report/1718/data-power-profiling-and-automated-decision-making-gdpr>
10. A Le Sueur (2016) ‘Robot Government: Automated Decision-Making and its Implications for Parliament’, draft chapter in A Horne and A Le Sueur (eds), Parliament: Legislation and Accountability (Oxford, Hart Publishing, 2016), 1 at 14, <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2668201>
11. J Arvanitakis and A Francis (2018) ‘Data ethics is more than just what we do with data, it’s also about who’s doing it’, The Conversation, 22 June 2018, <https://theconversation.com/data-ethics-is-more-than-just-what-we-do-with-data-its-also-about-whos-doing-it-98010>
12. N Byrnes (2016) ‘Why We Should Expect Algorithms to Be Biased’, MIT Technology Review, 24 June 2016, <https://www.technologyreview.com/s/601775/why-we-should-expect-algorithms-to-be-biased/>; D Cossins (2018) ‘Discriminating algorithms: 5 times AI showed prejudice’, New Scientist, 27 April 2018, <https://www.newscientist.com/article/2166207-discriminating-algorithms-5-times-ai-showed-prejudice/>
13. L Edwards and M Veale (2017) ‘Slave to the Algorithm? Why a right to an explanation is probably not the remedy you are looking for’, Duke Law and Technology Review, Vol. 16, No. 1, p. 28, <https://scholarship.law.duke.edu/cgi/viewcontent.cgi?article=1315&context=dltr>
14. A LaFrance (2015) ‘Not even the people who write algorithms really know how they work’, The Atlantic, 18 September 2015, <https://www.theatlantic.com/technology/archive/2015/09/not-even-the-people-who-write-algorithms-really-know-how-they-work/406099/>


INNOVATE RIGHTS 2019: FIRST SPEAKERS ANNOUNCED

DAVID COOKE

JULIA GILLARD

ANDY HALL

Former Australian prime minister and Chair of the Global Institute for Women’s Leadership at King’s College London, Julia Gillard, will be a special guest speaker at the first Innovate Rights conference, along with Dr David Cooke (Konica Minolta) and Andy Hall (migrant worker rights specialist). Across this three-day event, we ask: what are the risks and opportunities for business in engaging with human rights issues? And what can businesses do in practice to respect the rights of workers, customers and communities? We begin with a specialised academic workshop on 14 MAY 2019. The conference on 15-16 MAY 2019 considers human rights in a changing political environment. Innovate Rights is a chance to share ideas across sectors and consider new ways to work together on global challenges.

TO REGISTER YOUR INTEREST AND RECEIVE UPDATES, PLEASE SIGN UP humanrights.unsw.edu.au/innovate-rights

Australian Human Rights Institute


