
Talking privacy


By Nicole Stephensen, Privacy Maven and Partner at IIS Partners

I read a wonderful book a couple of years ago. It has impacted my work immensely, leading to frank and fearless discussions, moments of clarity around responsible stewardship of data (the personal stuff, the stuff about you and me) and innovative and elegant development of privacy-enhancing features in policy and technology. Yet it has nothing to do with privacy. Nothing and everything, apparently.

I’m talking about The Art of Gathering: How we meet and why it matters by Priya Parker. Her premise is that getting together at a conference, in a boardroom, at a café, over Zoom, over Teams or even with a quick phone call has meaning and can be a powerful experience if we go about such activities the right way.

Just days after finishing the book I had the opportunity to meet Parker at a leadership retreat for privacy professionals and experience firsthand her approach to gathering. Her message was simple but transformative: “We rely too much on routine and the conventions of gatherings when we should focus on distinctiveness and the people involved.”

The nature of my work has changed over the years. There was a time when erroneously sending medical records by fax to the local convenience store instead of the local hospital was an all-too-frequent privacy breach. Email was not a common form of almost real-time communication, and digitisation (of work, life, banking, socialising) was still a twinkle in the eyes of technologists. Fast forward to today and the focus of digitisation has moved beyond communication technologies to managed service provision, governance, the Internet of Things, all things social, insights and trends. All these applications of digital technology have one thing in common: data.

Following the merger of my boutique consultancy, Ground Up Consulting, with privacy consultancy IIS Partners in April 2022, my work continues to focus on the intersection of privacy and technology, where information security considerations are a huge part of the privacy discussion, and where both disciplines need a seat at the table to solve today’s wicked privacy problems. When we meet at that table we get the chance to hear each other and understand we share common purposes: to promote good decision making and prevent harm.

Now, back to that book. I see three opportunities to acknowledge the distinctive nature of the privacy discipline and its significance, straddling as it does information security, data governance and risk in our organisations (and the people at the heart of them all). These opportunities are: to avoid conflating privacy with security; to learn to understand the risk landscape; and to use the correct terms for the stuff that matters.

AVOID CONFLATING PRIVACY WITH SECURITY

It is important to answer both the privacy and the security questions that arise from the various technologies, programs, projects and initiatives into which we have professional visibility.

When people representing our cities, companies, not-for-profits, innovators, vendors and platforms start talking ‘data’, I am often brought into these discussions (lamentably, often after a project is already well underway, but I will save the exploration of Privacy by Design for another article). By the time I take my seat at the table, data is likely to be the starting point for the conversation. What do we do with the data? How can we derive value from the data? How can we add more data to the data?

Where the data is about a person or a group of people, my job is to ask, “What about privacy?” This is where it is vital the people being asked the question truly understand the role of a privacy consultant and do not misunderstand the question. When I ask, “What about privacy?” those at the table often hear “What about security?” The latter is a good question for security folks. How do we protect the data? How do we maintain its confidentiality, integrity and availability? But I am not asking those questions.

I am not asking about processes or controls or about building a big fence, physical or digital, around what we want to protect (ie, the data or the systems and other infrastructure underpinning it). I am asking about purpose specification (what do we want from the data?), necessity (do we need all the data?) and proportionality (does the benefit of having and using the data outweigh the privacy risk?).

I am asking how we intend to collect and manage personal information, the kind of data I am most concerned about, in accordance with the law and with community expectations.

When we conflate privacy with security, two things can happen: we end up focusing on securing the data, as if it and the infrastructure underpinning it are what we most need to protect or worry about; and we lose sight of our primary objective, the fair and transparent handling of personal information pertaining to the community we serve.

LEARN TO UNDERSTAND THE RISK LANDSCAPE

Organisational risks include (but certainly are not limited to) poor information practice, compromised integrity of data or systems and non-compliance with the law. These give rise to outcomes such as regulatory scrutiny, penalties, cancelled contracts and brand damage. The lens through which organisational risks are viewed by many security professionals is often protective and inward-looking: it is focused on avoiding negative outcomes for the organisation.

For privacy professionals, protecting the organisation from harm is a secondary motivator. Our primary aim is the prevention, reduction or elimination of organisational risks that are also privacy risks and where the outcome is harm to a person or group. For anyone unsure what privacy harm looks like, it is worth checking out Dr Dan Solove’s taxonomy on the topic. This identifies multiple harms across four broad categories: information collection, information processing, information dissemination and invasion (Enterprivacy offers a great high-level visual of this taxonomy).

Privacy risk, when viewed as “something that would cause real or perceived harm to a person,” becomes an outward-looking conversation focused on how organisational decisions impact the community we serve.

USE THE CORRECT TERMS

To be seen as an authority in privacy it is important to use terms that are recognised or defined in law. To do otherwise risks confusing the discussion and losing credibility amongst peers.

Take the term ‘personally identifiable information’ (PII) for example. This term is found in some key infosec frameworks, guidance and best practice documents such as those published by the US National Institute of Standards and Technology (NIST). However, it is not a generally recognised privacy term and is frequently used erroneously. Security vendors, managed service providers, auditors, recruiters and industry specialists should avoid using the term PII to describe information that identifies, or could lead to identification of, a person. Here in Australia, our Privacy Act 1988 and relevant state and territory privacy laws use the term ‘personal information’. New Zealand, Canada, Japan and China also use this term. Where security professionals are operating in the European Economic Area, Singapore or Brazil, the term ‘personal data’ should be used.

THE NEXT CHAPTER

The preoccupation of organisations and governments with data continues to grow alongside increasing digitisation, and with it the importance of privacy awareness across disciplines, particularly where there are shared interests such as information security. Empowering the colleagues with whom we share experiences (and professional obligations) will ensure we are able to meet their expectations in years to come.

I have offered opportunities for vitalising privacy and celebrating its distinctiveness when security and privacy professionals share the table. Perhaps these opportunities can give rise to a larger discussion about how we can learn more from each other, compare dictionaries and refine our techniques for influencing good decision making.

~~~ An earlier version of this column first appeared on 1 January 2020 in a Demystify Cyber guest blog series curated by Amanda-Jane Turner, author of Unmasking the hacker: demystifying cybercrime.

www.linkedin.com/in/nicole-stephensen-privacymaven
