Q&A with Privacy Commissioner John Edwards


Before he heads off to the UK, Shenagh Gleisner talks with Privacy Commissioner John Edwards about the challenges he sees to having a good privacy system.

What is the biggest challenge to privacy that emerging technologies have created in the last few years?

The biggest challenge facing people working in the public sector is to make the most of emerging technologies, such as artificial intelligence, facial recognition technology, biometrics, encryption, and digital ID, without causing harm.

When implementing this sort of technology, there needs to be certainty that privacy concerns are factored in from the start. I am concerned about vendor-driven technologies that can be solutions looking for problems. New technologies should be subjected to rigorous privacy impact assessment before implementation, and if you are commissioning your own software builds, insist on privacy by design.

A good example of effective design is the Integrated Data Infrastructure run by Statistics NZ. The quality of the design shows the value of public datasets for research.

What other challenges are there, current or future?

There is an ongoing cycle of demand for more and better information sharing, but unfortunately, it is sometimes ill-disciplined and poorly thought through. The best way forward is to ensure you have a clearly articulated business case that spells out exactly what information you need to share, why you need to share it, and who will have access to it. With an increasing amount of data being recorded by the public sector, a good point for people to keep in mind is this: just because you can share information doesn’t always mean you should.

Encryption is an essential, privacy protective technology, but it is also a growing challenge for law enforcement and the intelligence community. This is a worldwide problem, and New Zealand will ultimately be a “taker” of the solutions developed in other jurisdictions.

You have been very busy responding to privacy issues raised by COVID. Has COVID changed the game from a privacy perspective, or has it accentuated issues already present?

I think, in general, New Zealanders held a genuine respect for privacy as a starting point before COVID. Take the shared benefits of people using the COVID app, for example. We tend to trust that if we share our personal information, it will be used for the reason it was collected and nothing more. There is that sense of a social contract, and the community’s trust is key. Likewise, the pandemic brought to light how the government’s respect for people’s privacy is crucial to winning their trust.

Our shared response to the disease has seen parts of the public sector come together to achieve an immediate shared goal, which is particularly seen in the public health service. It’s been great to see so many public servants have confidence that New Zealand’s privacy framework is flexible enough to achieve their policy objectives. The general rule of thumb is that privacy rules set the framework for how personal information can be shared, rather than preventing sharing outright.

I think you are beginning a project on Te Tiriti cultural perspectives and what this means for privacy in Aotearoa. What do you think are some of these unique Māori perspectives that will shape approaches to privacy in the future?

We need to think about how the privacy framework can be used to help Māori achieve their objectives. There is a real challenge and opportunity to frame aspects of data protection through the communal, rather than individual, perspective te ao Māori brings.

There is a growing consciousness of Māori data sovereignty, which sees indigenous assertions of rights over data, with government in a kaitiakitanga role. There is room to recognise collective, as well as individual, rights, but we need to work through this carefully in full partnership with Māori so that we understand and properly discharge our Treaty obligations.

People, processes, and culture matter so much for privacy. A good data culture in the public service? Give us a mark out of 10!

I’d love to give us all 10 out of 10 when it comes to safeguarding people’s data, but the truth is there are pockets of nine and pockets of three out there.

The public sector is constantly developing; technology is changing and advancing, and the best thing we can do is keep privacy front-of-mind. Protecting people’s private information requires constant vigilance. We should think of it in the same way that we continually monitor health and safety in the workplace.

You have handled the Waikato DHB issue and, in particular, have expressed outrage at Radio New Zealand. What is your overriding message to public service leaders from this experience?

The hacking of sensitive patient data from Waikato DHB, which was then dumped on the dark web, was one of the biggest breaches of privacy ever in New Zealand. I was very disappointed that Radio New Zealand then saw that information as a legitimate source for news stories.

The lesson for the public service is to really prioritise cyber-security and a culture of respect for the information it holds. When things go wrong, you need to take steps to minimise the damage. In the case of the Waikato DHB, that would have involved seeking court orders preventing others from accessing and using the stolen material as soon as it was known to be publicly available.

The other message is not to underestimate the potential for harm. A cyber-attack that immobilises a large health provider is literally a matter of life and death. Recognising the gravity of the consequences should inform discussions about how security and a privacy culture are resourced.

The best solution is for privacy to be part of the planning of any new or updated product, service, system, or process. Privacy considerations should help drive the design from the start, ensuring broader protection rather than being loosely bolted on afterwards.

Thinking about the performance that must improve in this area, what do you believe is the driver of the poor practices you see?

Apart from the underinvestment in IT I’ve referred to, I think that historically there has been too much of a compliance approach to privacy – a check-box culture.

We are really talking about values – of respecting the people we are serving and the information that has been entrusted to us. If we can succeed in internalising those values in the organisation, and in our staff, legal compliance is more likely to follow.

What is the area that attracts the biggest number of complaints and concerns about public service handling of private information?

The most common area of complaint is when people are not able to access their personal data because it is being blocked or delayed. Systems have often been designed to suit the organisation holding the information, not the public wanting to access it.

What are the most common breaches and what is vital to put in place to reduce these breaches?

The most common types of privacy breaches come simply from carelessness with emails. Too often we hear about people putting the names of others in the CC (carbon copy) section of an email rather than the BCC (blind carbon copy) section, so everyone can see who got the email.

This sort of breach is both a human carelessness problem and a design problem. IT managers can configure systems to reduce privacy problems, perhaps by adjusting the layout of the software or by adding warning prompts. This is part of privacy by design.
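The kind of warning prompt described above can be sketched as a simple pre-send check. This is an illustrative example only, not any real mail system's API; the domain name and wording are hypothetical:

```python
# Minimal sketch of a "privacy by design" mail guard: before an email is
# sent, warn if several external addresses sit in the visible CC field
# (where every recipient can see the others) instead of BCC.
# INTERNAL_DOMAIN is an assumed, hypothetical agency domain.

INTERNAL_DOMAIN = "example.govt.nz"


def cc_warnings(cc_recipients):
    """Return warning messages for a proposed CC list.

    Flags the breach pattern described above: multiple external
    addresses placed in CC, exposing recipients' identities to
    each other.
    """
    external = [addr for addr in cc_recipients
                if not addr.lower().endswith("@" + INTERNAL_DOMAIN)]
    warnings = []
    if len(external) > 1:
        warnings.append(
            f"{len(external)} external addresses are in CC and will be "
            "visible to all recipients - consider moving them to BCC."
        )
    return warnings
```

In practice a check like this would hang off whatever pre-send hook the organisation's mail client offers; the point is that a small design intervention catches the mistake before it becomes a reportable breach.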

In terms of numbers of breaches, in general, we have seen an increase since the Privacy Act 2020 made it mandatory to report serious breaches. Mandatory reporting means telling our office as soon as practicable if there’s been a serious privacy breach. It doesn’t mean telling us after the dust has settled.

You are off to the UK – are there aspects of the privacy environment in New Zealand that you will be hoping to take with you?

I will listen carefully to understand the UK’s experience of their privacy law. I have the impression that they feel their privacy environment has been imposed on them by the European regulatory system. In that sense, it is an exciting time to be supporting them in deciding what the best system will be. I would always encourage openness and transparency, but I don’t have preconceived ideas of how I will carry out the role in practice.

I hope I will be able to take the Antipodean pragmatism that responds proportionately to regulatory challenges.

Imagine you come back to Aotearoa New Zealand in five years’ time and there is a dramatic improvement in the approach to privacy in that time. What would you see?

I know what I’d like to see: all data systems designed around the citizen, not around the organisation. Citizens should be in charge of their data, feel trust in it, understand how it is held, and feel empowered to make informed decisions about how it is shared.

Let the people decide what happens with their personal data and they might well give permission for it to be shared in ways that organisations too often try to prevent.

Do you have a personal message to New Zealand public servants before you go?

It’s been an enormous privilege and pleasure to have been Privacy Commissioner for the last seven and a half years. I’ve had a lot of support from across the public service. I think we can be proud of our system of public administration. It is full of people really trying to change things for the better. I’ve found that things work best when the different agencies and actors understand and respect the different roles we all play, whether we are ministers, members of parliament, statutory officers, or public servants. We have really strong institutions, and that allows us to have difficult conversations and disagreements, without compromising or undermining the integrity of those institutions and systems.