Video transcript: Trust and transparency: Being good privacy managers
Title: Trust and transparency: Being good privacy managers
John Edwards, Privacy Commissioner
I thought I’d better come up with the goods for you, so, something new: I googled the topic. What is this privacy business? And, as you know, Google offers its suggestions based on billions of searches. When I started typing ‘Privacy is…’, Google thought I was going to say ‘dead, get over it’. Or is it ‘important’? Is it ‘theft’?
This is actually the second slide. I did another one that said ‘Privacy is a fundamental human right,’ and that’s my preferred answer.
It is. Privacy is relevant, it is alive. I’ve been hearing ‘Privacy is dead’ for 40 years.
There was a Newsweek cartoon cover with ‘Is privacy dead?’ in 1970. Privacy is not dead, let me tell you that. It is changing, and it’s becoming more complicated. And some of the reasons for that have been alluded to today.
We have smartphones that have transformed our lives and made it much easier to access services and get information, but they also know our every move. They know so much about us, and we can’t control that data. We are in an era now where we’ve moved from a series of transactions involving the volitional exchange of data for goods and services, where you make an assessment, a notional value proposition of ‘here is the service, here is the information I have to give over in order to get it,’ and weigh that balance, to a world that we move through, generating data unconsciously all the time. So that’s quite significant and involves some challenges. I talked about the smartphone and how we don’t even know what data we’re generating.
I heard from Apple, at a conference I was at a couple of weeks ago, that different apps access different functions in this machine [holding smartphone]. There’s an accelerometer in here. Does anybody know what that is? You’re a few steps ahead of me if you do. It is what tells you how many steps you’re walking a day, alright? It tells you whether you’re pointing north, those kinds of things. But researchers have discovered that by using the data from the accelerometer in your phone, they can figure out your passwords, because every time you tilt your phone to enter a new number the accelerometer is collecting data about the motion of that phone. So there’s a data source that was put there for one purpose, and an application of it, with significant security and privacy implications, that nobody ever thought of. It’s a by-product. So we’re in this world of unintended consequences. We have these wonderful cars that have become more and more intelligent. They know where to go and how we drive, and who are they communicating that with?
The internet of things, all these connected devices. Pacemakers, for example. There was a guy recently in the States whose pacemaker data was subpoenaed by the police as evidence that his story about his escape from his burning house did not stack up, because you wouldn’t see that pattern of cardiac rhythms from somebody in a panic, throwing their most precious possessions out the window and then following them to escape an accidental fire. He did it. And he did it very calmly.
One of the most significant challenges is what both Martin [Matthews] and Lewis [Holden] adverted to: these enormous resources of data that we’re only beginning to understand and to tap. These are public assets. Right from the beginning of my appointment I contributed to the public conversation about big data by saying, ‘Not only is it OK that you use big data for analytical purposes and to understand and monitor the efficacy of your services and products, we as taxpayers expect it.’ It is a public asset that is there to be used, and for the value to be obtained for the whole of the community. So have no doubt about that.
But, in doing that we need to do so deliberately and cautiously, and with full understanding of the possible consequences, the possible effects of algorithmic bias, and, you know, of distributive inequality.
There’s a wonderful example out of Boston: a local authority created an app for people to report potholes. They could just tap the app, a crew would come round and fill in the potholes, and everyone would be happy. And the app showed that potholes predominantly existed in the affluent areas of town, and those were fixed promptly. Of course, in the poorer areas, where smartphone ownership was far less universal, the roads were going to crap but nobody noticed, and the resources weren’t applied there. So you do have a tendency sometimes towards a false confidence in the technology, a false confidence in the algorithms, and to build in some of these biases that entrench disadvantage in society. And those are privacy issues as well.
So why do you care about privacy? I know you’re here, I know you care. You’ve invited me, and I’m very grateful for that. You’re listening attentively after a wonderful lunch in a beautiful venue. But I actually don’t mind whether you’re listening to me because you have internalised privacy values, agree with me that privacy is a fundamental human right, and want your organisation to behave in the most appropriate way in accordance with that value system. I applaud you if you’re here for those reasons; I support you and am unlikely to come after you with my big stick. Or you could be trying to avoid adverse consequences, the reputational harms that I’m going to talk about in a moment. Absolutely fine; you might not care about privacy yourself at all.
You might be one of these ‘privacy is dead’ people. Or you could simply be one of the many people in the audience saying, ‘Look mate, it’s great, love your work, but privacy is just one of a million boxes I’ve got to tick, like health and safety. Just tell me the minimum I have to do.’ That’s fine too. We can do that. So I don’t mind which of those brings you here, but I want to tell you that privacy is something that more and more people in the community are concerned about. Our polling shows us that. The media tracking shows us that. We see privacy issues reported daily. And we see, not only domestically but internationally, an increasing recognition that these concepts of trust and transparency and privacy are essential for getting the kind of productivity gains that the digital economy promises.
I was at a conference last year in Cancun, Mexico: the OECD Ministerial on the digital economy. They said trust and security are an essential precondition to obtaining the benefits of the digital economy, and respect for privacy is an essential precondition to the maintenance of that trust and confidence. I heard the same thing from the International Telecommunication Union, which held a conference on the same topic, security, privacy, and trust in the digital economy, in Tunisia. The World Bank is another. You wouldn’t expect it to be out there extolling privacy values, but it has said privacy is one of the key things to get right in order to maximise trust in financial systems, in world banking systems. And to reduce regulatory drag it’s important that we have a common approach and a common understanding in the regulatory system.
So it’s important stuff, I think. And we all sort of mishear and miscategorise privacy. Sometimes you’ll be thinking privacy means ‘I can’t tell anyone anything.’ Well, that’s not always true, though sometimes it does mean that.
Many of you might think privacy is about security. And you’re partly right. There are twelve information privacy principles. I’m not going to go through them all, because you’ve probably heard those talks before. But one of them says your obligation as an agency is to take such steps as are reasonable in the circumstances to ensure the information is protected against loss, unauthorised access, and misuse. And the consequences under the Privacy Act are probably the least of your concerns if you get that one wrong.
We see every week new examples of security failures. And in the public sector they have the potential not only to undermine public trust and confidence, but actually to undermine democratic institutions.
Let me tell you about a recent example in the Philippines. Seventy million voting records were hacked, accessed by a malicious agent.
We’ve got, you know, enormous loss of value to commercial enterprises through hacking operations. You heard of Ashley Madison? How many people here had to change their passwords on their Ashley Madison account? [Laughter from audience.] I can’t see any hands up.
I mean, there you had an unethical and horrible organisation, but an enormously valuable one. The breach wiped $270 million off its value, because of a failure to properly attend to privacy.
We have a real tension: we’ve got to get the value from data, we’ve got to embrace transparency in the public sense of increasing accessibility to large data sets, but we have to do that carefully. Otherwise we see problems like in Australia, where medical data sets have been released and then re-identified, and public trust collapses. We’ve seen the same in the U.K. with Care.data, an initiative again to release the value of data, but one that did so without first taking steps to ensure the information couldn’t be re-identified. And if you want to think about risk, and I think you all should be thinking about how you manage risk and how you measure privacy maturity in your organisation, think about what can happen in your organisation.
And let me give you a couple of examples. I’ve taken to saying that you could spend hundreds of thousands of dollars, or millions, as ACC has done, for example, to improve your privacy maturity, to reduce the risk of the kinds of crises of confidence that a significant breach can bring. But the risk will remain in, I hate to say it, probably your lowest-paid person. OK? Every week I see a new privacy breach, and it’s very seldom cohorts of Russian hackers. It’s more commonly the person from reception who has been asked to send off a letter, who goes to the printer to pick up the document, accidentally scoops up the thing that somebody else sent a minute before, and sticks them both in an envelope and sends them off. If I could wave a magic wand and get rid of risk for you, I’d probably do it by making all your Excel spreadsheets disappear.
There was a wonderful example in a local authority where a guy rang up and said, ‘I’d like to make a complaint please,’ and the telephonist said, ‘Well, you’ll need to make a complaint on our form. You’ll have to fill it in.’ He said, ‘Well, can you send it please?’ She said, ‘Yes, certainly.’ Helpful soul. She dragged and dropped the document into the email and sent off to that person the entire database of complaints that had been made to that local authority, including the full details of the person making the complaint, the person complained against, the nature of the complaint, and so on. So these things can have a really undermining effect on people, and they can interfere with you getting those benefits from the digital economy.
Engaging online is a really efficient way of doing business. We know this. But if you can’t maintain people’s trust and confidence as you do it, then you won’t get those benefits.
One of the ways that you can approach the risk in this area is by undertaking proper privacy impact assessment, and understanding the threshold where such a process or methodology might be useful. So are you changing a business process in a significant way? Are you installing new technologies? Are you doing something in your organisation that is going to change the flow of personal information in and out of the organisation? If you are I think very often a privacy impact assessment will be a useful way of methodically identifying, assessing, and mitigating the privacy risks that those might involve.
We see risk also in the rush to realise value, I think, from our public data. I am, as I said, and reiterated time and again, I am very supportive, it is inevitable, it is a responsibility to get the value from that data. But let us be very careful about how we go about that.
We hear government talk more and more about this concept of social investment, which depends on the smart use of data. We saw an effort by the Ministry of Social Development to realise that policy objective and aspiration by switching the contracts under which the Ministry spends $330 million of taxpayers’ money in the community through non-governmental organisations. And the Ministry said to the non-governmental organisations: ‘We’re going to be doing this social investment. We want to monitor and evaluate the programmes. This is important. We want to make sure everybody’s getting what they’re entitled to, everybody’s getting the services they need, and you must give us client-level data for every person you see that we fund. Now if you don’t, you can still give them the services, but we’re not paying for it.’ And that sudden change of policy caused some alarm for a number of those NGOs, who asked me to have a look at it. I undertook an inquiry and released the results a couple of weeks ago.
What we found was that the Ministry was well intentioned. There are a lot of conspiracy theories around, I think, in the NGO community: ‘This is just a tool to defund our services.’ ‘They’re wanting to crank down on costs.’ That was the kind of anxiety behind the NGOs’ misgivings. In fact, I don’t think that was the case at all. But the Ministry had failed to adequately explain the purpose for which the information would be used. That meant the NGOs could not adequately explain the purpose to their clients, and that trust relationship was essential.
One of the aspects of that that I was concerned about was the fact that the objectives of the Ministry to improve social outcomes for everyone were laudable, but the cut-off to say, ‘We are not going to fund services for people who refuse to give their data,’ I thought that could actually lead to unintended consequences with the opposite effect. If you have people coming to Rape Crisis, or Women’s Refuge, whose first interaction is to be confronted with a form saying, ‘We’re here to help, and we’re going to pass your information on to the Ministry of Social Development,’ that might well drive some of those people away from that service. That was one concern.
But the other concern, because we know many of these services would continue to be provided even in the knowledge that they wouldn’t be funded, is this: if you cut out a whole cohort of people who are frail, who lack trust in government or feel vulnerable, and who value their privacy while still wanting to lift themselves out of the situation they’re in, then the data that is being used to inform social policy is skewed.
You have simply cut out of the equation a whole cohort of people. And I don’t think government could have intended to do that. I think the failing in this case was a failing of imagination, a failure of methodical assessment of the possible consequences of embarking on this policy, and a failure of engagement, because trust and transparency do require engagement.
If you want to have a look at this report, if you and your community or your management team are keen to look at how you enter into these data arrangements (I know there are a lot of place-based initiatives where coordinated services are being provided to specific cohorts of individuals, and that requires the sharing of information), have a look at our report. I think you might find it useful. It might help you to avoid some of the pitfalls. And we know that government more and more wants to bring your different services together and get you to work together, and that just makes sense; there is no impediment in the Privacy Act to doing that. But you can do things consistently with the law and still in ways that freak people out and have these adverse consequences.
So I think business process design, system design, really understanding what you’re trying to do, is really important.
Transparency, from where I stand, is about ensuring that you are open with the people you’re dealing with about what happens with their data. And we see different levels of engagement with this issue. We saw that MSD was not able to do that very well because it didn’t have a clear and plain story to tell the NGOs, who could then repeat the story to individuals, about what was going to happen with their data and why, and what was off the table, for example. There was a failure to understand that, although I do not doubt the motivations of that organisation, they are trying to help, and thousands of its people go to work every day to try and help people, that is not the experience that many broken and isolated people in our community have of it.
And understanding that’s really important, because they bring their history of experience to that engagement and they say, ‘OK, you’re the person who could take away my benefit, you’re the person who could remove my children, and you’re saying you need to take all my data, even though I just want some budget advice.’ So that transparency is important. We see transparency in a number of different ways.
Transparency reporting was something that we had a go at a couple of years ago. In the wake of the Edward Snowden revelations about U.S. intelligence agencies’ access to online platforms and databases, many of those platforms and tech companies began a process of transparency reporting, which means that at various intervals, annually or six-monthly, they report the number of times particular government agencies came to them and asked for personal data. I think that’s really useful. It is, of course, not optimal, because I think the government agencies should be the ones reporting on that; they’re the ones with the data. But in the absence of that, I think there is some value.
Trade Me does it, for example. Trade Me releases reports showing the number of times the Police or the Ministry of Business, Innovation and Employment come looking for customer data. Every year you can go and have a look and see: OK, the SIS asked for information twice; the Companies Office was asking for information. It does a couple of things. One, it acts as a sort of check on the enthusiasms of those agencies. The other is that it can actually reassure people that the government agencies are not as voracious as you thought.
So we thought we’d do a trial and see whether we could support transparency reporting as a public benefit, and we got a number of corporations in the financial services, telecommunications, and utilities sectors to make a record, over three months, of the number of times that government departments came looking for their customers’ data. It was quite interesting. A really small segment of the economy, over a short time. We only had two banks, I think, in the sample, a couple of utility companies, and one telecommunications company.
In that period, we found 11,000 instances of government agencies seeking customer data from these organisations. Guess how many of those were the SIS? I don’t know. I probably couldn’t tell you if I did. We didn’t collect SIS figures in that trial. But the vast majority, over two-thirds I think, were Inland Revenue. MSD was high up in there. Police were executing search warrants in about 2,000 cases, and asking agencies to voluntarily disclose in about a thousand situations.
So this is quite useful I think, quite valuable. And if we did expand it to intelligence agencies the community would see that, in fact, the fears of so-called mass surveillance are not as real as you might believe if you subscribe to certain blogs and other news sources.
I think I’ve got to catch up a bit of time, haven’t I? Do you want me to finish at three if I can? OK, another eight minutes. What will I tell you?
Here’s a picture of a disaster. We’ve all seen privacy disasters; I mentioned Ashley Madison, I mentioned ACC. ACC’s email fail cost it millions of dollars, a minister, a chief executive, a chairman, three board members, and a whole lot of consultancy fees. Now, they have improved. I want to give a shout out: it’s an organisation that has really transformed its privacy maturity and is now an industry sector leader. They tire of being used as the example and the whipping boy, so I want to be even-handed and say they’ve taken the years between that breach and now and really transformed themselves.
I mean, my mission, as I’ve said since I was first appointed, is to make privacy easy. If you want to hear about what you should do, what tools there are, I’m here to help. But I also have to play a sort of regulatory role, and I don’t have that many regulatory tools. I try to use the tools that I’ve got in the most effective way that I can. I’ve added a couple, just because the law reform process, which I’m going to come to in a moment, is not moving at the pace that I think is warranted in this rapidly developing technological society. So I initiated a naming policy: identifying non-compliant practices and publicly naming the organisations responsible.
One of the first companies I deployed that against was Veda, which was overcharging people for their own credit information. Here is an organisation whose whole business model is built on using other people’s data, and it was ripping people off when they wanted to see what it had on them. I thought that was morally and legally wrong, so we identified them and made it very clear what the law is. You may hear a sequel to that, but I won’t say anything more about it.
Another example that we had, and I’ll go through it because I think it’s quite an instructive one: first, because it goes right to the heart of human dignity. We’re not talking here just about abstracts; we’re talking about actual people’s lives. And also, the case I’m going to mention is important because it talks about the intransigence of a business system. The system is not there for the system’s sake.
The system is only useful insofar as it delivers for the people of New Zealand. And I didn’t name Immigration New Zealand in this to beat them up. This is not a shaming process. I wanted to illustrate this case to show some important values. And in fact, the facts of the case would have revealed the identity of the respondent anyway.
But we had a case where an Ethiopian refugee came to New Zealand as a young person, with inadequate records of identity and birth. The family member who came with him asked the local village for his birth date, and they gave him a day; there was probably some confusion. When he gets here he starts getting older; he’s much bigger than his classmates, his peer group, more mature. A bone density scan suggests that he’s probably three years older than the birth date recorded on the Immigration New Zealand information system. And there is in fact a consensus of health information that this guy is three years older.
Now when you’re fifty-one, like I am, forty-eight or fifty-four doesn’t make much difference. But if your official age is fifteen and you’re really eighteen, that makes a huge difference to your ability to access a whole range of public services and rights in society: voting, educational benefits, support from the state, a whole lot of things. And the guy, with his supporters and paediatricians and the like, went to Immigration New Zealand and said: under the Privacy Act you’ve got this information wrong, you’ve got the date of birth wrong, change it. And they said: no, you can’t change a date of birth. There was this kind of Kafkaesque situation this person found himself in. Everybody kind of accepted that the date on the records was wrong, but he couldn’t prove a date that was right. So, no change. This bureaucratic nightmare left the person in limbo, in an ill-suited peer group. And we were able to achieve a satisfactory settlement of that.
You know, we don’t have that many tools in our regulatory toolbox yet. When we get to the end of an investigation we refer a complaint to the Director of Proceedings, who may or may not bring proceedings in the Human Rights Review Tribunal. And the Human Rights Review Tribunal can award damages. For years and years it bubbled along: ten thousand here, fifteen thousand there, twenty thousand. It got to a peak of forty thousand dollars in damages in the 1990s, awarded to a woman called Paula Hamilton, a model who checked into a place called The Deanery in Christchurch for treatment of alcohol dependency. And they thought: ‘Wow! English model! Marketing opportunity!’ And disclosed a lot of details about her. Pretty appalling. I think they went bust and never paid.
Anyway, we then came along to the infamous Facebook cake case, I don’t have a slide of it, called NZCU Baywide, a decision issued in 2015, in which a woman who was working for this company NZCU Baywide didn’t have a happy working relationship, ended her employment, had a tea party at home, made a cake, and iced the cake. And on the top of the cake she proudly iced: ‘Fuck you NZCU Baywide.’ What was that last word, Gary? And an even worse profanity to finish. She was quite proud of her icing, took a photo, and posted the picture of the cake on her Facebook page. She had only a very few followers, friends, that she allowed into her Facebook space. But word got out, as it does in a small community. The boss at NZCU leaned on one of the junior staff and said: ‘Friend her on Facebook, I want to see what’s going on.’ The guy made a friend request, it was accepted, they got access to the photo, and they screenshotted the cake. Then the HR manager, no less, sent copies of the photo out to all the employers in town saying: ‘This is the kind of woman you’d be dealing with, do not employ her.’ And the Human Rights Review Tribunal excelled itself and ultimately made an award of damages to her of $168,000, nearing the top of the then jurisdiction of that tribunal.
We are hoping to see some law reform. And let me just quickly tell you what we expect to see. Round the world we see an increasing trend towards mandatory data breach notification. That means when you stuff it up, when you do send that wrong email, when a hacker does get into your system and you detect it, you have an obligation to notify the regulator – i.e., me. If it’s a serious breach you may then have an obligation also to notify the affected person.
If you need evidence that this is necessary, think of Yahoo!, which had two hacks: one of half a billion email records, one of a billion. I don’t think there’s much doubt that they knew about those hacks two years before the information became public. Now that is appalling. It means that those accounts were compromised, that a billion records were available on the dark web, for two years. Whereas if they had notified those subscribers, people would have been able to change their passwords and take steps to protect themselves. Take the steps that Yahoo! had failed to take.
So I think mandatory breach reporting is inevitable. I hope to see, also, compliance notices giving me a few more teeth.
In response to the criticism that I’m a toothless watchdog, I typically say I can exert quite a lot of pressure with my gums. And it’s true; I think some people have experienced that. But it’s not enough in this international environment where data knows no bounds, no borders. I’m hoping that we’ll see compliance notices, meaning some effective and timely mechanism for me to enforce the Privacy Act. Access determinations will be a similar mechanism, where I should be able to give effect to people’s rights.
I’ve also gone back to the Minister of Justice, at the end of last year, and said: the Law Commission report on which your commitment to law reform is based came out in 2011, and a lot has changed in the last six years; let’s have another look at fines. And the Minister has agreed to have a look at giving me the power to apply for fines in the High Court, and I welcome her engagement with that. We’re looking at a number of other issues, such as a policy prohibiting the re-identification of de-identified data. If we want to put data sets out, if we want to get the benefits of big data, how do we deal with the risk that in six months’ time there might be technology to take two de-identified data sets, put them together, triangulate, and re-identify people? There’s a whole lot of things there. I heard Liz MacPherson invoked before. She is taking leadership in this area, but I wonder whether there is also a role for a sort of backstop.
If all the standards and the guidelines and the leadership fails, do we need a regulatory prohibition on misuse of data of that sort?
I do want to support you to do the right thing. You will find an enormous range of resources on our website, privacy.org.nz, including the case-by-case Ask Me: if you’ve just got a question that comes up in a compliance sense, go there. We have a knowledge base that is actually sector leading, and it is improving all the time. You ask a question and you will be taken to an answer. If you aren’t, we will harvest your question at the end of that week and write an answer. So the next time you come back to it, it will be there.
So we’re learning. It’s not quite as intelligent as I’d hoped when I commissioned the work but I was told that what I wanted didn’t exist, and I would have to pay half a million dollars, and I wouldn’t have been able to justify myself to Martin [Matthews, Auditor-General] if I’d done it.
So we’ve got second best, and it’s pretty good, and it’s getting better all the time. So use it. Also, when I came into the job I found that we were training people on privacy. And we’re really good at privacy: we know the law, we’re full of lawyers. But we don’t know much about education. So we partnered with LearningWorks in Waikato to put all our materials into a sound educational, pedagogical framework. They know education, we know privacy, we came together, and we’ve now got these free tertiary-level courses that you can ask all your staff to do, to raise your privacy maturity.
Then when I come knocking with my gums and say, ‘What have you done to ensure privacy?’, you can say, ‘We’ve put everyone through your course.’ We’ve launched a whole lot more this week. The Privacy Act covers the whole economy, a vast range of activities. Many of you will have very little in common with each other. The one thing you will all have in common is that you all have employees.
And there are common issues with employment and privacy, and we’ve just launched our privacy in employment guideline. So I urge you to go on [our] site and have a look at it.
Title: For more information and to download presentations, visit www.auditnz.govt.nz