IEEE Digital Privacy Podcast Series: Episode 7

 

A Conversation with Greg Adamson
IEEE Society on Social Implications of Technology, VP Technical Activities

Listen to Episode 7 (MP3, 27 MB)

 

Part of the IEEE Digital Privacy Podcast Series

 

Episode Transcript:

Brian Walker: Welcome to the IEEE Digital Privacy Podcast Series, an IEEE Digital Studio production.  This podcast series features conversations with industry and academic leaders, as well as key stakeholders of digital privacy, in order to help advance solutions that support the privacy needs of individuals.  In this episode, Greg Adamson, a Chief Information Security Officer in the Australian transport sector, discusses the evolution of technology, regional differences, and standards development efforts underway to help advance the digital privacy space.  Greg, thank you for taking time to contribute to the IEEE Digital Privacy Podcast Series.  To get started, can you please introduce yourself and share a little information on your background?

Greg Adamson: My name’s Greg Adamson.  I’m based in Melbourne, Australia.  I’ve been volunteering with IEEE for nearly 20 years, and I’m a past president of the IEEE Society on Social Implications of Technology.  I am also an active volunteer with the IEEE Standards Association, working on two Industry Connections activities.  One is meta issues in cybersecurity, looking at doing cybersecurity better, and the other is dignity, inclusion, identity, trust and agency, looking at the issues that prevent technology from being adopted effectively and respectfully.

Brian Walker: So Greg, how did you first become involved with digital privacy?

Greg Adamson: So the first time I looked in detail at privacy was in the late 1990s, when I was part of a group called the Internet Societal Task Force, within the Internet Society.  It had been set up by Vint Cerf, and we had a number of discussions about privacy and what privacy meant, and I put up my hand to collect together all these ideas, because people had been considering privacy from a technical point of view, a legal point of view, the evolution of privacy and so forth, and I thought, “There’s a lot here, and I’ll collect it all together and I’ll provide it to people so they can think of it in a single way and work out what to do next.”  So I collected together all the references to privacy over the discussions of the previous few months in the Internet Societal Task Force, and what I came up with was a list of more than a hundred-- I think it was 104-- different ways in which privacy was being considered in the discussions, and I regret to say that having done all this work and presented it to everybody, the effect was to chill the discussion.  People looked at this list of 104 things and said, “Oh, goodness me, what can we actually do?  This is too complicated,” and they moved on to another discussion.  That was back in 1999, and that’s when I realized that privacy not only means different things to different people, and has different areas of importance for different people, different communities, different cultures, but is just incredibly complicated, and that is a problem.  It’s a problem that it’s complicated, but it’s also a problem that when people approach the area of privacy, quite often as they’re going up their learning curve they’ll start off thinking, “I think privacy is important because people’s privacy should be respected,” some sort of simple principle like that, and then they discover that there are so many layers and so many aspects, and there are so many discussions, and everything they talk about has probably been considered for hundreds or thousands of years, and it becomes overwhelming, and it becomes difficult to frame a conversation about privacy.

Brian Walker: How would you view the evolution of digital privacy, say, over the past 20 years or so?

Greg Adamson: So over 20 years the issues haven’t changed.  By and large, in all cultures people are concerned about privacy.  They may not use the word privacy.  They may have a different tradition.  People may come from a culture where everybody lives in a different room or a culture where all the family lives in one room, but nevertheless, the idea of a person having access to their private space, their private thoughts, their ability to protect their communication with people who are important to them, that’s common.  You find that everywhere.  The main change that’s occurred over the last 20 years is in the technology space.  It’s not that technology wasn’t contesting privacy 20 years ago, but that 20 years ago we had a lot of accidental protections.  I’ll give you an example.  If you went into a shop 20 years ago and there was a video camera, probably the video camera was making a tape of the people in the store.  There was a very large chance, maybe even a 50 percent chance, that the camera wasn’t working, that the tape hadn’t been loaded, that the system had broken some period before.  I’m not saying that’s a good thing-- if you had a robbery you wanted to have evidence of the robbery-- but it just meant that privacy was created by the accidental inefficiencies and inadequacies of technology.  What we’ve seen in the last 20 years is that those accidental inadequacies have been systematically removed.  Today the camera doesn’t store tape at the local shop.  The camera is probably transmitting the information in real time to some sort of data server in the cloud or elsewhere, which is very efficiently recording that and backing it up, and the possibility that you simply won’t capture millions and millions of images because the equipment isn’t working properly is pretty well gone.  What that means is that today we need to take full responsibility for what our technologies are doing when they’re collecting data, whereas in the past we could say, “It doesn’t really matter.  It’s probably not working,” and so forth, and I’d say that’s the biggest single change.  We now have technologies becoming omnipresent in the sense of completeness-- closure has been achieved by technology.  The only time that we will now have privacy is when we choose to have privacy.  So if we choose to have privacy at the ballot box, then we have privacy.  If we choose to have privacy in the bathroom, in a public toilet, then we have privacy.  If we don’t choose to have it, then privacy does not exist anywhere anymore.  It’s a little bit like our choice to have national reserves for forests.  Today all forests would be chopped down if we didn’t say, “We will protect this forest.”  In a similar way, all privacy would be gone if we didn’t say, “We choose to protect this privacy.”

Brian Walker: Greg, you’ve touched upon it, but can you speak in more detail about the regional differences related to digital privacy?

Greg Adamson: Continuing my previous point, I think what we see within Europe is a more consistent application of the intention to protect privacy, but I don’t think there’s any fundamental underlying difference between Australia, Europe and the U.S.  People have things that they want to keep private; people have things that they don’t care about.  Europe, through GDPR, which is also a basis for other legislation, has systematized certain conditions which provide a basis for privacy.  So, for example, when your data is collected by a corporation, or when you voluntarily give data and somebody’s holding that data-- it could be your health data-- what are their responsibilities?  In the EU we have a fairly systematic approach there.  If we look at the U.S., what we find is more of a patchwork of approaches.  It could be a state-- California, for example, is very active in considering the sort of things that Europe considers through GDPR-- or other privacy rights will be protected because they relate to health data, for example, but in the United States it’s a patchwork.  It’s a patchwork between states and federal, it’s a patchwork between purposes of data, it’s a patchwork between methods of collection of data, and it’s less systematic, and I think that makes it a bit more confusing and a bit more difficult to provide a simple way of understanding the protection of privacy.  In Australia, we’re somewhere between the two.  We don’t have legislation such as GDPR, but we do have our various privacy acts at the federal and state level.

Brian Walker: Greg, what’s currently driving digital privacy or digital privacy initiatives Down Under?

Greg Adamson: So in Australia, many decades ago, we had a controversy over the introduction of a single identity card, and for various reasons at that time Australians felt that that was a bad idea.  There were large demonstrations in the streets and a lot of concern on talk-back radio and so forth, and at the time the government took a strong decision not to create a single identifier and, where there were single identifiers, such as our tax file numbers, to have those controlled in a very strong way.  Since then, not much happened until 2022, the last 12 months, and then in a period of about 6 weeks we had two data breaches, one related to a major telecommunications company and the other to one of our leading private health insurance companies, and in each case the data loss equated to about 40 percent of the population of Australia.  So each of them involved about 10 million records on Australians, out of a population of about 25 million, and that was an enormous wakeup call.  Suddenly people were highly concerned; in relation to the medical data, for example, this created high levels of concern specifically about medical privacy.  These two events occurred in late September and October last year, just three or four months ago, and since then the environment has changed-- I think transformed would not be too extreme a term-- and all large organizations in Australia are now starting to think about the data that they’re holding, and not just how they can protect the data but whether or not it’s a good idea to hold the data at all.  I’ll give you an example.  If you provide evidence in order to identify yourself when you take up a service, by definition the documents that you’re providing as evidence, for example your passport or your driver’s license, can be misused.  If somebody gets hold of those, they can use them for identity theft, because they are documents that identify you, and so by definition if someone takes those documents they’ve stolen your identity, and they can commit identity theft against you and rack up credit against your name and all of those things that come out of that.  So the question is, once an organization has onboarded you, why do they need to keep that data?  Why do they need to keep copies of that information?  They’ve proved who you are.  They know who you are.  Why should they continue to hold that information, especially since if they do lose that information-- as happened in the case of the two organizations I mentioned before-- they suffer enormous cost and enormous public embarrassment?  So all in all, I’d say that the awareness of privacy issues in Australia is the highest at the moment that it’s ever been, and many organizations are looking for ways to address that.

Brian Walker: Greg, you mentioned in your intro that you’re involved in standards development.  Can you tell us a little bit about the activities or initiatives that are underway in this area?

Greg Adamson: So one concept that I like a lot is the concept of privacy-enhancing technology.  Privacy-enhancing technology, or PET, is an umbrella term to describe different technology approaches to address privacy in different circumstances.  So it’s not a single technology.  It’s not a single platform, like, say, a cloud, but a group of all technologies which fall into this category, and IEEE has been doing quite a lot of work on this over the last few years under its P7000 series of standards, which originally looked at the ethics of artificial intelligence and applying artificial intelligence in a meaningful, helpful, useful way, not in a negative way.  P7012 is a standard on machine-readable personal privacy terms.  Today, when I go to a website, I’m confronted by a 50 or 100-page document, which I’m not meant to read, saying that the website will sell my data and do anything it cares to do without me having any say-- I can just take it or leave it.  The idea of machine-readable personal privacy terms is that my machine-- my computer, laptop, phone, whatever-- can act on my behalf in negotiating with the website and say, “I do not want you to store anything about this session,” or, “I have given you information in order to subscribe to a newsletter.  I don’t want you to sell my email address to anybody else,” or, “I have booked a medical appointment.  I don’t want you to monetize my medical information,” and so forth.  This is an idea that’s been around for quite a long time, since the early 2000s, and Doc Searls, who’s well-known in the technology community, has been advocating it, and a few years ago we found the opportunity for SSIT, the Society on Social Implications of Technology, to work with Doc Searls and his colleagues to create the P7012 standard.  The International Standards Organization is also interested in reviewing that standard when it’s produced, so possibly that could be a very widely adopted, very successful standard.  Another area that I’m looking at in relation to privacy-enhancing technologies is personal data stores.  The concept of a personal data store is a little bit like a personal cloud, but with a personal cloud often the model is that the cloud provider monetizes the data that you store in your so-called personal cloud.  The idea of a personal data store is that you have the keys to your personal data store and nobody else does, so you decide when it should be shared.  You might decide you want to share your data with somebody, or share it with somebody for a fixed period, or share it with somebody and not have them reshare it, or share it with them and not have them print it, and so forth.  Once you have a platform such as this, it’s quite easy to place conditions on the sharing of data that can then be technically enforced, and the whole question behind this approach is: can companies provide data stores that aren’t based on monetizing the data?
Through the personal data store work-- some work I’ve been doing with colleagues in Indonesia on personal knowledge containers, and some work I’ve been doing on the storage of health data in Australia-- I would say that there’s a very strong opportunity to standardize this concept of a personal data store, and that’s something that I’m hoping will happen in the coming years.
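To make the two ideas Greg describes a little more concrete, here is a minimal sketch in Python of how machine-readable privacy terms and conditional sharing from a personal data store might look.  It is purely illustrative: the field names, the sharing grant structure, and the request_allowed function are hypothetical assumptions for this sketch, not the IEEE P7012 format and not the API of any real personal data store product.

```python
# Illustrative sketch only: the fields and logic below are hypothetical and
# are not drawn from IEEE P7012 or from any real personal data store API.
from datetime import datetime, timezone

# Terms my own device could offer to a website on my behalf.
my_privacy_terms = {
    "session_tracking": False,        # do not store anything about this session
    "resell_contact_details": False,  # an email given for a newsletter may not be sold
    "monetize_health_data": False,    # a medical booking may not be monetized
}

# A sharing grant a personal data store might attach to one piece of data.
sharing_grant = {
    "recipient": "example-clinic",
    "expires": datetime(2025, 12, 31, tzinfo=timezone.utc),  # share for a fixed period
    "allow_reshare": False,                                   # do not pass it on
    "allow_print": False,                                     # do not print it
}

def request_allowed(grant: dict, recipient: str, action: str, now: datetime) -> bool:
    """Check whether a recipient's requested action is permitted by the grant."""
    if recipient != grant["recipient"]:
        return False
    if now > grant["expires"]:
        return False
    if action == "reshare":
        return grant["allow_reshare"]
    if action == "print":
        return grant["allow_print"]
    return action == "read"  # reading is the baseline permission while the grant is valid

# Example: the clinic may read the record today, but may not reshare it.
now = datetime(2025, 6, 1, tzinfo=timezone.utc)
print(request_allowed(sharing_grant, "example-clinic", "read", now))     # True
print(request_allowed(sharing_grant, "example-clinic", "reshare", now))  # False
```

The point of the sketch is simply that once terms and grants are expressed in a machine-readable form, conditions like "for a fixed period" or "no resharing" can be checked and enforced by software rather than buried in a long terms-of-service document.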

Brian Walker: Thank you for listening to our interview with Greg Adamson.  To learn more about the IEEE Digital Privacy Initiative, please visit our web portal at digitalprivacy.ieee.org.