IEEE Digital Privacy Podcast: Episode 12

A Conversation with Matt Silveira, Privacy Practice Consultant, and John Wunderlich, Chief Privacy Officer, JLINC Labs

Listen to Episode 12 (MP3, 41 MB)

Part of the IEEE Digital Privacy Podcast Series

Episode Transcript:

Brian Walker: Welcome to the IEEE Digital Privacy Podcast Series, an IEEE Digital Studio production. This podcast series features conversations with industry and academic leaders, as well as key stakeholders of digital privacy, in order to help advance solutions that support the privacy needs of individuals. In this episode, we speak with digital privacy experts Matt Silveira and John Wunderlich. The two discuss current standards and initiatives in the digital privacy space and provide their insights on the evolution of implementing real-world solutions to digital privacy. Matt and John, thank you for taking time to contribute to the IEEE Digital Privacy Podcast Series. To get started, can you share a little information on your backgrounds?

Matt Silveira: I’m Matt Silveira. I’ve been working in the privacy space for the last 10 years, and prior to that I was focused on cybersecurity. I work as an adjunct instructor for local community colleges here, such as Los Rios, and, in the past, the University of California, Davis. I was the chair of the IEEE P7002 standard, which was published a couple of years ago, and I’m now a member of the IEEE Digital Privacy Initiative.

Brian Walker: And John, what about yourself?

John Wunderlich: As Matt mentioned, I was his co-chair for the P7002 initiative. I’ve been in privacy since about 2004 in various roles, starting in corporate, landing briefly in regulator and government health positions, but I’ve been a consultant for, oh, going on 15 years now. Before that, I came out of IT operations in a corporate environment. I’m a member of a number of initiatives, including NGOs like MyData Global, for which I was a founding board member, and the Kantara Initiative, where I lead a work group trying to create requirements for privacy-enhancing mobile credentials, and so forth and so on. Generally speaking, just a privacy curmudgeon.

Brian Walker: So let’s jump into the questions, and Matt, maybe you can take this first question. How is digital privacy being prioritized, and what types of initiatives are taking place currently?

Matt Silveira: The prioritization of digital privacy over the past five to seven years has really centered on moving from bespoke, customized governance, compliance, and sometimes engineering reviews to a more operationally organized and orchestrated approach, where there are dedicated teams, processes, and workflows to ensure that proper privacy engineering is in place, that proper reviews are done early in the product and service development lifecycle, and, more importantly, that the organization has the ability to track those controls and compliance and demonstrate that the product is fully functional with those privacy enhancements and safeguards in place.

Brian Walker: John, do you have anything to add?

John Wunderlich: Guess the only thing I’d say, I agree with Matt, I’d just encapsulate it this way. Privacy is moving from performative and compliance to performing and performance, and that’s a huge change that reverberates through any organization that’s processing personal information.

Brian Walker: So John, digital privacy’s been defined in several different ways. What are your thoughts on explaining digital privacy in context?

John Wunderlich: Well, I’ll start with context. Privacy is a very loosely defined word, and very much it depends on both what you mean by it and what a person expects, depending on context. My context for privacy when I’m out on the street or on a Friday evening in the entertainment district is a lot different than if I’m going to an Alcoholics Anonymous meeting in the basement of a church. My expectations in those contexts are radically different. That hasn’t yet translated effectively into the digital realm, where we’re speaking more specifically, usually, about personally identifiable information about a person and how it gets collected, used, and disclosed by various entities. In that space I think the context really should be defined by my personal approach to explaining privacy to people, which is: a person has privacy when they’re able to determine what information they share about themselves, with whom, and under what terms. And when you phrase it like that, it becomes an operationalizable, if that’s a word, way to approach your functional and non-functional requirements for designing a system that’s going to process personal information.

Brian Walker: What are some of the key challenges that you see in instituting digital privacy?

Matt Silveira: Yeah, I think it’s a great question. From the perspective of working with my clients in my consulting business, I’ve found that the first key challenge is helping them build a workflow and a model that’s sustainable and manageable, a business approach that suits their business and, more importantly, suits their targeted customer or service audience. Part of that challenge is making the review process effective, not just a check-the-box exercise, which, unfortunately, is a conundrum a lot of organizations are still trapped in. The second key challenge is to effectively translate those requirements into a functional, usable product that meets market expectations. In other words, nobody wants a creepy smartphone that requires them to constantly turn off privacy-intrusive options that automatically get turned back on every time the app updates. And last but not least among these key challenges is the ongoing maintenance and management of privacy programs. They tend to get leadership’s attention when there’s a crisis, and then, once the issue’s been tamped down, it’s no longer top of mind. The funding quietly gets dialed back, and the leadership commitment to staffing gets reduced. That, I think, is another key challenge. John, do you have any other points you’d like to hit on?

John Wunderlich: I think one of the big challenges is getting organizations, or staffing organizations, to disambiguate privacy from confidentiality or secrecy. Too often those things are conflated for technical or security purposes: “Well, we’ve got privacy because we have a good security program.” That elides the main point, which is that, from my perspective, security is the responsibility of people in an organization to protect the organization, whereas privacy is the responsibility of people in the organization to respect the privacy of the individuals whose data they process. There’s obviously a big Venn diagram overlap in what that means in actual work done in the organization, but it’s a significantly different lens through which to look at the two.

Brian Walker: So you both mentioned in your intro standards, in particular IEEE 7002, I believe, so can you get into a little bit of detail about the goals and the objectives of that standard?

John Wunderlich: Well, I’ll start, but I should defer to Matt on P7002 because he led that initiative. I’m active in a number of standards spaces, and one of the things standards give you is a common vocabulary. In the absence of a common vocabulary, you often end up with people shouting across chasms, using the same words to mean entirely different things. So the first thing standards do is provide that common vocabulary, and also an analytical framework for thinking about things. ISO, for example, has a privacy framework and a privacy architecture among a number of standards, 29100, 29101, and others, all in the same work group that handles security. Once you go down that road, it becomes easier to recruit, easier to train, and easier to have consistency across those domains in your organization.

Brian Walker: Matt, do you want to continue with that?

Matt Silveira: Yeah, I do. Thanks. Back to a point John was making earlier about context: a lot of the organizations I work with struggle to find and understand what context is, and standards are an amazing tool to help them along that journey of discovery, looking at privacy not necessarily from their own view but from others’, as John was explaining. The other benefit a standard provides is something that is traceable back to your organizational standards. We’re not suggesting a completely prescriptive model that you wholesale adopt or not. That’s never been the approach with standards in general, but it certainly allows that crucial traceability, so that, given people’s various levels of understanding and vocabulary, as John mentioned, you can point to that reference immediately and ground your position in it. And last but not least, standards help provide, and explain, some of the tooling involved in putting together a digital privacy program in an organization. A lot of that, to the uninitiated or the newly oriented, appears complicated, and sometimes appears to require expertise from outside the organization, but we really tried to make P7002 an approachable standard that helps people move into the hands-on, actionable digital privacy program that organizations need.

John Wunderlich: If I can just extend that. Talking about context and standards, take three contexts for digital personal information: health, finance and banking, and social media. The first two are highly regulated, and there are also standards, not privacy standards, but standards that the people who process the personal information are bound by. In health privacy, for example, the information is very sensitive, but at the same time there may be forced disclosures for particular kinds of things. I’m a Canadian, and I believe this is fairly standard: if somebody shows up at a hospital in Toronto and is found to have tuberculosis, that is reported to the public health authority. That’s a reportable condition. Similarly, gunshot wounds are reported to the police. So your privacy takes a back seat to public health or public safety in those cases, and doctors have an enormous amount of discretion in overriding your privacy preferences for health delivery purposes. In finance, you’ll get forced disclosures because of anti-money-laundering rules, AML, and things like that, but there’s no authority for your banker to decide what’s best to do with your money. And then finally, social media has no standards, <laughs> and there don’t appear to be any likely ones on the horizon, so the only thing left standing in that context is privacy law.

Brian Walker: And we talked about this on our call, our initial call, so privacy is handled differently across the world. So how do you overcome regional and geographical challenges or differences related to digital privacy? And John, this is for you.

John Wunderlich: I’m not sure that I agree with the framing of that question, because it presumes that you should overcome that. Countries have sovereignty, so you wouldn’t necessarily frame a question about tax law that way, right? How do we overcome the differences in tax laws? I used to work in a corporation that dealt with this: in the United States there are in excess of 4,000 taxing authorities when you include federal, state, and county taxes, and they’re all handled separately. A real income boost, or nightmare, for accountants, but it recognizes the sovereignty of the authorities to make those decisions. Similarly, different cultures have different approaches to personal information. An interesting conversation was reported to me recently: somebody was in the UK for a conference, and a person from India was there talking about their recently passed data protection law, in which they used the term “data principal” to refer to the individual, and there are a number of other differences. Some of the Europeans in the room were reportedly castigating India for diverging from the, quote, “gold standard,” unquote, that has been set in Europe with the General Data Protection Regulation. So, two things there. Matt and I have talked about this, and we both think that the GDPR is the last, best privacy or data protection law based on a conception of computing that comes from the 20th century, and it’s increasingly obvious it’s not fit for purpose for 21st-century uses. The other thing, as was pointed out to me, is that there was a certain colonial or historical view of, “Well, we’re Europe. We get the right to determine what’s correct.” So, on the whole question of whether there should be a universal data protection law? Probably not. We have the Universal Declaration of Human Rights, which has Article 12 about privacy and the rights of the person, and that’s probably sufficient.
I think each jurisdiction has to balance its own culture and history and capabilities to determine what’s right for it, and then those of us that have worked for global corporations have the interesting challenge of how do you tailor a global operation to meet those regional and reasonable requirements?

Brian Walker: So on a global basis then, how do best practices come into play as it relates to digital privacy?

Matt Silveira: I think that depends on your perspective. If you’re a business, you certainly are looking at best practices. You are looking for ways to identify overlap to optimize your privacy program. You are looking for ways to enhance your market acceptance, or to meet regulatory criteria that would otherwise bar you from a market. No question. On the other side, from the privacy regulators’ and consumer protection perspective, I think, as John was pointing out, there’s still a strong presence of regional or sovereign perspectives, unique requirements and unique regulations specific to Europe or the United States or India or China or any other nation that comes to mind, and that is a significant challenge today. Tracking those things and making sure that your product meets that minimum viable capability for privacy is a significant hurdle.

Brian Walker: Yeah, makes sense.

John Wunderlich: And I’ll just add this: I’ve provided advice or worked in Europe, North America, and Asia, and the one thing that is unanimous is that privacy, and implementing privacy operationally, involves fuzzy rules. It’s not cleanly demarcated what you can and can’t do, and when I’ve talked to engineering teams in Shenzhen or in the Valley or in Europe, they all speak the same language: don’t talk to me about fuzzy rules; I want checklists and clear demarcations. Privacy doesn’t work that way, because people are fuzzy. So it is, and will continue to be, a challenging engineering problem.

Brian Walker: So I know both you gentlemen are familiar with the IEEE Digital Privacy Initiative. What do you see the role of that initiative being to help advance the technology space?

Matt Silveira: Yeah, I see the role of the initiative as three things. First, it informs subject matter experts in a particular area, like connected vehicles or healthcare or energy, with a deep knowledge base, and provides access to other like-minded professionals and subject matter experts like John, to help them get a better grasp of the challenges of implementing privacy. The second benefit is that it helps level the playing field for privacy and differentiate it clearly from things like cybersecurity, a point John made earlier in our session; we help a number of people, as they join our group and go through this journey, to understand those vital differences. And last but not least, we’ve been creating a number of tools, informational graphics, training materials, and white papers that help our audience get a better understanding of not just how to implement privacy in their space, but other privacy concepts that touch on ethics and on things like AI. So it’s a very wide group in terms of knowledge and interest level, and I think that goes a long way toward getting privacy better established in organizations’ mindsets.

John Wunderlich: I might say this. One of the first artifacts out of the Digital Privacy Initiative has been some good research on expectations of privacy, and that’s, to channel my Harvard friends, a wicked hard problem, because the impulse to privacy may be built into humans, but how it’s articulated varies widely. In the United States, for example, if you don’t close your window, you don’t have an expectation of privacy. As I understand the United States’ approach, it’s pretty black letter: if you don’t take active steps to protect your privacy, you have no expectation of privacy. Whereas in Canada, we had a recent Supreme Court decision that said your expectation of privacy depends on a whole raft of factors, like where you are, who’s observing you, what you’re doing. It becomes very nuanced. Even if you’re in a public or semi-public space, that doesn’t mean you’ve necessarily sacrificed your expectation of privacy. Which goes back to the hard work of operationalizing privacy: understanding, wherever you’re operating, the expectations of the people whose information you’re processing.

Brian Walker: So given that response, I’m just curious before we sign off, are there human resource challenges related to digital privacy that we haven’t addressed?

Matt Silveira: Yeah. In fact, there is another IEEE standard, 7005, which deals specifically with human resources, and that could be its own podcast. But to summarize: the space between the employer and the employee, the marketplace, the recruiter, and various other business sub-functions, including HR, insurance, payroll, and so forth, all play a key role in that privacy relationship, as do some of the other obvious things like employee reviews. If businesses ignore that, or want to treat it like a fast-food meal where they just hand their employees a kit, I believe they’re going to find that’s a poor choice. It takes a lot of rigor to implement those types of programs and have them comply with the emerging law that’s coming out in this area.

Brian Walker: It sounds like there’s a lot of opportunity in the digital privacy space for students or young professionals. Your thoughts?

Matt Silveira: Oh, yeah. And I would contend that’s probably more of a green field, and it’s certainly something that I would refer somebody to another privacy professional that specializes in that space, because there are lots and lots of pitfalls and pratfalls there.

Brian Walker: John?

John Wunderlich: The Digital Governance Standards Institute of Canada has a standard I’ve actually posted in the Zoom chat that we’re using. It’s called Qualification and Proficiency of Access to Information, Privacy and Data Protection Professionals. So if you want to know what it is you want to know, that’s a good place to start.

Brian Walker: John and Matt, thank you again for taking time to speak with us today. I think this has been a very informative interview. I just wanted to ask in closing if you had any final thoughts you’d like to share with our listeners.

Matt Silveira: Sure, John, I’ll let you go first.

John Wunderlich: I just want to quote a fairly prominent privacy lawyer I know here in Canada. He says, “Too many privacy people operate above the glass floor.” Everybody’s familiar with the idea of a glass ceiling, but the glass floor is this: a privacy lawyer or a privacy policy wonk, such as I used to be in a corporation, will not pay attention to the details of how a policy or a directive from the corporate level is actually implemented. I’ve done my job. I’ve written the privacy policy or the direction, and now the IT people have to go and make it happen. Too many privacy people operate above that glass floor, and that, I think, is what operationalizing privacy is about: breaking through the glass floor and building connections between the coalface of data operations and the office that’s setting the rules.

Brian Walker: Matt, anything to add?

Matt Silveira: Yeah, in terms of final thoughts, I’d like to encourage our audience to get involved with the Digital Privacy Initiative. We have a number of active subgroups, as I mentioned earlier: our connected cars group, our healthcare group, our energy group, as well as our foundations group and the standards subgroup that John and I lead. I’d also encourage our listeners to get engaged and become students of privacy, because the field changes quickly. If you think you can read a couple of things and just check in and check out year by year, you’re mistaken. I’d suggest availing yourself of the good, free information that’s available from the Digital Privacy Initiative.

Brian Walker: Thank you for listening to our interview with Matt Silveira and John Wunderlich. To learn more about the IEEE Digital Privacy Initiative, please visit our web portal at digitalprivacy.ieee.org.