IEEE Digital Privacy Podcast: Episode 14
A Conversation with Nandita Rao Narla
Senior Fellow at Future of Privacy Forum; Head of Technical Privacy and Governance at DoorDash
Listen to Episode 14 (MP3, 20 MB)
Part of the IEEE Digital Privacy Podcast Series
Episode Transcript:
Brian Walker: Welcome to the IEEE Digital Privacy Podcast Series, an IEEE digital studio production. This podcast series features conversations with industry and academic leaders, as well as key stakeholders of digital privacy, in order to help advance solutions that support the privacy needs of individuals. In this episode, we speak with Nandita Rao Narla, a Senior Fellow at Future of Privacy Forum, where her research focuses on privacy engineering. Currently, she is the Head of Technical Privacy and Governance at DoorDash, where she leads the privacy, governance, assurance and operations teams. Nandita, thank you for taking time to contribute to the IEEE Digital Privacy Podcast series. To get started, can you share a little information on your background?
Nandita Rao Narla: Sure. Hi, I’m Nandita Rao Narla. My background is in computer science, and I have a Master’s degree in Information Security from Carnegie Mellon, which is where I first learned about privacy through CyLab research projects. I’ve been working in privacy and privacy-adjacent domains for the last 13 years. I started out in cybersecurity, building large-scale security, data governance, and privacy programs at Fortune 500 companies. Post-GDPR, I started to focus more on privacy engineering, and I was part of the founding team of a privacy tech startup. After the startup, about four years ago, I joined DoorDash to build their technical privacy and AI governance team.
Brian Walker: Can you also share a little information about your current role as it relates to digital privacy?
Nandita Rao Narla: At my current company, I lead a team that builds privacy-enabling tooling and features, and I also own the privacy assurance, privacy operations, and AI governance functions. Outside of my day job, I’m involved in a number of open-source and nonprofit privacy initiatives. I’m working with the Institute of Operational Privacy Design (IOPD) to build a privacy-by-design assurance standard; we released a version last week, and it’s open for public comments. I’m also a Senior Fellow at the Future of Privacy Forum, a privacy think tank, where I’m conducting research on privacy engineering in collaboration with researchers from UC Berkeley. I also participate in knowledge-sharing sessions with collaborative industry and academic initiatives, and that is how I was introduced to the IEEE Digital Privacy Initiative.
Brian Walker: Nandita, what is your view on some of the current initiatives that are underway, and which ones do you think are helping to improve the digital privacy space?
Nandita Rao Narla: There are so many good initiatives underway; I’ll talk about the three that are top of mind for me right now. One is NIST, which is doing very impactful work in the digital privacy space. They’ve published frameworks, several profiles, and standards for enabling privacy and privacy-enhancing technologies in organizations. I was part of their Privacy Working Group, which was developing content for the NIST Privacy Workforce Taxonomy: basically, resources to help privacy practitioners adopt the NIST Privacy Framework in their organizations. Another initiative I’ve found a lot of value in is privacypatterns.org, which publishes a set of open privacy-by-design patterns that software engineers can use to incorporate privacy early in the design phase. The third is OpenMined, which offers several courses on privacy and runs several open-source projects in the federated learning and differential privacy space. So there’s a lot happening in these three initiatives, in addition to the many educational materials and learning resources available through workshops, conferences, certification programs, and on LinkedIn.
Brian Walker: In your view, what are some of the key issues security professionals need to be aware of about privacy?
Nandita Rao Narla: Security and privacy are closely related topics, so there’s a lot of overlap. It’s important to remember that security can exist without privacy, but the reverse is not true: security is absolutely a must for privacy. In terms of differences, security objectives focus on safeguarding organizations against risk; when I say risks, I’m thinking about confidentiality, integrity, and availability. Privacy, however, focuses on safeguarding humans against the risks of improper data processing. So there’s a fundamental difference in how security professionals and privacy professionals think about who the core audience is and what should be protected. Security professionals also need to understand that privacy is still evolving, and there are a lot of gray areas. Three are top of mind for me. First, there’s a huge reliance on regulation to define requirements, and that differs by jurisdiction; even the definition of what counts as personal information varies widely, so a baseline understanding of key regulations and frameworks is helpful. Second, privacy expectations from customers and individuals vary across geographies and cultures, which adds complexity when we try to implement privacy in organizations. Third, it is important to understand how data is being used, not just who has access to it. For instance, if phone numbers are collected for identity verification, they can certainly be used for sending 2FA (two-factor authentication) codes, but the same phone numbers cannot be used for sending SMS alerts about new deals. So just having access controls in place is not going to be enough, and this is something security professionals need to know.
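The purpose-limitation point above can be illustrated with a minimal sketch: access alone is not the test; the declared purpose is. All names and structures here are hypothetical, not from any real system.

```python
# Minimal sketch of purpose limitation: a data field may only be used
# for purposes declared when it was collected. Illustrative names only.

ALLOWED_PURPOSES = {
    # field -> purposes declared at collection time
    "phone_number": {"identity_verification", "2fa"},
}

def use_data(field: str, purpose: str) -> bool:
    """Return True only if `purpose` was declared for this field."""
    return purpose in ALLOWED_PURPOSES.get(field, set())

# Sending a 2FA code is within the declared purpose; a marketing SMS
# is not, even though both require "access" to the same phone number.
print(use_data("phone_number", "2fa"))            # True
print(use_data("phone_number", "marketing_sms"))  # False
```

The point of the sketch is that an access-control check ("does this service have read permission on `phone_number`?") would pass in both cases; only a purpose check distinguishes them.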
Brian Walker: What role do you think standards play in helping to advance the digital privacy space?
Nandita Rao Narla: Yeah, digital privacy is rapidly evolving, so standards provide consistent guidelines and a level of benchmarking that organizations can align with. For context, privacy functions in most organizations are driven by regulatory requirements. The laws are technology-agnostic, and the requirements they define are at a very high level and hard to translate into technical terms. This is where standards can help by providing a clear framework to follow. Another benefit of using standards is that they help demonstrate accountability, especially conformance standards, where a third party provides an attestation that, yes, this company did a good job and aligned with the standard. One of the reasons privacy-enhancing technologies are yet to become widespread is that standardization efforts in this space are still lagging; for differential privacy and synthetic data in particular, the lack of standards is hampering adoption. I do want to point out that privacy standards are not widely used in practice. I recently read a paper at the IEEE European Security and Privacy Workshops on compliance as a baseline, which found that half of the privacy engineers interviewed did not use standards in their day-to-day work. So while we know that standards are helpful, they are not yet widely used by privacy professionals.
Brian Walker: Right. Do you see that situation changing, though?
Nandita Rao Narla: I think it will evolve, but I don’t see it changing anytime soon, because a lot of companies treat regulatory or compliance requirements as the ceiling. That is the maximum they are willing to do, versus going above and beyond to really protect data and the privacy of individuals. It’s trending in the right direction, but I don’t see it changing immediately; it will take some time for standards to become widely adopted.
Brian Walker: You recently gave a talk on privacy threat modeling. Can you provide our listeners a general overview on privacy threat modeling?
Nandita Rao Narla: Yeah, I’ll give a very short overview of privacy threat modeling. There are a lot of resources available, and very formalized privacy threat modeling frameworks that are easy to use, but here is a macro view. A privacy threat is anything that can cause privacy harm: something like surveillance, or chilling effects, where you’re not able to express your thoughts because you feel like you’re being watched. Dan Solove’s taxonomy is a great resource for learning about these privacy harms. Privacy threat modeling is assessing documentation, such as architecture diagrams, data flows, or system designs, to find privacy harms and address them proactively. You can do this by following Adam Shostack’s four-question framework, asking: What are we working on? What can go wrong? What are we going to do about it? And did we do a good enough job? The last question is mostly a feedback loop and validation exercise, so we can improve the process going forward. By asking these four questions, you can identify issues early in the design phase of software development, before any code is written, and these threats can be prioritized and fixed early in the process.
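The four questions above can double as the structure of a lightweight threat-modeling record. This is a hypothetical sketch, not a real framework or tool; all names are illustrative.

```python
# Hypothetical record of a privacy threat-modeling session, organized
# around the four questions. Names and example entries are illustrative.
from dataclasses import dataclass, field

@dataclass
class ThreatModel:
    working_on: str                                    # 1. What are we working on?
    threats: list = field(default_factory=list)        # 2. What can go wrong?
    mitigations: list = field(default_factory=list)    # 3. What are we going to do about it?
    review_notes: list = field(default_factory=list)   # 4. Did we do a good enough job?

tm = ThreatModel(working_on="Location-sharing feature design")
tm.threats.append("Surveillance: precise location retained indefinitely")
tm.mitigations.append("Coarsen location data and delete it after 30 days")
tm.review_notes.append("Post-launch review: verify the deletion job actually runs")

# A simple completeness check: every identified threat has a mitigation.
assert len(tm.mitigations) >= len(tm.threats)
```

The value is less in the code than in the discipline it encodes: threats are written down against a concrete design, paired with mitigations, and revisited in a review loop, all before implementation.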
Brian Walker: Nandita, what advice would you have for students or young professionals who might be interested in pursuing digital privacy as a career path?
Nandita Rao Narla: For students interested in privacy careers, I would encourage them to take privacy-related courses or participate in privacy research projects; that is how I got started in privacy. For young professionals, I would encourage attending workshops and pursuing certifications in privacy. Both IEEE and the IAPP have very good privacy engineering certification resources available, so that could be a good learning path. A good way to gain practical experience is to see if your organization supports rotations with the privacy team; even a 20% rotation is great for getting hands-on experience. Privacy teams are always short-staffed, so they would welcome any help they can get. If someone is in a privacy-adjacent field, say trust and safety, or security risk and governance, they may be able to transfer their existing skill set to a privacy career pretty easily. For instance, somebody with prior experience performing code reviews or architecture reviews can adopt privacy-by-design methodologies and translate those skills very easily. Somebody who has worked in data governance can use those skills in privacy data-mapping and data-minimization projects. A lot of these skills are transferable and can support a good career change into privacy.
Brian Walker: Nandita, thank you again for taking time to speak with us today. In closing. Do you have any final thoughts you’d like to share with our listeners?
Nandita Rao Narla: Privacy issues are deeply influenced by ethical, social, and cultural factors, so we need people from diverse backgrounds in privacy; the community needs diverse skill sets. I’d encourage more people to join the digital privacy community, share their knowledge, and enrich it.
Brian Walker: Thank you for listening to our interview with Nandita Narla. To learn more about the IEEE Digital Privacy Initiative, please visit our web portal at digitalprivacy.ieee.org.