IEEE Digital Privacy Podcast Series: Episode 1


A Conversation with Stuart Lipoff
IEEE Life Fellow
VP Industry and Standards Activities, IEEE Consumer Technology Society
President, IP Action Partners

Listen to Episode 1 (MP3, 22 MB)




Episode Transcript:

Brian Walker: Welcome to the IEEE Digital Privacy Podcast series, an IEEE Digital Studio Production. This podcast series features conversations with industry and academic leaders, as well as key stakeholders of digital privacy in order to help advance solutions that support the privacy needs of individuals. In this episode, Stuart Lipoff, an IEEE fellow and president of IP Action Partners, shares his insights on digital privacy, highlighting challenges and actions needed to advance the technology space. Stuart, thank you for taking time to contribute to the IEEE Digital Privacy Podcast series. To get started, can you please provide a brief overview on digital privacy?

Stuart Lipoff: Certainly. So, digital privacy is all about what we call PII. That's “Personally Identifiable Information” that someone collects from you. And there are really two ways in which that information can be collected. One of them is unwillingly, through spying: wiretaps, or listening in to different sessions that you may have on the Internet or using your cable products or something like that. The other, more common way people think about is where you willingly share this PII with a third party in exchange for some particular benefit. Typically, it's a social media company or email service provider where you're giving them your PII and they are giving you something in exchange. And what digital privacy is with respect to this: it's all about a means to protect access to the PII and prevent the release of PII to anyone you don't want to have that information. So, it's a matter of control.

Brian Walker: Can you explain the interaction between social media and end consumers as it relates to digital privacy?

Stuart Lipoff: Okay. So, with respect to the category of willing sharing: we willingly share our PII with social media in exchange for a variety of benefits, entertainment or services such as free email. This is, you know, Facebook and Twitter and Google and so forth. However, you have to remember that because the social media service providers are businesses, they expect to use our PII for financial gain. You can think of it as paying for these services by giving them your PII: they use it for advertising, and they sell the PII to third parties who then use it to foster their own business. So, our sharing of PII with social media providers has consequences that consumers may not fully understand or appreciate. They know that they're willingly giving it up, and they know that they're getting something in return. But they may not fully understand the possible negative consequences that can occur from things like ID theft, digital redlining, blackballing, or pigeonholing. ID theft can result in financial loss. The blackballing and pigeonholing might result in them being charged higher insurance rates or being denied the ability to join a club or even to get certain kinds of services. So, we do pay for social media with PII, but we really need to understand what this PII will be used for.

Brian Walker: So, Stuart, what are some of the primary concerns related to digital privacy?

Stuart Lipoff: Okay. So, for the first category, PII that's collected by spying, we need to make sure that the technology we use is clean and that the digital pathways in and out of our home are securely encrypted and protected. By clean, what I mean is our computers need to have anti-spyware protection installed, and we need to make sure we install programs from known parties that have been checked and validated. And today it's not just our computers, it's our cell phones, but it's also the emerging world of Internet appliances, such as baby monitors, home intercoms, digital doorbells, thermostats. All these pieces of technology are capable of collecting information and funneling it to third parties where we don't want it. And with respect to the communications in and out of our home, if we send messages and email and we do financial transactions, we should make sure that we're using encrypted technology. We should make sure that the locks are turned on in our browser, which protect the information, and so forth. However, the category I think that most people perhaps understand less is the second category, where we willingly share it. And there are really three topics that are issues of concern. I would lump them into trust, transparency, and control. Trust is: do we really trust that the organization we're sharing the information with will actually do what they promised to protect the PII, that they are responsible companies? Transparency is the ability to really understand what they promise they're going to do to protect it and to disclose it. Too often we're so anxious to get some really nice app on our cell phone or to use a free email service that we see three or four pages of legal fine print, and we quickly check the “I agree” box without really fully understanding what's involved. We really need to understand what that is. And the last aspect is probably something that's not considered too often, but it's control.
And it's not just the aspect of controlling who you are authorizing the PII to be shared with. In the event you decide you no longer want it to be shared, you really need the ability to revoke the disclosure. And if somebody posts or provides inaccurate PII to third parties, you need to have some means to have it remediated.
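[Editor's note: the browser "lock" Lipoff mentions stands for a handful of concrete TLS guarantees. The sketch below, using Python's standard ssl module, shows the same secure defaults a browser enforces before displaying the lock; it is illustrative only and not part of the interview.]

```python
# A minimal sketch of what the browser's "lock" means in practice.
# Python's ssl module, like a browser, applies these checks by default
# before it will treat a connection as secure.
import ssl

context = ssl.create_default_context()

# 1. The server must present a certificate, and that certificate must
#    chain up to a trusted root authority.
print(context.verify_mode == ssl.CERT_REQUIRED)  # True

# 2. The certificate must actually name the host we asked to reach,
#    so an impostor can't reuse someone else's valid certificate.
print(context.check_hostname)                    # True
```

If either check fails when a connection is opened with this context, the connection is refused, which is the programmatic counterpart of the browser warning you instead of showing the lock.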

Brian Walker: What steps can be taken to regulate the digital privacy space?

Stuart Lipoff: So, with respect to regulation of anything, there are really two basic tools that we tend to use. One tool is prevention, which tries to stop people from violating your privacy in the first place. The other is deterrence, the after-the-fact means we use to discourage the next guy from trying it. On the prevention side, the classic thing we think about is laws and regulation by governments. But that's generally not a really good choice in the case of digital privacy, because we have such rapidly changing and emerging technologies that whatever laws we come up with usually lag the technology and don't actually address the current issue at the time. And worse than just lagging the technology and not being effective, they generally inhibit innovation, because the laws and regulations say you can't do this, and what you might want to do is some new idea, a great idea that everybody wants, but the law is falling behind. So really, on the prevention side, historically the best means are self-regulation, where the industry as a whole, rather than any one company, gets together and develops codes of good practice. They widely promulgate that, and they achieve widespread buy-in by the key players. This is currently done, for example, in the advertising industry, where there are codes of good practice that are published, and the companies that adhere to them subscribe to them. And people depend on the fact that they have the advertising seal of approval, the Good Housekeeping seal of approval, whatever else it is, which takes it out of the realm of government and puts it back in private industry. The deterrent tool is more obvious: if there is a law, you need vigorous and highly public prosecution of the lawbreakers. It's kind of interesting, we do that today with the Secret Service and our currency.
There's very little loss associated with counterfeit currency, and we spend a lot more on the Secret Service than we actually lose, because we enforce it so vigorously. That's why there's very little counterfeiting. But when you don't have the laws in place, the more effective way is perhaps the public disclosure and outing of the bad actors and bad practices. So here the press has a real role in saying this company is doing something you really don't understand, it's not a good idea, and it's not in accordance with the industry code. By publicizing that, we can steer people away from those bad actors.

Brian Walker: So, Stuart, can you speak to some of the main challenges for ensuring digital privacy?

Stuart Lipoff: So, where we have rapidly developing and emerging new technologies and services, the real challenge is understanding what the potential threats are, not just from the services we have today, which is bad enough, but also from the new services which don't even exist yet. Once we understand the potential threats and what can go wrong, we can then start to develop or think about some of the countermeasures. To prevent you from being spied upon, it might be developing encrypted means of sending emails in ways that relate to PII. It might be mechanisms that require, every time your personal information is disclosed, that whoever's disclosing it notifies you and gives you the opportunity to look at what was disclosed and to either agree that it's accurate or not. But in such a young and developing field, it's very difficult to deal even with the current challenges, and much more so, because of the rapid change and development, with the challenges that we don't even know about yet.

Brian Walker: So how is IEEE working to make positive steps in improving digital privacy?

Stuart Lipoff: The IEEE does have, at the institute-wide level, which means it cuts across all the different technology sections - computer, communications, information technology, consumer electronics - a group called the Digital Privacy Initiative. It started fairly recently, and it's in the process now of studying what the issues are, and it will probably take its findings and distribute them back down into the individual groups. So, what it learns that is relevant to computing will go to the computer group, and what it learns relevant to consumer electronics will go to consumer electronics. But at the moment it's fostering a variety of discussions across all the different parts of IEEE to try to get a better handle on and understanding of what's going on. And I would say the Institute, in recognition of its basic mission, which is to foster technology, innovation, and excellence for the benefit of humanity, is really pursuing four of its major activities, which it has always done, but focusing them on digital privacy. The first one is research: working with the academic community and with the labs of large companies to gather publications, which it can put into the magazines and transactions, and to share that research among the group. It's involved in education, where it holds workshops, publishes tutorials, and tries to educate the industry, and in some cases, by releasing material to the press, the general public, about what some of the issues are. It has a large standards development activity; on that side, we develop standards that range from secure communications channels that protect against somebody tapping into your Wi-Fi, to good practices, to underlying technologies like blockchain. And then we also have an advocacy activity, the only one of these four where we operate outside of the IEEE.
We typically engage with government and non-government organizations, consumer protection groups, and other stakeholders, where we try to develop a shared understanding of the needs. We communicate to them what the limits and capabilities of the technology are, and they communicate to us what some of the policy and requirement needs are. We do this through white papers, and through inviting government and NGO stakeholders to speak at our conferences, and we speak at theirs.

Brian Walker: Stu, thanks again for taking time with us today. Do you have any final thoughts you'd like to share with our listenership?

Stuart Lipoff: So, with respect to the current situation, I think we understand digital privacy as it relates to social media, or to people putting viruses in our computers, and this sort of thing. One of the things that's rapidly coming down the road that we have not really had to deal with yet is a category called the Internet of Things, usually referred to as IoT. IoT is where you put computing devices, microprocessors, little computers, into appliances, and you also have them communicate with the Internet. So, the things that we're putting these microprocessors in are not what we normally think of as computers. They are refrigerators, washing machines, baby monitors, digital doorbells, smart thermostats, devices that control our energy use and monitoring. All these devices are collecting a great deal of information about us. Generally, it's being sent where we want it to go, to control our thermostat and this sort of thing. But to the extent that there's such a large number of new companies making these devices, very often in countries where regulations are not well promulgated, we've already run into situations where thermostats and baby monitors have had, on the part of the manufacturer, malware or bad software that's been installed deliberately for the purpose of spying on us. And this is going to be an issue going forward, because today, where we have only a small number of large companies making communications devices, computers, and cell phones, we can regulate them. But we're going to run into situations where there are maybe tens of thousands of companies making little devices that are sold to us on the Internet or in the big-box stores, and we don't really know much about these companies or what's in these devices. We have a whole new set of risks associated with being spied upon.

Brian Walker: Thank you for listening to our interview with Stuart Lipoff. To learn more about the IEEE Digital Privacy Initiative, please visit our web portal at