Wednesday, May 22, 2024

Privacy in the Digital World: Architecting Solutions


Architecting the privacy that the digital world demands is critically important – yet it’s a more daunting task than ever before. Among other challenges, universal compliance with the requirements of GDPR remains elusive at best. Companies have ever more powerful data analytics, but are they respecting the need for consumer confidentiality?

In this webinar, we discussed:

  • What are the biggest privacy gaps remaining for CIOs and their enterprise architects to solve?
  • Most companies architect overarching business capabilities for their customers that link data between systems. What needs to be done to ensure consistent data protection between systems?
  • Marketing software vendors suggest that companies create massive customer data repositories. Is it possible to architect these to protect privacy at the same time?
  • Foundational to privacy is data governance. CIOs have struggled historically with governance. What is your best piece of advice for the CIOs who may be listening?

To provide insight into the future of this key technology, I spoke with three leading experts:

Michelle Dennedy, former Chief Privacy Officer, Cisco

Myles Suer, Head of Global Marketing, Dell Boomi

Ann Cavoukian, Author, Privacy by Design. Former three-term Privacy Commissioner of Ontario

Moderator – James Maguire, Managing Editor, Datamation


Data Privacy, the GDPR and Facebook

Maguire: Alright, so let’s talk about data privacy. There’s the GDPR, General Data Protection Regulation, legislation enacted by the EU. The question is, what percentage of firms have really made the changes to live up to the GDPR, and do we need a United States-based GDPR? Michelle, what is your sense of that?

Dennedy: We actually studied this when I was still at Cisco, and the wonderful Harvey Jiang and Robert Weidman have continued those studies looking at people who feel like it’s not a toggle, it’s, “Where are you from an ad hoc compliance to GDPR, to PIPEDA, to LGPD, to CCPA, the rest of the alphabet soup, from ad hoc when there’s an emergency, all the way to optimized?” Which is level five. And what you find is, I think greater than 50% think they’re probably a level three. They know what to do, they might have a couple of policies. Has it cascaded throughout the tech stack? No way, Jose. But there’s at least a plan and, in many cases, an executive who’s accountable. So that’s question one: how many have done it? And doing it is also running along a moving train, we’re changing what it is. The Privacy Shield of course was just vitiated a couple of weeks ago, so compliance with the overseas transfer of information under GDPR and other schemas, the game has changed again.

“So there’s a young man named Max Schrems who dropped out of school to sue Facebook, and he’s been doing it very successfully for a while. And basically what he said was, “Because the US government has so much of an intrusive influence and capability to look into data streams from Facebook and some of the other giants, there cannot be a way for Facebook to legally provide adequate protection of European data in the US.” And just a few weeks ago, the Court of Justice agreed…”

Maguire: So certainly, there are gaps there in terms of data privacy. Myles, I’d like to draw on some of your expertise with CIOs. What can CIOs and enterprise architects do to close some of these gaps? And I sense maybe the gaps are getting bigger even as we speak, but is there a solution there? What are the biggest gaps they need to solve?

Suer: “Well, obviously, they need the support of the board and the CEO and maybe even the chief marketing officer. But they realize that there’s a problem here. It’s been interesting to me that some of the people who are the most on top of GDPR in the US have been universities, saying, “Yeah, this is exactly what we’ve been working on, this just codifies it.” And oh, by the way, they had to worry about it ’cause so many of their students were international. But with that said, I think there is a belief now for the first time that privacy needs to be architected into data. That we actually need to think about, “Who can see this data?” But on the other hand, I’ve seen my friends at Adobe talk about [a massive] data repository so that they can do a predictive model on customers. If it’s not designed with GDPR and Privacy by Design, I don’t think it has a chance. And I asked CIOs about this recently and they said, “You have to architect it from the beginning.” Just accepting somebody’s marketing, like Uber’s situation, is not gonna work.”

Cavoukian: “You have to be proactive and bake privacy-protective provisions into your design. It’s absolutely essential. You have to bake it into your code and it has to be a model of prevention. You can’t leave it for afterwards and just hope you’ve met the privacy laws, GDPR or whatever. That’s too little too late. When I created Privacy by Design, which was 20 years ago, it took some doing because most chief privacy officers and commissioners were lawyers. And so introducing a model of prevention was a completely new model and it took some doing. But then in 2010, Privacy by Design was unanimously passed as an international standard by the International Assembly of Privacy Commissioners and Data Protection Authorities.”

Do Consumers Really Care About Privacy?

Maguire: Isn’t the real issue that we consumers don’t even partially care about privacy? We will put everything on Facebook. We’ll put the details of what beer we drank on Facebook. My sister’s last name, everything, we put it out there. We want convenience, we want quickness. Privacy? Eh. We’ve almost thrown up our hands…

Cavoukian: “This is making me absolutely crazy. See, that’s the mindset that a number of people, obviously like yourself, have. It’s completely wrong. People care deeply about privacy. As I mentioned, public opinion polls, Pew Research, the last two years, concern for privacy is at an all-time high. 90% of people are very concerned about their privacy and 92% are concerned about loss of control over their information. To me, privacy is all about control. Personal control over your data. The Germans have a great term for it called “Informational Self-determination” that the individual has to be the one to determine the fate of his or her personal information because context is key and only the individual knows the context. Just because you’re on Facebook, you don’t want the whole world to have access to your information, that’s why they’ve been getting so many complaints and cases adjudicated against them. In the US, $5 billion they had to pay last year.

“So don’t fall into the misbelief that people don’t care. People care deeply. And that’s why companies like Microsoft, who I’m working with now, have developed decentralized identity. This is growing, where the decentralization of identity, and there’s now globally a Decentralized Identity Foundation, it’s growing that big, where your identity information will live in a secure enclave in the Cloud, for example, under your complete control, and only you can gain access to it with verifiable credentials. This is growing, so please get rid of the myth that nobody cares about privacy. Nothing could be further from the truth.”

Dennedy: “No, all I have to say about Mark and Sheryl, call me, sister, because I can help you. It’s not too late. But the reality is, if we take the debate and the desire to actually have data that is respected, data that describes, observes, and is proactively shared by human beings, if we reduce that to a Facebook, we do ourselves a grave disservice. Remember when in the early, like late ’90s, early aughts, there was a myth that no one would bank online. “No one, that’s insane, that’s so stupid.” And I remember in 1986 working for a small credit union, that’s when the ATM, the cash machine was introduced. And if you look at it from a security perspective, “What? You tap in four numbers and it spits out bills into a bar? Woah!” Every security person in the world is like, “What the hell is this?” So I think starting from a perspective that only one company gets to dictate what everything is, is wrong, so let’s take them and put them in their naughty corner right now. There’s a lot of naughtiness going on there. They’re not beyond repair.”

Maguire: While compliance is noble, in the absence of enforcement resources to pursue non-compliance, which is even more of a problem during the current pandemic, why should companies go beyond building only a veneer of privacy just sufficient to support good faith defenses?

Cavoukian: “Oh God, don’t be ridiculous. It’s so stupid. I do a lot of public speaking, I talk to consumer groups and people, they ask amazing questions. They dig below the surface, they’re not these stupid individuals who you think, “Just give them the veneer of privacy and that will shut them up.” Are you kidding me? They care more about privacy than… Not than me, I was gonna say than I do, but equally. The depth of the questions… 10 years ago when I would do public speaking, I’d have to explain to them what the issues are and what they should be concerned about. I never have to do that anymore. They have unbelievably deep understanding of these issues, and they’re walking away from companies who don’t respect their privacy and offer strong protection. So, whoever it was who just asked that question, with deep respect, get into the 2020 instead of 10 years ago. Things have changed. People care deeply about privacy and distrust is huge. There is such a trust deficit and maintaining that kind of attitude will just make it grow.”

Dennedy: “I think even as late as 2008 during the last financial crisis, you saw swathes of privacy professionals being laid off because they seemed to be nice-to-have compliance people. Now, people are already saying, “Before I download this thing, and it might be something that helps save my life or get my community economically flowing, I will not do it if my privacy is compromised.” And it’s not privacy as secrecy. It’s privacy as, “I share when I wanna share, with whom I wanna share, and you don’t have the right to take that information. If I’ve shared it with you with integrity, you don’t have the opportunity to just go and sell it to transform my political beliefs or try to guess my emotions.”

Business, the CIO and Privacy

Maguire: We have a question from Brandy Bennett. She asked, and I think it’s a good question, “Aren’t we asking too much of consumers if we shift all control to users? My mom has zero clue what we do with data or what it means.” Myles, what’s your take? Can consumers really handle all the power?

Suer: “Look, when Facebook allowed me to put posts out that were only going to my friends, I jumped on it, I didn’t want everybody. But when we think about data, and I’m coming back to Michelle’s point, think about the app, and I’ve been trying to buy one but you can’t get them until the crisis is over, the temperature thing. It’s privacy-protective: the first thing they do is aggregate the data and then present that for public good, and they’ve been able to predict where COVID hot spots are. And so look what can happen if we can trust the privacy to help the social good. Because I don’t have a problem if anonymized aggregate data is being used to predict that I’m in a hot spot and something needs to happen because it’s gonna help my neighbor. So, those are the kinds of things. But I don’t want to be identified as one of the people who potentially has COVID, I wanna deal with that myself.”
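
The aggregate-then-share pattern Suer describes can be sketched in a few lines of Python. This is an illustrative example, not code from any real temperature app: readings arrive with no user identifiers attached, and any region with too few contributors is suppressed before the per-region fever rates are shared, so no small group can be singled out.

```python
from collections import defaultdict

K_MIN = 10  # suppress any region with fewer contributors than this

def aggregate_hotspots(readings, fever_threshold=38.0, k_min=K_MIN):
    """Turn identifier-free (region, temperature_c) pairs into per-region fever rates."""
    counts = defaultdict(int)
    fevers = defaultdict(int)
    for region, temp in readings:
        counts[region] += 1
        if temp >= fever_threshold:
            fevers[region] += 1
    # Only publish regions with at least k_min contributors (k-anonymity-style suppression).
    return {r: fevers[r] / counts[r] for r in counts if counts[r] >= k_min}

readings = ([("zip-94107", 38.5)] * 8          # too few readings: suppressed
            + [("zip-10001", 38.6)] * 12
            + [("zip-10001", 36.8)] * 13)
print(aggregate_hotspots(readings))  # only zip-10001 appears, with rate 0.48
```

The individual is never identifiable in the output, yet the public-good signal, where the hot spots are, survives.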

Cavoukian: “But it can’t be the common good versus privacy. That’s a whole dated, zero-sum model of, “Oh, sorry, we can’t do privacy because we’ve gotta protect public health interest.” Or the terrorist incident, “We have to forget about privacy because there’s terrorism.” You have to address both. With the whole COVID thing, you can easily do both. Apple and Google partnered together to develop… It’s commonly called a contact tracing app. It’s not contact tracing. They called it exposure notification, so that you could be notified if you’re exposed to COVID. And Apple briefed me on this on two separate occasions because I always wanna look under the hood. I always say, “Trust but verify.”

Maguire: So companies, they’re architecting business capabilities for their customers that link data between systems. What can they do to ensure the data protection as they try to create these linked systems?

Cavoukian: “Well, first of all, with the links between the systems, the customers have to be aware that they’re doing this. You have to provide notice. If you wanna share my personal information with another company and I’m not aware of it, I don’t want you to do it. Now, you may have very good reasons for doing so and if you let me know, that’s no problem. But this can’t be done behind the scenes, because what happens is if you do that and you’re connecting information in an identifiable way and there’s a data breach, that’s it. You’re gonna fold. These days, when there are data breaches, there are not just lawsuits, there are class action lawsuits that arise.

“People have had it with having their information accessed in these ways that they have not consented to, they haven’t even been provided notice. So at the very least you can do that. Or you can de-identify the data in terms of stripping personal identifiers from the data. You can substitute something that enables you to connect, but it’s not identifiable, like you’re not going to know it’s Ann Cavoukian’s data or someone else. There are a number of things you can do, and there are so many protections you can build in that it’s on you, the company, to do, not put it on the consumer.”
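
Cavoukian's substitution idea, replace the identifier with something that still links records but isn't the person's name, is usually implemented as keyed pseudonymization. A minimal sketch in Python (the key name and token length are illustrative assumptions, not from the webinar): the same input always produces the same token, so systems can still join records, but without the secret key the token can't be reversed to "Ann Cavoukian" or anyone else.

```python
import hmac
import hashlib

# Hypothetical secret; in practice this would live in a vault, never in the data stores.
SECRET_KEY = b"rotate-me-and-keep-me-out-of-the-database"

def pseudonymize(identifier: str, key: bytes = SECRET_KEY) -> str:
    """Replace a direct identifier with a stable, keyed, non-reversible token."""
    digest = hmac.new(key, identifier.strip().lower().encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

record = {"name": "Ann Cavoukian", "purchase": "book"}
# The record that flows between systems carries the token, not the name.
safe_record = {"customer_token": pseudonymize(record["name"]),
               "purchase": record["purchase"]}
```

A plain unkeyed hash would be weaker here: names are guessable, so an attacker could hash a candidate name and check for a match. The HMAC key prevents that, and rotating it severs old linkages.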

Maguire: Well, let’s look to the people that are running the infrastructure, the CIOs out there. Obviously, data governance is critical to them, though they may or may not be falling down on the job in that regard. What piece of advice would you give to CIOs listening that want to do a better job of handling privacy? Myles, your advice on that?

Suer: “I think you start by determining what data you’re collecting and asking yourself the question, “Why do I need that piece of identifiable information? What pieces of information I’m collecting could identify a person?” If you add them all together, it gives you… There’s a set of identifiers for HIPAA, I think it’s 21 if I remember, identifiers. “What are those things that we think will identify?” And then ask yourself, “Why does this piece of information flow between this system and that system? Is it essential? Is it not essential?” So start asking that.

“And then when you build your marketing engines and things like that, start asking, “Okay, which of these pieces can be seen by which person? Do I hide some of the information from some people? Do I not allow them to see it? How do we do that?” ‘Cause if you look at GDPR, it says a German person can’t see a French customer’s information. So how do we… You have to start with architecture and enterprise architects. I know that there’s a lot of work going on in OpenBridge for example on this. Enterprise architects are key to thinking about, “Where does data flow? How does it flow? And then how do I build an architecture that’s gonna protect data? If I do that, then if they break in, they’re not gonna get very much.”
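
The field-level visibility Suer is describing, which pieces can be seen by which person, is often enforced as a per-role allowlist applied wherever data flows between systems. A minimal sketch in Python; the role names and fields are made up for illustration, not taken from any product Suer mentions:

```python
# Per-role field allowlists: each consumer of the data sees only what it needs.
VIEW_POLICIES = {
    "marketing": {"customer_token", "region", "last_purchase"},
    "support":   {"customer_token", "email", "last_purchase"},
    "analytics": {"region", "last_purchase"},  # no identifiers at all
}

def filter_for_role(record: dict, role: str) -> dict:
    """Strip every field the given role is not explicitly allowed to see."""
    allowed = VIEW_POLICIES.get(role, set())  # unknown roles see nothing (fail closed)
    return {k: v for k, v in record.items() if k in allowed}

record = {"customer_token": "a1b2", "email": "ann@example.com",
          "region": "DE", "last_purchase": "2024-05-01"}
print(filter_for_role(record, "analytics"))  # no token, no email
```

The fail-closed default matters: a role that isn't in the policy gets an empty record, which is exactly Suer's point that a break-in then yields very little.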

“So design for it. And guess what? The consequences of a break-in become much less because no one person… And that’s the other thing, when you design this, you can’t have this DBA with all of the passwords for everything. So you have to start to be smart about these things. And CIOs are starting to really think about, “How do you do this right?”

Dennedy: “Yeah, I think we have to look at… There’s CIOs and there’s CIOs. There are CIOs who are technical experts and they’re keeping the lights on and they’re doing plumbing and it’s really hard work and hurrah. There are CIOs that are truly chief information officers, and they do have security and privacy as equal a functional specification to, “Are the lights on? Is the email running? Do we have cameras working in the offices?” So sometimes it’s a dispositional thing. And I’ll bring it back to the question that Brandy asked earlier, “Isn’t it too much to have the user doing everything?” Answer, yes. We have to think about this as a systems engineering process just like you think, “Can I just eat carrots and have a healthy body?” No. “Can I do intermittent fasting and not eat food when I’m not intermittently fasting?” No, you will die. You will die. And so if you look at everything as, “I can use encryption, I can use anonymization, I can use dual factor authentication… ” If you look at each one of these as some sort of silver panacea bullet, you will fail. If instead you invite your users to the table, and this is where this ownership question gets tricky is…

“So my background before privacy in the last century was in intellectual property law. Thinking about the bounds of what is a creative notion, us together in this webinar, we’re sharing our personal stories, our faces, our voices, our biometrics, our experience, but we’re also individual people. There’s never a time when Ann and I are not married in this hour and Myles and James and I are not married in this hour. So you have to think about this as almost like, back to Mr. Ritter, quantum privacy, really thinking down to the data element in context over time.”

Maguire: Ann, I wanna give you a chance to answer that same question about CIOs. We won’t delve too deeply into that issue, but what advice would you give to CIOs, people really architecting the infrastructure, in terms of ensuring consumers have privacy? What’s your best advice to CIOs?

Cavoukian: “I agree with everything that Michelle said. I sit on a number of boards and I often work directly with CIOs, and the first thing I try to understand is exactly what they need to deliver. Once I get an understanding of that, then I talk to the CEO and say, “Your expectations of your CIO are unreasonable.”

“They put the CIO in a position where they have no choice but to fall short of protecting people’s privacy, because you’re asking them to make all these connections which they can’t do in a privacy-protective manner. So I think a big part of it is that the CIO has to have the ear of someone who understands privacy and can communicate that to the CEO in terms of the expectations made of the CIO. After that, I talk about what information they have in place and I ask them, “Do you have a data map? Do you know how to navigate the flow of personal information that comes into your company or organization? Because you may have the necessary consent at the first instance where data are collected from individuals and everything is consented to, it’s fine, but then after that, if the data flows throughout your company in a way that is not privacy-protective and there are many third parties and secondary uses of the data that were not consented to, then that’s a huge problem.”
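
The data map Cavoukian asks for can start as something very simple: a record of which systems hold which fields and for what purposes, checked against what the individual actually consented to at collection. A toy sketch in Python (the systems, fields, and purposes are invented for illustration):

```python
# A toy data map: which fields each system holds, and for which purposes it uses them.
DATA_MAP = {
    "crm":       {"fields": {"name", "email"}, "purposes": {"service"}},
    "marketing": {"fields": {"email"},         "purposes": {"newsletter"}},
    "analytics": {"fields": {"region"},        "purposes": {"service"}},
}

# What the individual actually agreed to when the data was first collected.
CONSENTED_PURPOSES = {"service"}

def unconsented_flows(data_map, consented):
    """List every (system, purpose) pair that falls outside the original consent."""
    return [(system, purpose)
            for system, info in data_map.items()
            for purpose in info["purposes"]
            if purpose not in consented]

print(unconsented_flows(DATA_MAP, CONSENTED_PURPOSES))  # flags the newsletter use
```

Even a crude audit like this surfaces exactly the problem Cavoukian names: secondary uses downstream that were never consented to at the point of collection.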

Suer: “This idea that Ann’s talking about of doing a data map, you need that anyway because it’s gonna help you in your big data projects, it’s going to help you in building better business processes. The only thing you have to add to that question is the questions Ann’s asking along the journey, which is, “Do I need this piece of information? Can I de-identify it? Okay, let it flow across but put these privacy restrictions consistently across it.” So we’re not asking people to completely change their processes or add new ones, we’re just asking them to think about privacy along the journey anyway.”

Dennedy: “Yeah, and every big data failure out there, and there are thousands of bodies along the IT road of big data, it has come down in a really fascinating way, typically it’s data quality problems. And part of data quality is, is this data associated with the right customer or employee? Is this data in context and does it describe X, Y or Z process? And that’s your data quality problem. If you’re solving your quality problems and you have folks already in your teams that know how to do data science and analytics and quality, you marry them together with your security and privacy team and you can catapult yourself out of the land of COVID into the land of what’s next.”

Maguire: Alright, so here’s our last question. It comes from Greg Cutsback. He says, “Linking data between systems, as stated by James, is the subject of this question. Sharing on behalf of others is making headlines with Instagram right now. Instagram’s photos are allegedly being used for facial recognition, a type of biometrics. Facebook denies these claims. I’m confused about where opt-in or other privacy engineering tools could be used to improve this, assuming Facebook and Instagram use data given to them regardless of user intent.” Lots going on there. More of a discussion piece. Ann, how would you respond to that?

Cavoukian: “Look, privacy as the default means that Facebook, Instagram are supposed to protect the information they have in their possession. You’re right, you can’t put this on the individual, but they haven’t given their information to Facebook and Instagram for the whole world to make copies and distribute. There are certain expectations in terms of how this information is supposed to be protected and not picked up and copied and used by third parties. I was at a conference two years ago, I think, and there was the head of Facebook’s privacy group and she saw me and she said, “Oh, are you coming to my session? I’m gonna give a shout out to Privacy By Design. We’re doing that at Facebook now.” And I was shocked but then I said, “Thank you very much but I’m speaking at another session.”

“So believe it or not, Facebook and Instagram have Privacy By Design, if you figure out how to find it because it’s not obvious. That’s the problem, people don’t know they have it. But if you find it, you can go there and you can lock up your information very strongly and have it only shared with five people, your colleagues, your friends that you choose to. So you’re right that there can’t be an expectation that the individual is gonna know how to protect this data on Facebook or Instagram, but they are the ones who are supposed to do that for you because that’s part of the agreement and that’s what we have to ensure is enforced.”

Maguire: While we were speaking, I did this little really interesting Facebook quiz where I had to answer all these questions, like I had to put in my pet’s name and my favorite color. I uploaded all these questions and the goal of the quiz was to tell me what my son’s sign is. They didn’t get my son’s sign right, but I did answer like 50 questions. For some reason, they needed my Social Security number. I don’t know why, but it was a really fun quiz.


Dennedy: “Yeah, these age progression or put-your-face-on-a-high-school-yearbook things, they are trying to get your face to train facial recognition databases, people. So if you don’t know that, know it here, know it now.”

Suer: “Well, if anything’s gonna make you feel scared, it’s what the Chinese are doing with facial recognition. Imagine an entity tapping into what Facebook has and using it for bad purposes.”

Cavoukian: “What the Chinese are doing with facial recognition and social credit scores and everything else is just utterly appalling. It breaks my heart to even think about it, because where they’re taking these amazing technologies… Forget it. So anyone who asks you anything from China, don’t give it to them.”
