Brendon Lynch is the current Chief Privacy Officer at Airbnb. He is a privacy, data protection and data ethics executive with over 20 years of experience in the consulting and technology industries, including nine years as Microsoft's Chief Privacy Officer.
Doug Heintzman has been in the IT industry for 31 years. He is currently the Vice President of Strategy and Evangelism at Soveren. He was formerly a management consultant, a COO, a founder, and a strategy executive at IBM, where he worked for 25 years.
Current Chief Privacy Officer at Airbnb. A privacy, data protection and data ethics executive with over 20 years of experience in the consulting and technology industries, including nine years as Microsoft's Chief Privacy Officer. A leader of high-performing teams who helps organizations build and maintain trust relating to the collection, use and protection of personal information. Brendon describes himself as a pragmatist who enjoys the challenge of enabling responsible data-driven innovation while respecting the privacy of individuals and building trust in data ecosystems and artificial intelligence. In 2017, Brendon was honored to receive the highest accolade in the world of data protection, the IAPP Vanguard Award.
What's the deal with privacy? Welcome to the Privacy Impact Podcast. I'm your host, Doug Heintzman. We've heard a lot in the press lately about privacy and data protection. There have been massive data breaches which have exposed customer financial information. There have been stories about data being harvested and sold and used to manipulate public opinion to influence elections. We've seen technology companies promote privacy as a key feature of their products and their brand. And we've seen governments around the world, worried that some companies have too much power, draft legislation to balance that power with their citizens' right to privacy.
Privacy is one of those issues most business leaders know that they need to be worrying about, but struggle to frame the issue in a way that they can reasonably do something about it. Today, we're very fortunate to have Brendon Lynch join us to help us think through the privacy landscape.
Brendon is currently the chief privacy officer at Airbnb. He was formerly the chief privacy officer of Microsoft. Welcome, Brendon.
Glad to be here with you.
Well, you've been around the privacy game for a very long time. Has it changed over the last 20 years or so?
Yeah, I have been involved in that time frame. I got involved in the late 1990s, first in the consulting world, and then, not too long after that, got into the technology industry. And, you know, I would say it's changed. It's changed a lot. And some of the fundamentals haven't changed at all.
I mean, the core concerns and needs around privacy are understanding what personal data is being collected, how it's being used, what rights users have. And there were various principles and laws around at that time that laid that out. But what has changed rapidly, I guess, over that time frame is, first of all, technology. So the amount of data that's being collected in a digital form has radically changed. The usage of that data has radically changed.
So it's applying these core principles to these new scenarios. But it's grown as a profession. It's growing in terms of the number of countries that have laws in place, and some countries have refreshed their laws, and the way organizations are dealing with all of that has really changed radically too. I'm probably exaggerating the point here, but 20 years ago it was mostly a sort of back-office compliance function, and now it's something much more prominent, with strategic conversations at the board of directors level and organizations really seeking to even use privacy as a differentiator.
Well, on that last point, and I think this is kind of interesting, because certainly when I talk to companies about privacy, there really are these two issues that come to mind. One is the raw compliance issue: there's the government or some legislation, and we need to figure out what we need to do so we don't get in trouble, so we don't get fined. But the other is the point that you just touched on there.
There seems to be something different going on, something having to do with the relationship that companies have with their customers. As we as consumers get more and more aware of, you know, both the value of our personal information to companies and the ways in which those companies use that data, there's something quite fundamental in the relationship between a consumer and a company that comes into play. Do you see that aspect taking on greater importance?
Yeah, absolutely. I think, as you point out, people are aware of what data is being collected and used to an increasing degree. People are also expressing a lot of concern around that. Survey after survey in country after country shows that people are very concerned about the amount of data that's being collected and used. They feel like they've lost control. And so while it is a regulatory compliance domain — and that's a big, big piece: a key driver — I don't think that's all that privacy work is about in an organization.
It can also be very much a trust builder and be thought of as a key aspect of the relationship with the customer. The extent to which customers, both consumers and customers in the B2B sense, trust the entity they are dealing with, increasingly in terms of how personal data is collected and used, is a key ingredient of that trust equation. And so it's become much more of a business issue and need, an opportunity as much as it is a compliance domain.
I think that a lot of us were quite amazed when Apple released their recent advertising campaign that doesn't really feature any of their products, but is all about privacy. And, you know, they're kind of putting privacy forward as a core attribute of both their products and their brand. And that seems to me very much about this phenomenon you're talking about, that there is some kind of relationship that you have with customers. You know, respecting the fact that you are both collecting and then using their personal data, that is a trust. Right, those customers are entrusting you, and that's a responsibility.
That is absolutely right. Yes, the responsibility. As you pointed out, it's respecting the individual, and it very much can play into a brand and in itself can be a product, or a product feature, if you like, of whatever the offering is that you have, with privacy built in.
So in these last 20 years of being in the space, have you seen, in these sort of tangible ways, customer expectations changing with respect to what they expect from a company? And the second question is, does that vary by geography?
Mm hmm. Yeah, I think it definitely does. Individuals are expressing, as I mentioned through these surveys, their concern, their feeling of a lack of control, their desire for more control over the data, and for transparency as well. And what I think we've really seen over time is that at least a segment of the population is more sensitive to privacy and what's being collected and how it's being used, taking advantage of, for example, their rights to be able to have access to the data.
We've seen that much more prominently in laws, but also in the way that users or individuals are interacting with organizations to actually take advantage of those capabilities. And so gradually there's this move, which means that ultimately there are more people aware of, concerned about, and taking advantage of it. So the more that an organization can empower its user base to have more access to and control over their personal data, the stronger that trust relationship will be.
So do you think that awareness, or that concern about privacy, is something that is plateauing? How do you see it evolving over the next five to 10 years? Is it going to become more of an issue? Or will it just be kind of, you know, steady state, or is it a case that just more people will be aware of it?
Yeah, I think more people will be aware. And, you know, there's been some interesting research over the years, even dating way back to the 60s, assessing people's views on privacy. And there really is this segmentation. There are some people who are always going to be very concerned. There are some people who are not that concerned at all and would give away their personal data for a very small reward. And then there's this large block in the middle that are very pragmatic, in that they're making these trust decisions, whether conscious or unconscious, about interacting and trading their personal data, if you like, for value in return. So I think generally that will persist. But what's changing over time is this growing realization that so much of an individual's daily life has almost a digital exhaust of data associated with it, the more that we use technology and carry these mobile devices.
And so within that, I think the concerns are increasing. The very concerned are remaining very concerned. Those pragmatic people are looking for, sort of, how they can take back some of that control. And so the more that organizations can provide that, the better, and it's in their best interests to do so.
You just touched on an issue that I'm absolutely, completely fascinated with, and that is the fact that our digital footprint has been growing. And a lot of what has been fueling that over the last number of years is just the fact that we are so connected and we search on so many websites and we spend time reading things and we like or dislike various different ideas on social media. And all that becomes part of our digital footprint. But we now seem to be moving into a world where, you know, we're all wearing smartwatches and we have these mobile devices in our pockets and we have various different health devices, blood pressure monitors or blood oxygen level monitors, from which we can collect and consolidate data. And companies are increasingly using things like fingerprint or iris scanning or various different kinds of biometrics for, you know, access control and security and identity authorization and validation.
And that's kind of one issue. And as we have more and more of these IoT devices in our lives, we're generating all this data, and then all the people looking to build business models around, you know, big data and artificial intelligence are going to be consuming vast amounts of that data and building completely new business models around it. So, you know, it really seems to me that this landscape is going to become increasingly complex and nuanced. I mean, how do you see this as a data privacy, data protection professional? How do you see your career moving forward? There'll be lots of opportunities, I suspect.
Yeah, I do, definitely. You know, privacy is a growth industry. When I think back to the early days, just for example, the IAPP, which is the International Association of Privacy Professionals — when I first got involved, it was probably about a hundred people. I think it's now up to like seventy-five thousand or so globally and growing really rapidly. So there's definitely an industry growing around this. But it is, as you point out, all of those key trend lines, which essentially amount to much more data being generated and collected, used in innovative new ways. And actually, if you look at almost every single industry, one key trend is the extent to which data is driving innovation. So it's a part of delivering the service. It's a part of ultimately driving what's next for them, how they improve their service, what new products or services they launch. Even agriculture — you go to farms and see the number of devices; there are tags on the animals now so that you can track movements and identify when a particular animal might be ill because of a change in its pattern of movement. Every single industry. So that is a key driver. It's an opportunity, of course, as well, because individuals are benefiting from all these great services, and these data-driven innovations are enriching lives.
But it does have this other side to it, which is the data privacy implications. So it is a growing field. All the things you mentioned, and smart home devices, are really on the rise as well. So we're going to see this continue to be a big issue.
Yeah. So you actually just touched upon a really fascinating idea: there is this tension, because on the one hand, a company's or even a government's ability to collect and process large amounts of data has a number of very significant benefits. Right. It allows interesting services and goods to be brought to the market. It allows for a lot of innovation. It allows for public health benefits. Right. You want to track what's going on with a pandemic, or where you need doctors, or if there's a cholera outbreak, or whatever.
So there are a lot of conflicting interests in play, because on the one hand, we may not want people to have and use and sell our information. But on the other hand, by companies or governments having that data, they can deliver better services. They can, you know, better perform service and warranty and all those kinds of things, as well as be in service of some sort of public good.
So, you know, I see the space as really a tension between conflicting goals. Do you see the pendulum swinging from one to the other, as we see in different countries? There are obviously different countries that prioritize one over the other. Or do you see technology coming into play that changes the dynamics of that tension and finds ways of allowing those conflicting interests to coexist?
Yeah, I am definitely in the camp of new innovation resolving the conflict to a large degree. It should not be a zero-sum game, that it's individual privacy or societal benefit as it relates to the collection and use of personal data; we should be able to get both. And that's where I think privacy technologies and architectures and advances come into play. For example, a lot of that societal benefit is derived from data in the aggregate. It's not just my personal data, linkable to me, Brendon.
It's my data in combination with tens of thousands, hundreds of thousands, or millions of other data points, in an abstracted way that's not linkable back to me. That can still provide that societal benefit and surface those big trends. And that, I think, is really the way forward: how can we find the right ways to contribute data that relates to individuals into a pool, in an anonymized way that can't be linked back to me, so that I can contribute to the societal good but trust that it's not going to be misused in any way to discriminate against me.
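To make the aggregation idea concrete, here is a minimal sketch, not anything from Brendon's actual work: reporting only group-level counts and suppressing any group smaller than a threshold `K_MIN` (a k-anonymity-style rule), so no result points back to one person. The threshold value and field names are illustrative assumptions.

```python
# Sketch of threshold aggregation: individual records are reduced to
# group-level counts, and groups below a minimum size are suppressed so
# that small, potentially identifying groups are never released.
from collections import Counter

K_MIN = 5  # illustrative minimum group size before a group may be reported

def aggregate(records, key):
    """Count records per group, dropping groups with fewer than K_MIN members."""
    counts = Counter(r[key] for r in records)
    return {group: n for group, n in counts.items() if n >= K_MIN}

# Example: city-level counts; the two-person "Bergen" group is suppressed.
records = [{"city": "Oslo"}] * 7 + [{"city": "Bergen"}] * 2
print(aggregate(records, "city"))  # -> {'Oslo': 7}
```

Real anonymization needs more than a size threshold (quasi-identifier combinations can still re-identify people), but the sketch captures the trade he describes: the aggregate trend survives while the individual link is cut.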
So you once again just brought up another interesting point that we probably won't have enough time to delve deeply into in this discussion: the idea that private information is related to a person, and that person's privacy can be to some extent protected by virtue of anonymization or aggregation and various different techniques. And the question is, is the information personally identifiable? Can it be reconstructed? And I always found it fascinating that there is this inherent contradiction with what you'd assume up front. Normally you'd think that having information distributed amongst many different places is actually less secure than having it in one place.
But if it's in one place, then it's easier to assemble the information. Now, if it is in a bunch of different places, is there an issue around how distributed you make it? And do some of these techniques and analytics and artificial intelligence give us, or potentially malicious actors, tools to actually reconstruct personal information that we wouldn't have thought was reconstructable?
Yeah, I think it is definitely an interesting topic area. In some ways, concentration of data in one place or a small number of places may have some benefits from a security standpoint, because it could be that those entities have strong security measures in place.
And if you're thinking of a data breach as a threat, whether it's through mishandling of data or some malicious access to data, the more you're protected against that because of strong security measures, the better off you are. On the other hand, if the data is distributed, that's actually a good thing from a data breach standpoint in some ways, because an attacker is only breaching one portion of your data and won't have access to the full amount.
But I think the key to all of this, the key to whether data can be used for societal benefit in these aggregated ways, is to ensure there are provable alterations made to the data that make it impossible to link it back, such that even if multiple entities that had portions of the data relating to you somehow colluded behind the scenes, they wouldn't be able to reconstruct it. So there's going to be, I think, a whole wave of privacy innovation across all sorts of different scenarios to solve for different aspects of these challenges.
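One widely studied example of a "provable alteration" of the kind Brendon gestures at is differential privacy: adding calibrated random noise to a released statistic so that no single person's presence can be inferred, with a mathematical guarantee rather than a hope. A minimal sketch of its core Laplace mechanism (the epsilon value is an illustrative assumption, not a recommendation, and Brendon does not name this technique himself):

```python
# Sketch of the Laplace mechanism from differential privacy: release a
# count plus Laplace(sensitivity/epsilon) noise. Smaller epsilon means
# more noise and stronger privacy; sensitivity is how much one person
# can change the true count (1, for a simple count).
import math
import random

def noisy_count(true_count: int, epsilon: float = 1.0, sensitivity: int = 1) -> float:
    """Return the true count plus Laplace-distributed noise."""
    scale = sensitivity / epsilon
    u = random.random() - 0.5  # uniform in [-0.5, 0.5)
    # Inverse-transform sampling of the Laplace distribution.
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise
```

The point of the guarantee is exactly the collusion scenario above: even parties who pool their background knowledge cannot confidently single out one individual's contribution from the noisy release.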
And one of those scenarios, which I think is a natural evolution of what you were talking about, is the fact that we're likely moving increasingly towards an ecosystem-based economy where smaller actors are working in concert to produce composite goods and services for the marketplace. So while I might have a data usage consent with one member of some consortium or ecosystem of partners, that member actually shares portions of my data with other pieces of that ecosystem, in whatever way is necessary for that ecosystem to deliver the good.
So, you know, the idea that I signed a contract with a company, that they're going to use my data in a certain way, will get more convoluted, because there are going to be more parties involved in interacting with me and/or using that data for some productive use. And I think in that world, this whole thing will get even more complicated and in greater need of some of the technology that you're referring to.
Yeah, I think so. And some of the modern privacy laws really do understand and address that scenario. For example, if we look at Europe and the General Data Protection Regulation, they designed in the concepts of the data controller and the data processor. The controller is the one with which I have my primary relationship, to which I've given consent; they have a privacy policy. But they've really built into that law a recognition that there are going to be multiple data processors that may have access to, or even have copies of, that data to fulfill their part of whatever the offering is.
And they put the accountability very much with the data controller to pass on the data protection requirements to the data processor. That's going to get very, very complex in these much more fragmented business models. But ultimately, I think that concept is going to remain: there's going to be one place where I have my relationship, and there's going to need to be accountability with that organization to make sure that all the other actors in the data ecosystem are aware of and compliant with the data protection requirements.
And I suppose that built into that complexity will be the fact that some of these actors may exist in different geographies, and those geographies may have conflicting laws that run afoul of the privacy regulations in a particular geography. And thus data transfer between companies in different geographies, or even inside a company between physical servers that exist in different geographies, is already complex and will probably become even more so.
Yeah, you're right. I think international cross-border data flows are an additional complicating layer of the onion, if you like. And there are certainly situations where there could be conflicts of law. There are various mechanisms in various countries that organizations need to be aware of and put in place, often contractual: passing on the contractual requirements is a means to mitigate some of the risks that come with cross-border data transfers.
So you've touched on an awful lot of the major themes in the privacy space. If I'm the leader of a medium-sized business, and I'm inferring from what you're saying that this isn't something that you just do, but that this is more of a journey. Right? This is something that you will be doing, and it will change and evolve. Where do I start that journey?
I mean, what are the very first steps that I need to take? What do I need to worry about today?
Yeah, that's such a big question. I think start with understanding what the needs and what the trend lines are here. It's definitely, in many countries, most countries now, a legal requirement to have a privacy program and to have privacy policies and procedures and everything that goes with that. But I would urge people to look at it not just as a regulatory compliance domain to be thought of as a cost, and to think of it actually more strategically.
How does it relate to our business? What does our customer base feel about this? And what could we do to help build trust through the collection and use of personal data in our business? And therefore, how should we approach this? There are also some key trend lines, even if you are looking at it through just the legal requirements, a few key trends that I think are worth recognizing.
The first is there's a greater push for transparency. So that's the starting point of building trust: being open and honest about this is the data we collect, this is how we use it, here are the rights you have, that kind of thing. There's a big push for empowering users. I mean, you mentioned in your lead-in that there's a balance-of-power issue with personal data, and that's a dynamic that's playing out. And arguably, a lot more power has shifted to the organizations that collect a lot more data.
But there's a trend line that's pulling some of that power back to the individual. So think about how you can empower your customers to have more control over their data, more access to it, to be able to actually see what you have relating to them and to request deletion, for example.
And then there is a trend around responsibility, really being accountable for all the data collection, putting the right procedures and controls in place in your organization. Often that is going to be boosted by using privacy vendors. Not all of this has to be done yourself. You can actually use various technologies to help you with, for example, these data subject rights, which is the term in the European law for the ability for someone to have access to, delete, or take a copy of their data.
There are now services out there, of course, to help with those sorts of things. So it's not something you need to do all yourself; you can turn to third parties to help you. So I think those would be the three key trend lines: transparency, empowerment, and responsibility. Design an approach that solves for those three things.
So that sounds quite reasonable. Of those three, would I be right to assume that the one that may come knocking on your door, where you wake up one day and instead of planning something you're responding to something, is these DSRs, these data subject requests? Someone says, hey, what information do you have about me? How are you using it? Who are you selling it to? And I want it deleted or changed. Is that likely the thing that's going to come knocking on your door?
Yeah, I think that's right. I mean, you need to proactively design a privacy program and an approach that deals with all of those things. But in terms of the things you might need to react to, there are probably two of them. The most prominent one is the one you've just talked about: users in many, many countries now have the ability to request a copy of their data, to have data deleted, these various data privacy rights.
And if you're not responsive to those, if you don't have the mechanisms in place to be able to respond, some of them have short timelines. The latest law in Brazil, for example: 15 days is all you have to return a copy of data to individuals. It's thirty days in Europe. If you're not set up to be able to respond to those things, it could get you in hot water very quickly.
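The deadline arithmetic he mentions is simple to operationalize. A minimal sketch, using only the day counts quoted above (15 days in Brazil, 30 in Europe); treat the table as illustrative, since real statutes have extension and clock-suspension rules that a production system would need to model:

```python
# Sketch of tracking data subject request (DSR) response deadlines per
# jurisdiction. Day counts mirror the figures quoted in the conversation
# and are simplified: real laws allow extensions in some circumstances.
from datetime import date, timedelta

RESPONSE_DAYS = {"BR": 15, "EU": 30}  # jurisdiction -> days to respond

def dsr_deadline(received: date, jurisdiction: str) -> date:
    """Return the date by which a DSR must be answered."""
    return received + timedelta(days=RESPONSE_DAYS[jurisdiction])

def is_overdue(received: date, jurisdiction: str, today: date) -> bool:
    """True if the response window for this request has already closed."""
    return today > dsr_deadline(received, jurisdiction)

# Example: a request received January 1st under the Brazilian timeline.
print(dsr_deadline(date(2024, 1, 1), "BR"))  # -> 2024-01-16
```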
I mentioned two things; the other one, of course, is you learn of a data breach. There are now requirements to notify individuals, and in many cases now regulators, of a data breach in a very short time frame. That's the other thing that could be incoming for you. You might hear about it through one of your vendors, that they had a breach. But it's your users; you have accountabilities. So that's another area as well.
And would I be right in suspecting that when an average consumer receives notification of a data breach, it's much more likely they're going to turn around and do a data subject request and say, OK, what information do you have? Yeah, I think that's a very valid point. You know, a lot of this actually starts with you understanding all the data you have. You need that to be able to be fully responsive to these data subject requests.
You need to have a full map of what data is in your organization and make sure that you are able to retrieve it in a timely fashion and pass it on. But you're right. Yes, if someone receives a notice that their data was breached, that is definitely a moment when a number of them would potentially want to know: what was in my data, what are the implications, what can I do to protect myself? So it does become a customer response challenge as well.
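The data map Brendon describes can start as something very lightweight. A minimal sketch of such an inventory, where the system names, fields, and owners are all invented for illustration (not any real Airbnb or Soveren schema); fulfilling a DSR then becomes a matter of walking the map:

```python
# Sketch of a personal-data inventory ("map"): for each system, record
# which categories of personal data it holds and who owns it, so a DSR
# can be answered by querying every system that holds the relevant data.
INVENTORY = {
    "crm_db":     {"fields": ["name", "email", "phone"], "owner": "sales"},
    "billing_db": {"fields": ["email", "card_last4"],    "owner": "finance"},
    "analytics":  {"fields": ["device_id"],              "owner": "product"},
}

def systems_holding(field: str) -> list[str]:
    """List every system that stores a given category of personal data."""
    return [name for name, meta in INVENTORY.items() if field in meta["fields"]]

# Example: where would we need to look to answer a DSR about email data?
print(systems_holding("email"))  # -> ['crm_db', 'billing_db']
```

In practice this table would be kept in a catalog or discovery tool rather than code, but the structure, data category mapped to system and owner, is the core of being able to retrieve and delete on request.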
Well, that topic of doing the inventory and building a map of where all the data is, so that you understand what the exposure is and are prepared to respond to these data subject requests, is, I suspect, a much deeper topic that merits a lot more time to discuss. And we will get to that in a future episode of the Privacy Impact Podcast. Brendon, I thank you so much for sharing your wisdom and insight with us and helping us all be a little bit smarter about how to think about privacy.
Now, you're welcome. Glad to join you and have this conversation. Well, that's all for this episode. Be sure to look out for future episodes, in which we will continue to explore the many aspects of the privacy landscape and the pragmatic steps you can take to implement privacy policy and compliance and build high-quality, trust-based relationships with your customers. The Privacy Impact Podcast is a production of Soveren. Soveren is a data protection compliance company delivering GDPR data subject request automation solutions for medium-sized businesses.
You can go to Soveren.io to learn more about Soveren and download a free inventory map template to help you get an initial understanding of your privacy health and potential weaknesses. Thank you for joining us. We look forward to seeing you again next time on the Privacy Impact Podcast.