Are you at risk for enforcement action related to data privacy?
In this episode, Lynn Molfetta and Matthew Bernstein discuss an enforcement action against the credit reference agency, Experian, for misuse of personal data. The UK regulator, the Information Commissioner’s Office (ICO), has ordered Experian to make fundamental changes to the way it handles people’s personal data within its direct marketing services.
LYNN MOLFETTA:
Hi, I’m Lynn Molfetta.
MATTHEW BERNSTEIN:
I’m Matthew Bernstein.
LYNN MOLFETTA:
We’re seasoned information governance practitioners working in the world’s toughest regulatory environments.
MATTHEW BERNSTEIN:
Welcome to Operationalizing Information Governance, a podcast where we analyze compliance issues related to the management of information.
LYNN MOLFETTA:
In each episode, we unpack a high-profile event and share practical advice to help organizations avoid similar consequences.
Why was Experian targeted for misuse of personal data?
LYNN MOLFETTA:
In this episode, we’re discussing an enforcement action against the credit reference agency, Experian. The UK regulator, the Information Commissioner’s Office—otherwise known as the ICO—is requiring Experian to make fundamental changes to the way it handles people’s personal data within its direct marketing services. This action followed a two-year investigation by the ICO into how Experian, Equifax, and TransUnion used personal data within their data marketing business, or data broking, as it’s referred to in the UK.
So, Matthew, can you clarify for us? What exactly is data broking? And why was this something the ICO investigated?
MATTHEW BERNSTEIN:
Data broking is the actual business; we might call it data marketing. The data broking itself was not, in the context of privacy, the misuse of information. There’s nothing illegal about selling data.
As a credit reference agency (or what we would call a credit rating agency), Experian collected information from people, some of it very personal information about financial accounts. Perfectly legit. They were then selling that information to third parties who would use it in offline marketing activities, things like direct mail solicitation. And they didn’t tell the consumers from whom they collected the information—information the consumers were happy to provide in order to establish their credit rating—that they were actually going to take it, repurpose it, and sell it for another purpose. The ICO had a number of problems with that: they felt that both what Experian did and the way they did it violated a number of different aspects of privacy law, specifically the GDPR.
LYNN MOLFETTA:
I understand that all three of the credit reference agencies, Equifax, TransUnion, and Experian, were doing the same thing. Why was Experian singled out?
MATTHEW BERNSTEIN:
The ICO conducted an investigation into what we call credit rating agencies, those three. And I believe that they went to all three agencies and said: “Here’s what we don’t like, here are the activities we are critical of.” And I believe Equifax and TransUnion agreed to make changes.
Experian disputed what the commissioner said and did not want to change their business; they felt they had grounds to disagree. The commissioner persisted and said they would not take Experian’s objections on board. The ICO issued the enforcement action and told Experian they would have to make those changes because their defense was insufficient. It didn’t pass muster, in the eyes of the ICO.
What was the impact of the ICO’s order to Experian?
LYNN MOLFETTA:
Many organizations are getting fines for data privacy violations. It seems that, since Experian didn’t get fined, they got off easy.
MATTHEW BERNSTEIN:
The ICO actually said in their order that they chose to take this action because they thought it would be more effective at sending a message than a fine would be. Now, they still reserve the right to impose a fine.
LYNN MOLFETTA:
Right.
MATTHEW BERNSTEIN:
But what I thought was interesting is that they told Experian specifically what to stop doing. Now, one would assume that if you impose a very large fine on someone, they’re going to stop doing that thing because they don’t want to get the fine over and over again. The ICO chose not to do that, perhaps because they thought a very large fine would be disproportionate. The thing that struck me was that they went ahead and said, we’re not going to fine you, but we are ordering you to change the way you do business.
LYNN MOLFETTA:
And what’s interesting is that Experian saw the competing agencies, Equifax and TransUnion, go ahead and say, “yes, we will stop this.” Why would Experian think they’d be able to dispute the ICO and carry on? You would have thought they’d follow suit when they saw their peer companies readily agree to the changes.
MATTHEW BERNSTEIN:
One of the things that the ICO said in their findings about Experian, specifically, was that Experian conducted assessments. So, the GDPR requires that when you launch a new product or undertake certain kinds of processing, you have to conduct an assessment [a data protection impact assessment, or DPIA] as to whether or not that new activity will affect the fundamental privacy rights of consumers.
The ICO noted that none of the assessments Experian conducted came to the conclusion that Experian was violating people’s rights or that they hadn’t given proper notice. In other words, none of those assessments suggested that Experian felt they had to change what they were doing once they examined that new process in the light of privacy law.
It’d be interesting to know whether TransUnion and Equifax have a different process of conducting assessments, or if they were quicker to realize that the activity that the ICO was interested in was going to lead to a bad conclusion.
MATTHEW BERNSTEIN:
So, did TransUnion and Equifax respond more quickly because they just didn’t want to fight with the ICO, or because they thought the ICO had a better argument? Or because they thought, yeah, our assessment process would have, and perhaps should have, come to the same conclusion?
It’s an interesting question and it certainly raises an operational issue. You need to conduct these assessments not as rubber-stamp or check-the-box exercises. You need to actually alert your organization; you need to raise awareness of what is personal data, so that if you’re doing something with it that you haven’t done before, you stop and seek guidance.
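As a concrete illustration of that operational point, here is a minimal sketch in Python of what such a control might look like. The proposal fields, purposes, and messages are hypothetical, not Experian’s process or anything the ICO prescribes; the idea is simply that a repurposed use of personal data, or a missing assessment, stops the work until guidance is sought.

```python
# Minimal sketch (hypothetical names) of gating new uses of personal data
# behind a recorded privacy assessment, so the check is a real control
# rather than a rubber stamp.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class DataUseProposal:
    name: str
    uses_personal_data: bool
    collected_for: List[str]        # purposes disclosed when the data was collected
    proposed_purpose: str           # what the new product or process wants to do with it
    assessment_reference: Optional[str] = None  # link to a completed privacy assessment


def requires_privacy_review(proposal: DataUseProposal) -> bool:
    """Flag any proposal that repurposes personal data or lacks an assessment."""
    if not proposal.uses_personal_data:
        return False
    repurposed = proposal.proposed_purpose not in proposal.collected_for
    return repurposed or proposal.assessment_reference is None


# Example: data collected for credit referencing, now proposed for direct marketing.
proposal = DataUseProposal(
    name="direct-marketing feed",
    uses_personal_data=True,
    collected_for=["credit referencing"],
    proposed_purpose="direct marketing",
)
if requires_privacy_review(proposal):
    print("Stop and seek guidance: a privacy assessment is required before launch.")
```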
How firms can avoid misuse of personal data
LYNN MOLFETTA:
Lots of times, the legal experts have a difficult time interpreting what the law is really requiring companies to do. And as a result, they don’t go so far as actually operationalizing the regulation.
They’ll create policies and they’ll try to identify the PII within their organization. But taking it to the next level—actually doing those assessments and building the processes that support those decisions more effectively—is where they seem to stop, right after the policies. And they think, “unless I get a fine, it’s just too hard to do, so I’m just going to continue operating the way I’ve been operating.”
MATTHEW BERNSTEIN:
Yeah. And I think part of the reason is that data privacy regulations—let’s just use GDPR as an example—are very expansive and seemingly prescriptive, yet open to interpretation. And at this point in time, a lot of the cases have been about consent and less about the use [of personal data].
On the consent side, the organization has to be aware of what activities it’s engaging in, or thinking of engaging in, that will require consent. And some of it’s very obvious. You’re already in a business. You’re collecting lots of consumer data. You know what you’re doing already, and you should be reviewing it and understanding it. So, what a lot of people are doing is the obvious: finding PII, assessing how they collect it, translating that into information [for the consumer] via a privacy policy that’s published, let’s say, on their website, and seeking consent where they need it.
MATTHEW BERNSTEIN:
But the second step is this: an organization is not static. It develops new products, it undertakes new processes, and so on. So, given that the data and the processes are not static, how do you make sure the organization remains vigilant on an ongoing basis?
LYNN MOLFETTA:
What’s the fundamental operational lesson here? How can other organizations avoid the same fate?
MATTHEW BERNSTEIN:
You’ve got to undertake the same kind of awareness training that you do for other compliance obligations that an organization has. Laws, lots of laws, change. If you’re a financial services company, you’re constantly updating your employees about compliance and what they have to do. There’s annual training, there’s attestations, there are new policies.
And that’s not true just in compliance, and not just in financial services, particularly when it comes to things like anti-bribery and corruption or conflicts of interest. Any large company is going to alert its employees to those potential issues. So, you have to introduce privacy into that mix.
MATTHEW BERNSTEIN:
The second thing that I would say is: think about your technology development process.
We now have a well-established—almost ingrained—set of principles: that if you’re introducing new technology or changing technology, there are business continuity and cybersecurity risks that have to be considered before you release something into the organization. People often refer to those as non-functional requirements.
So, we need to bring privacy, and information governance more broadly, into that concept of a non-functional requirement. Because so often, a new way of processing personal data is going to arrive with a new technology, some change in technology. Not always. But when somebody’s building a new piece of technology, it’s a good thing if they ask not just, “What are the security implications of this? What are the business continuity implications of this?” but also, “What are the privacy and information governance implications of this?”
That’s another control, if you will, to make sure that before something is put into practice, the question about consent—is it necessary?—has been raised.
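As an illustration of that kind of control, here is a minimal sketch, again in Python, of a pre-release gate that treats privacy alongside the other non-functional requirements. The check names and messages are assumptions made for the example, not any particular firm’s release process.

```python
# Minimal sketch (illustrative names) of a release gate that treats privacy
# as a non-functional requirement next to continuity and security.
NON_FUNCTIONAL_CHECKS = {
    "business_continuity": "recovery and failover plan reviewed",
    "cybersecurity": "security review completed",
    "privacy": "personal-data impact and consent basis assessed",
}


def release_gate(signed_off: set) -> None:
    """Refuse to release a change until every non-functional check is signed off."""
    missing = [name for name in NON_FUNCTIONAL_CHECKS if name not in signed_off]
    if missing:
        details = "; ".join(NON_FUNCTIONAL_CHECKS[name] for name in missing)
        raise RuntimeError(f"Release blocked, outstanding checks: {details}")


# Example: security and continuity are signed off, but privacy is not,
# so the release is blocked until the consent question has been raised.
try:
    release_gate({"business_continuity", "cybersecurity"})
except RuntimeError as blocked:
    print(blocked)
```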
LYNN MOLFETTA:
Compliance with data privacy regulations is a continuous process, and it goes beyond notice and consent.
MATTHEW BERNSTEIN:
Traditionally, privacy, particularly in the US, has been dealt with through notice and consent, disclosure and consent. We work, as a society, on the basis that people get to choose what they want to do. They’re not told that they can’t engage in what some other society might regard as risky behavior, as long as they understand what that behavior is and consent to it. That’s been the way we’ve worked with privacy: notice and consent.
I think everyone would say that it is impossible, as a consumer, to understand what those privacy notices you read (even the most concise and clearly written ones) mean about what that company is going to do with your data. Even if you do read them, you can get through the notice and it will say, “we use your data for these three things,” and they’re very broad, with a “click here if you want more” that takes you into the privacy policy, which is typically two or three pages long.
MATTHEW BERNSTEIN:
And you’re encountering these things, these terms, constantly. You go to buy something on a new website, you open an account with a new bank, you read a new magazine online; it’s constant. Every day you probably interact with a new provider of services across the internet, and I think everybody recognizes that consumers can no longer be expected to give really informed consent. Yet the GDPR still works on the idea that you have to provide informed consent.
But it breaks down when you get to an Experian-type situation, because the company argues that consumers did give consent and the regulator says they didn’t. Then it just becomes really difficult, I think. And so (this is an opinion) to expect companies to be able to come up with something that is both legible and complete and accurate, so that everybody can operate under the notice and consent model—I just don’t see how it works.
The role of operationalizing information governance
LYNN MOLFETTA:
What we’re talking about here is operationalizing information governance. It’s an operating model that consists of people, processes, technology, and governance.
People are the roles and the responsibilities that organizations have to define.
You talk about the processes – we talked about this earlier – you’re enhancing your processes and informing the businesses that they have to manage their data differently to comply with these regulations. Then you’ve got to have documented processes that you train people on and provide for employees and managers to follow.
You need the technology to know where your data is and to store it accurately. And you need to make sure you’ve got repeatable processes that are not ad hoc; they’re mature, they’re sustainable, and you can speak to them with the regulators to show that you’ve got a consistent way of doing things.
And then, of course, the governance. You’ve got to have the right controls to see if all of this is working. You can push out all of these individual components, but if you’re not checking to see if they’re working, you run the risk of saying, well, I have all this, but no one’s adhering to it. Or the technology got upgraded and we didn’t implement a control to make sure it continued to work well. It’s very much the same as a lot of other risk and business priorities that businesses manage, and this is a big one.
Operationalizing Information Governance is brought to you by MC Bernstein Data, dedicated to helping firms achieve their objectives related to information governance, with services that leverage comprehensive knowledge of regulatory requirements, processes, tools, and best practices to reduce risks and organizational burdens.