
[Podcast] Dr. Ann Cavoukian on Privacy By Design



I recently had the chance to speak with former Ontario Information and Privacy Commissioner Dr. Ann Cavoukian about big data and privacy. Dr. Cavoukian is currently Executive Director of Ryerson University’s Privacy and Big Data Institute and is best known for her leadership in the development of Privacy by Design (PbD).

What’s more, she came up with PbD language that made its way into the GDPR, which will go into effect in 2018. First developed in the 1990s, PbD addresses the growing privacy concerns brought on by big data and IoT devices.

Many worry that PbD will interfere with innovation and business, but that’s not the case.

When working with government agencies and organizations, Dr. Cavoukian’s singular approach is that big data and privacy can operate together seamlessly. At the core, her message is this: you can simultaneously collect data and protect customer privacy.

Transcript

Cindy Ng
With Privacy by Design principles codified in the new General Data Protection Regulation, which will go into effect in 2018, it might help to understand the intent and origins of it. And that’s why I called former Ontario Information and Privacy Commissioner, Dr. Ann Cavoukian. She is currently Executive Director of Ryerson University’s Privacy and Big Data Institute and is best known for her leadership in the development of Privacy by Design. When working with government agencies and organizations, Dr. Cavoukian’s singular approach is that big data and privacy can operate together seamlessly. At the core, her message is this, you can simultaneously collect data and protect customer privacy.

Thank you, Dr. Cavoukian, for joining us today. I was wondering, as Information and Privacy Commissioner of Ontario, what did you see that was effective when convincing organizations and government agencies to treat people’s private data carefully?

Dr. Cavoukian
The approach I took…I always think that the carrot is better than the stick, and I did have order-making power as Commissioner. So I had the authority to order government organizations, for example, who were in breach of the Privacy Act to do something, to change what they were doing and tell them what to do. But the problem…whenever you have to order someone to do something, they will do it because they are required to by law, but they’re not gonna be happy about it, and it is unlikely to change their behavior after that particular change that you’ve ordered. So, I always led with the carrot in terms of meeting with them, trying to explain why it was in both their best interest, in citizens’ best interest, in customers’ best interest, when I’m talking to businesses. Why it’s very, very important to make it…I always talk about positive sum, not zero sum, make it a win-win proposition. It’s gotta be a win for both the organization who’s doing the data collection and the data use and the customers or citizens that they’re serving. It’s gotta be a win for both parties, and when you can present it that way, it gives you a seat at the table every time. And let me explain what I mean by that. Many years ago I was asked to join the board of the European Biometrics Forum, and I was honored, of course, but I was surprised because in Europe they have more privacy commissioners than anywhere else in the world. Hundreds of them, they’re brilliant. They’re wonderful, and I said, “Why are you coming to me as opposed to one of your own?” And they said, “It’s simple.” They said, “You don’t say ‘no’ to biometrics. You say ‘yes’ to biometrics, and ‘Here are the privacy protective measures that I insist you put on them.'” They said, “We may not like how much you want us to do, but we can try to accommodate that. But what we can’t accommodate is if someone says, ‘We don’t like your industry.'” You know, basically to say “no” to the entire industry is untenable. So, when you go in with an “and” instead of a “versus,” it’s not me versus your interests. It’s my interests in privacy and your interests in the business or the government, whatever you’re doing. So, zero sum paradigms are one interest versus another. You can only have security at the expense of privacy, for example. In my world, that doesn’t cut it.
Cindy Ng
Dr. Cavoukian, can you tell us a little bit more about Privacy by Design?
Dr. Cavoukian
I really crystallized Privacy by Design really after 9/11, because at 9/11 it became crystal clear that everybody was talking about the vital need for public safety and security, of course. But it was always construed as at the expense of privacy, so if you have to give up your privacy, so be it. Public safety’s more important. Well, of course public safety is extremely important, and we did a position piece at that point for our national newspaper, “The Globe and Mail,” and the position I took was public safety is paramount with privacy embedded into the process. You have to have both. There’s no point in just having public safety without privacy. Privacy forms the basis of our freedoms. You wanna live in free democratic society, you have to be able to have moments of reserve and reflection and intimacy and solitude. You have to be able to do that.
Cindy Ng
Data minimization is important, but what do you think about companies that do collect everything with hopes that they might use it in the future?
Dr. Cavoukian
See, what they’re asking for, they’re asking for trouble, because I can bet you dollars to doughnuts that’s gonna come back to bite you. Because, especially with data that you’re not clear about what you’re gonna do with, you’ve got data just sitting there. What data in identifiable form does is attract hackers. It attracts rogue employees on the inside who will make inappropriate use of the data, sell the data, do something with the data. You’re asking for trouble, because keeping data in identifiable form, once the uses have been addressed, just begs trouble. I always tell people, if you wanna keep the data, keep the data, but de-identify it. Strip the personal identifiers, make sure you have the data aggregated, de-identified, encrypted, something that protects it from this kind of rogue activity. And you’ve been reading lately all about the hackers who are in, I think they were in the IRS for God’s sakes, and they’re getting in everywhere here in my country. They’re getting into so many databases, and it’s not only appalling in terms of the data loss, it’s embarrassing for the government departments who are supposed to be protecting this data. And it fuels even additional distrust on the part of the public. So I would say to companies, “Do yourself a huge favor. If you don’t need the data, don’t keep it in identifiable form. You can keep it in aggregate form. You can encrypt it. You can do lots of things. Do not keep it in identifiable form where it can be accessed in an unauthorized manner, especially if it’s sensitive data.” Oh my god, health data…rogue employees, we have a rash of it here, and it’s just curiosity, it’s ridiculous. The damage is huge for patients, and I can tell you, I’ve been a patient in hospitals many times. The thought that anyone else is accessing my data…it’s so personal and so sensitive. So when I speak this way to boards of directors and senior executives, they get it. They don’t want the trouble, and I haven’t even talked about costs. Once these data breaches happen these days, it’s not just lawsuits, it’s class action lawsuits that are initiated. It’s huge, and then the damage to your reputation, the damage to your brand, can be irreparable.
Cindy Ng
Right. Yeah, I remember Meg Whitman said something about how it takes years and years to build your brand and reputation, and seconds to ruin it.
Dr. Cavoukian
Yeah, yes. That is so true. There’s a great book called “The Reputation Economy” by Michael Fertik. He’s the CEO of reputation.com. It’s fabulous. You’d love it. It’s all about exactly how long it takes to build your reputation, how dear it is and how you should cherish it and go to great lengths to protect it.
Cindy Ng
Can you speak about data ownership?
Dr. Cavoukian
You may have custody and control over a lot of data, your customer’s data, but you don’t own that data. And with that custody and control comes an enormous duty of care. You gotta protect that data, restrict your use of the data to what you’ve identified to the customer, and then if you wanna use it for additional purposes, then you’ve gotta go back to the customer and get their consent for secondary uses of the data. Now, that rarely happens, I know that. In Privacy by Design, one of the principles talks about privacy as the default setting. The reason you want privacy to be the default setting…what that means is if a company has privacy as the default setting, it means that they can say to their customers, “We can give you privacy assurance from the get-go. We’re collecting your information for this purpose,” so they identify the purpose of the data collection. “We’re only gonna use it for that purpose, and unless you give us specific consent to use it for additional purposes, the default is we won’t be able to use it for anything else.” It’s a model of positive consent, it gives privacy assurance, and it gives enormous, enormous trust and consumer confidence in terms of companies that do this. I would say to companies, “Do this, because it’ll give you a competitive advantage over the other guys.”

As you know, because you sent it to me, the Pew Research Center, their latest study on Americans’ attitudes, you can see how high the numbers are, in the 90 percents. People have had it. They want control. This is not a single study. There have been multiple surveys that have come out in the last few months like this. Ninety percent of the public, they don’t trust the government or businesses or anyone. They feel they don’t have control. They want privacy. They don’t have it, so you have, ever since, actually, Edward Snowden, you have the highest level of distrust on the part of the public and the lowest levels of consumer confidence. So, how do we change that? So, when I talk to businesses, I say, “You change that by telling your customers you are giving them privacy. They don’t even have to ask for it. You are embedding it as the default setting which means it comes part and parcel of the system.” They’re getting it. I do what I call my neighbors test. I explain these terms to my neighbors who are very bright people, but they’re not in the privacy field. So, when I was explaining this to my neighbor across the street, Pat, she said, “You mean, if privacy’s the default, I get privacy for free? I don’t have to figure out how to ask for it?” And I said, “Yes.” She said, “That’s what I want. Sign me up!”

See, people want to be given privacy assurance without having to go to the lengths they have to go to now to find the privacy policy, search through the terms of service, find the checkout box. I mean, it’s so full of legalese. It’s impossible for people to do this. They wanna be given privacy assurance as the default. That’s your biggest bet if you’re a private-sector company. You will gain such a competitive advantage. You will build the trust of your customers, and you will have enormous loyalty, and you will attract new opportunity.

Cindy Ng
What are your Privacy by Design recommendations for wearables and IoT innovators and developers?
Dr. Cavoukian
The internet of things, wearable devices, and new app developers and startups…they are clueless about privacy, and I’m not trying to be disrespectful. They’re working hard, say an app developer, they’re working hard to build their app. They’re focused on the app. That’s all they’re thinking about, how to deliver what the app’s supposed to deliver on. And then you say, “What about privacy?” And they say, “Oh, don’t worry about it. We’ve got it taken care of. You know, the third-party security vendor’s gonna do it. We got that covered.” They don’t have it covered, and what they don’t realize is they don’t know they don’t have it covered. “Give it to the security guys and they’re gonna take care of it,” and that’s the problem. When I speak to app developers…I was at Tim O’Reilly’s Web 2.0 last year or the year before, and there’s 800 people in the room, I was talking about Privacy by Design, and I said, “Look, do yourself a favor. Build in privacy. Right now you’re just starting your app development, build it in right now at the front end, and then you’re gonna be golden. This is the time to do it, and it’s easy if you do it up front.” I had dozens of people come up to me afterwards because they didn’t even know they were supposed to. It had never appeared on their radar. It’s not resistance to it. They hadn’t thought of it. So our biggest job is educating, especially the young people, the app developers, the brilliant minds. My experience is it’s not that they resist the messaging, it’s that they haven’t been exposed to the messaging. Oh, I should just tell you, we started a Privacy by Design certification. We’ve partnered with Deloitte, and I’ll send you the link. Ryerson University, where I am housed, is offering this certification for Privacy by Design, but my assessment arm, my audit arm, my partner, is Deloitte, and we’ve had a real, real, just a deluge of interest.
Cindy Ng
So, do you think that’s also why people are also hiring Chief Privacy Officers?
Dr. Cavoukian
Yes.
Cindy Ng
What are some qualities that are required in a Chief Privacy Officer? Is it just a law background?
Dr. Cavoukian
No, in fact, I’m gonna say the opposite, and this is gonna sound like heresy to most people. I love lawyers. Some of my best friends are lawyers. Don’t just restrict your hiring of Chief Privacy Officers to lawyers. The problem with hiring a lawyer is they’re understandably going to bring a legal regulatory compliance approach to it, which, of course, you want that covered. I’m not saying…You have to be in compliance with whatever legislation is in your jurisdiction. But if that’s all you do, it’s not enough. I want you to go farther. When I ask you to do Privacy by Design, it’s all about raising the bar. Doing technical measures such as embedding privacy into the design that you’re offering into the data architecture, embedding privacy as a default setting. That’s not a legalistic term. It’s a policy term. It’s computer science. It’s a… You need a much broader skill set than law alone. So, for example, I’m not a lawyer, and I managed to be Commissioner for three terms. And I certainly valued my legal department, but I didn’t rely on it exclusively. I always went farther, and if you’re a lawyer, the tendency is just to stick to the law. I want you to do more than that. You have to have an understanding of computer science, technology, encryption, how can you… De-identification protocols are critical, combined with the risk of re-identification framework. When you look at the big data world, the internet of things, they’re going to do amazing things with data. Let’s make sure it’s strongly de-identified and resist re-identification attacks.
Cindy Ng
There have been reports that people can re-identify individuals within de-identified data.
Dr. Cavoukian
That’s right, but if you examine those reports carefully, Cindy, a lot of them are based on studies where the initial de-identification was very weak. They didn’t use strong de-identification protocols. So, like anything, if you start with bad encryption, you’re gonna have easy decryption. So, it’s all about doing it properly at the outset using proper standards. There’s now four standards of de-identification that have all come out that are risk-based, and they’re excellent.
Cindy Ng
Are you a fan of possibly replacing privacy policies with something simpler, like a nutrition label?
Dr. Cavoukian
It’s a very clever idea. They have tried to do that in the past. It’s hard to do, and I think your simplest route to the nutrition kind of label would be if you did embed privacy as the default setting. Because then you could have a nutrition label that said, “Privacy built in.” You know how, I think, Intel had something years ago where you had security built in or something. You could say, “Privacy embedded in the system.”

What We Learned From Talking to Data Security Experts


Since we’ve been working on the blog, Cindy and I have chatted with security professionals across many different areas — pen testers, attorneys, CDOs, privacy advocates, computer scientists, and even a guru. With the state of security looking more unsettled than ever, we decided it was a good time to take stock of the collective wisdom we’ve absorbed from these pros.

The Theory of Everything

A good place to begin our wisdom journey is the Internet of Things (IoT). It’s where the public directly experiences all the issues related to privacy and security.

We had a chance to talk to IoT pen tester Ken Munro earlier this year, and his comments on everything from wireless coffee pots and doorbells to cameras really resonated with us:

“You’re making a big step there, which is assuming that the manufacturer gave any thought to an attack from a hacker at all. I think that’s one of the biggest issues right now is there are a lot of manufacturers here and they’re rushing new product to market …”

IoT consumer devices are not, cough, based on Privacy by Design (PbD) principles.

And over the last few months, consumers learned the hard way that these gadgets were susceptible to simple attacks that exploited backdoors, default passwords, and even non-existent authentication.

Hackers got additional help from public-facing router ports left open during device installation, without any warning to the poor user, and from unsigned firmware that left devices open to complete takeover.

As a result, IoT is where everything wrong with data security seems to show up. However, there are easy-to-implement lessons that we can all put into practice.

 



Password Power!

Is security always about passwords? No, of course not, but poor passwords or password defaults that were never reset seem to show up as a root cause in many breaches.

The security experts we’ve spoken to often bring up, without any prompting from us, the sorry state of passwords. One of them, Per Thorsheim, who is in fact a password expert himself, reminded us that one answer to our bad password habits is two-factor authentication (TFA):

“From a security perspective, adding this two-factor authentication is everything. It increases security in such a way that in some cases even if I told you my password for my Facebook account, as an example, well, because I have two-factor authentication, you won’t be able to log in. As soon as you type in my user name and password, I will be receiving a code by SMS from Facebook on my phone, which you don’t have access to. This is really good.”

We agree with Thorsheim that humans are generally not good at this password thing, and so TFA and biometric authentication will certainly be a part of our password future.
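
To make that concrete, here is a minimal sketch of a code-based second factor using the open-source pyotp library (TOTP, the authenticator-app flavor, rather than the SMS flow Thorsheim describes). The account name, issuer, and login handling are illustrative assumptions, not anything from the interview:

```python
# Minimal TOTP two-factor sketch using the pyotp library.
# Assumptions: pyotp is installed (pip install pyotp); the account
# name and issuer below are placeholders, not a real service.
import pyotp

# At enrollment: generate and store a per-user base32 secret.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

# The user loads this URI into an authenticator app (e.g. via QR code).
print(totp.provisioning_uri(name="alice@example.com", issuer_name="ExampleCorp"))

# At login, after the password check, verify the six-digit code.
code = input("Enter the code from your authenticator app: ")
print("Second factor accepted." if totp.verify(code) else "Invalid code.")
```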

In the meantime, for those of us who still cling to just plain-old passwords, Professor Justin Cappos told us a while back that there’s a simple approach to better password generation:

“If you’re trying to generate passwords as a human, there are tricks you can do where you pick four dictionary words at random and then create a story where the words interrelate. It’s called the “correct horse battery staple” method! “

Correct-horse-battery-staple is just a way of using a story as a memory trick or mnemonic. It’s an old technique, but one that helps create hard-to-crack passwords.
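
Here is a minimal sketch of that method in Python, using the standard-library secrets module so the word choices are cryptographically random. The tiny wordlist is a stand-in; a real generator would draw from a large curated list (for example, a 7,776-word diceware list, where four random words give roughly 51 bits of entropy):

```python
# "Correct horse battery staple" passphrase sketch: pick four
# dictionary words at random and join them into one memorable string.
import secrets

def passphrase(wordlist, n_words=4, sep="-"):
    # secrets.choice draws each word with cryptographic randomness.
    return sep.join(secrets.choice(wordlist) for _ in range(n_words))

# Placeholder wordlist for illustration; use thousands of words in practice.
words = ["correct", "horse", "battery", "staple", "orbit", "velvet",
         "maple", "granite", "lantern", "puzzle", "harbor", "falcon"]

print(passphrase(words))  # e.g. "granite-orbit-falcon-maple"
```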

One takeaway from these experts: change your home router admin passwords now (and use horse-battery-staple). Corporate IT admins should also take a good, hard look at their own passwords and avoid aiding and abetting hackers.

 



Cultivate (Privacy and Security) Awareness

Enabling TFA on your online accounts and generating better passwords goes a very long way to improving your security profile.

But we also learned that you need to step back and cultivate a certain level of privacy awareness in your online transactions.

We learned from attorney and privacy guru Alexandra Ross about the benefits of data minimization, both for the companies that collect data and the consumers who reveal it:

“One key thing is to stop, take a moment, and be mindful of what’s going on. What data am I being asked to submit when I sign up for a social media service?  And question why it’s being asked.

It’s worth the effort to try to read the privacy policies, or read consumer reviews of the app or online service.”

And

“If you’re speaking to the marketing team at a technology company—yeah, the default often is let’s collect everything. In other words, let’s have this very expansive user profile so that every base is covered and we have all these great data points.

But if you explain, or ask questions … then you can drill down to learn what’s really necessary for the data collection.”

In a similar vein, data scientist Kaiser Fung pointed out that often there isn’t much of a reason behind some of the data collection in the first place:

“It’s not just the volume of data, but the fact that data today is often collected without any design or plan in mind. Often times, people collecting the data are really divorced from any kind of business problem.”

Listen up, IT and marketing people: think about what you’re doing before you put up your next contact form!

Ross and other PbD advocates preach the doctrine of data minimization: the less data you have, the lower your security risk is when there’s an attack.

As our privacy guru, Ross reminded us that there’s still a lot of data about us spread out in corporate data systems. Scott “Hacked Again” Schober, another security pro we chatted with, makes the same point based on his personal experiences:

“I was at an event speaking … and was asked if I’d be willing to see how easy it is to perform identity theft and compromise information on myself. I was a little reluctant but I said ok, everything else is out there already, and I know how easy it is to get somebody’s information. So I was the guinea pig. It was Kevin Mitnick, the world’s most famous hacker, who performed the theft. Within 30 seconds and at the cost of $1, he pulled up my social security number.”

There’s nothing inherently wrong with companies storing personal information about us. The larger point is to be savvy about what you’re being asked to provide and take into account that corporate data breaches are a fact of life.

Credit cards can be replaced and passwords changed, but details about our personal preferences (food, movies, reading habits) and our social security numbers are forever, and a great source of raw material for hackers to use in social engineering attacks.

 



Data is Valuable

We’ve talked to attorneys and data scientists, but we had the chance to talk to both in the form of Bennett Borden. His bio is quite interesting: in addition to being a litigator at Drinker Biddle, he’s also a data scientist. Borden has written law journal articles about the application of machine learning and document analysis to e-discovery and other legal transactions.

Borden explained how as employees we all leave a digital trail in the form of emails and documents, which can be quite revealing. He pointed out that this information can be useful when lawyers are trying to work out a fair value for a company that’s being purchased.

He was called in to do a data analysis for a client and was able to show that internal discussions indicated the asking price for the company was too high:

“We got millions of dollars back on that purchase price, and we’ve been able to do that over and over again now because we are able to get at these answers much more quickly in electronic data.”

So information is valuable in a strictly business sense. At Varonis, this is not news to us, but it’s still powerful to hear it from someone who is immersed in corporate content as part of his job.

To summarize: as consumers and as corporate citizens, we should all be more careful with this valuable asset. Don’t give it away easily, and protect it when it’s in your possession.

 



More Than Just a Good Idea

Privacy by Design came up in a few of our discussions with experts, and one of its principles, privacy as a default setting, is a hard one for companies to accept, even though PbD says that privacy is not a zero-sum game: you can have tough privacy controls and profits.

In any case, for companies that do business in the EU, PbD is not just a good idea: in 2018 it becomes the law. The concept is explicitly spelled out in the General Data Protection Regulation’s (GDPR) article 25, “Data protection by design and by default”.

We’ve been writing about the GDPR and its many implications for the last two years. But one somewhat overlooked consequence is that the GDPR will apply to companies outside of the EU.

We spoke with data security compliance expert Sheila FitzPatrick, who really emphasized this point:

“The other part of GDPR that is quite different–and it’s one of the first times that this idea will be put in place– is that it doesn’t just apply to companies that have operations within the EU. Any company regardless of where they are located and regardless of whether or not they have a presence in the EU, if they have access to the personal data of any EU citizen, they will have to comply with the regulations under the GDPR. That’s a significant change.”

This legal idea is sometimes referred to as extraterritoriality. US e-commerce and web service companies in particular will find themselves under the GDPR when EU citizens interact with them. The IT best practices that experts like to talk about as things you should do are becoming legal requirements. It’s not just a good idea!

 



EU GDPR Spotlight: Protection by Design and Default


Privacy by Design (PbD) is a well-intentioned set of principles – see our cheat sheet – to get the C-suite to take consumer data privacy and security more seriously. Overall, PbD is a good idea and you should try to abide by it. But with the General Data Protection Regulation (GDPR), it’s more than that: it’s the law if you do business in the EU zone!

PbD has sensible guidelines and practices concerning consumers’ access to their own data and making privacy policies open and transparent. These are not controversial ideas, except if you are, ahem, a large Internet company that collects lots of consumer data.

And PbD also dispenses good general advice on data security that can be summarized in one word: minimize.

Minimize collection of consumer data, minimize who you share the data with, and minimize how long you keep it. Less is more: less data for the hacker to take means a more secure environment.

By Design and By Default

While you’re keeping consumer data, according to PbD, you also should have “end-to-end” security in place. Privacy is supposed to be baked into every system that handles the data.

These all seem like reasonable things to do. Various security best practices and standards — for example, PCI DSS and the CIS Critical Security Controls — have been offering similar PbD-like security recommendations.

However, the EU has been way ahead of the US in making PbD principles part of its data regulations. In fact, the existing Data Protection Directive, the current law, contains PbD principles in various places, particularly data minimization and giving consumers the right to access and correct their data.

The new GDPR, which will go into effect in 2018, retains the existing rules on data and then goes a step further. PbD is explicitly spelled out in article 25, “Data protection by design and by default”.  Here are two relevant passages:

… implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing…

The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage …

Got that? Limiting and minimizing data are now the law of the land. (I’ll talk about pseudonymization in another post. It’s a cool idea that lets you protect data and consumer privacy without having to resort to encryption.)
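
To make the idea concrete, here is a minimal sketch of one common pseudonymization technique: replacing a direct identifier with a keyed token so records can still be linked without exposing the raw value. The key, field names, and record layout are our own illustrative assumptions, not anything the regulation prescribes:

```python
# Pseudonymization sketch: derive a stable, hard-to-reverse token from
# an identifier using a keyed hash (HMAC). Keep the key stored apart
# from the data; whoever holds both can re-link the records.
import hashlib
import hmac

PSEUDONYM_KEY = b"keep-this-secret-and-stored-separately"  # placeholder

def pseudonymize(identifier: str) -> str:
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"email": "jane@example.com", "purchase": "widget"}
record["email"] = pseudonymize(record["email"])
print(record)  # the same input always yields the same token
```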

Impact on Your Marketing Campaign

The new GDPR has direct, practical implications. Just as an example, consider the impact it will have on web-based marketing.

Businesses are always trying to get information about their customers and looking to bring in new leads using the full digital arsenal — web, email, mobile. And when given half a chance, marketers always want more data — age, income, zip code, last book read, favorite ice cream, favorite food, etc. — even for the simplest consumer interaction.

What the EU GDPR says is that marketers should limit data to the purpose for which it is being collected — do I really need zip codes or favorite books? — and not retain the data once it’s no longer relevant.

So what about the data points you collected from that web campaign over five years ago — maybe 5,000 email addresses along with favorite pet names — that now live in a spreadsheet no one ever looks at? You should find them and delete them.

If a hacker gets hold of that file and uses it for phishing purposes, you’ve created a security risk for your customers.

Plus, if the local EU authority can trace the breach back to your company, you can face heavy fines.
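
As a rough illustration of hunting down stale exports like that spreadsheet, here is a minimal retention-sweep sketch. The directory name and the two-year window are assumptions, and it only flags files; a real process would add logging, review, and sign-off before anything is deleted:

```python
# Retention sweep sketch: flag exported marketing files that are older
# than the retention window. Deletion is left commented out on purpose.
import time
from pathlib import Path

RETENTION_DAYS = 365 * 2                 # hypothetical two-year policy
EXPORT_DIR = Path("marketing_exports")   # hypothetical export location
cutoff = time.time() - RETENTION_DAYS * 86400

for f in EXPORT_DIR.glob("*.csv"):
    if f.stat().st_mtime < cutoff:
        print(f"Past retention, review for deletion: {f}")
        # f.unlink()  # uncomment only after review and sign-off
```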

Need more EU General Data Protection Regulation knowledge? Our white paper gives you a complete rundown!


The EU General Data Protection Regulation Is Now Law. Here’s What You Need to Know.

Updated: 6/2016

You are back in the office after the long holiday break and busy catching up. Did you miss the story about the EU’s General Data Protection Regulation (GDPR) receiving final approval?  Some are calling it a “milestone of the digital age”.

We’ve been following the GDPR on the blog over the last two years. If you want to catch up very quickly, read our omnibus post that’s a tasty distillation of our wisdom on this subject.

Or if you have some more time, check out our comprehensive GDPR white paper.

With the final draft, a few ambiguities and loose ends were ironed out from the different versions provided by the EU Parliament and the Council.

I’ve put together a few key points that should resonate with Inside Out readers. Keep in mind the GDPR will take effect in May 2018.

Fines

We have closure on the question of fines: the GDPR has a tiered fine structure.

For example, a company can be fined up to 2% of global revenue for not having their records in order (article 30), not notifying the supervising authority and data subject about a breach (articles 33, 34), or not conducting impact assessments (article 35).

More serious infringements merit up to a 4% fine. This includes violations of basic principles related to data security (article 5) and conditions for consumer consent (article 7) — these are essentially violations of the core Privacy by Design concepts of the law.

The EU GDPR rules apply to both data controllers and processors (that is, “the cloud”). (Refer to our white paper to learn more about this law’s data security terminology.) Therefore, huge cloud providers are not off the hook when it comes to GDPR enforcement.

Data Protection Officer

It’s official: you’ll likely need a Data Protection Officer or DPO. You can read the fine print in article 37.

If the core activities of your company involve “systematic monitoring of data subjects on a large scale”, or large-scale processing of “special categories” of data — racial or ethnic origin, political opinions, religious or philosophical beliefs, biometric data, health or sex life, or sexual orientation — then you’re required to have a DPO.

In the US, the closest job title to this is a Chief Privacy Officer.

In any case, the job function of the DPO includes advising on and monitoring GDPR compliance, as well as representing the company when contacting the supervising authority or DPA.

Data Breach Notification

24 or 72 hours?

And the winner is … 72.

Article 33 tells us that controllers are required to notify the appropriate supervisory authority of a personal data breach within 72 hours (at the latest) of learning about the exposure, if it results in a risk to consumers. But even if the exposure is not serious, the company still has to record it internally.

According to the GDPR, accidental or unlawful destruction, loss, alteration, unauthorized disclosure of, or access to personal data — the EU’s rough equivalent of PII — is considered a breach.

Note my emphasis on unauthorized.

Based on my understanding of the GDPR, this means that if an employee sees data that’s not part of their job description, it could be considered a breach.

Of course, this is not a problem for your company, because your IT department has done a thorough job reviewing file access lists and has implemented role-based access controls.
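
For illustration, a role-based access check can be as simple as the sketch below. The roles and permission names are hypothetical; the point is that an employee’s role, not their curiosity, determines what they can read:

```python
# Role-based access control sketch: each role maps to an explicit set
# of permissions, and anything not granted is denied by default.
ROLE_PERMISSIONS = {
    "hr":      {"read:employee_records"},
    "support": {"read:tickets"},
    "admin":   {"read:employee_records", "read:tickets", "write:config"},
}

def can(role: str, permission: str) -> bool:
    return permission in ROLE_PERMISSIONS.get(role, set())

print(can("support", "read:employee_records"))  # False: outside the job role
print(can("hr", "read:employee_records"))       # True
```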

You can read more about what you have to provide to the data authority in our “What is the EU GDPR” post.

Bottom line: The GDPR notification is more than just saying you’ve had an incident. You’ll have to include the categories of data, the records touched, and the approximate number of data subjects affected. And this means you’ll need some detailed intelligence on what the hackers and insiders were doing.
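
As a sketch of the kind of record that intelligence feeds into, here is an illustrative structure covering the details just listed. The field names are our own invention, not an official Article 33 schema:

```python
# Breach-notification record sketch: the categories, counts, and
# remediation details a GDPR Article 33 report asks for.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class BreachNotification:
    detected_at: datetime
    data_categories: list       # e.g. ["email", "purchase history"]
    approx_records: int
    approx_data_subjects: int
    likely_consequences: str
    measures_taken: str

report = BreachNotification(
    detected_at=datetime(2016, 6, 1, 9, 30),
    data_categories=["email", "purchase history"],
    approx_records=12000,
    approx_data_subjects=9500,
    likely_consequences="Targeted phishing against affected customers",
    measures_taken="Credentials rotated; affected host isolated",
)
print(report)
```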

Data processors have a little more wiggle room: they’re supposed to notify the company they’re doing the work for — the controller — “without undue delay”.

Under what conditions does a company have to tell the subject about the breach?

You can read the details in article 34, but if a company has encrypted the data or taken some other security measures that render the data unreadable, then they won’t have to inform the subject.

For Countries Outside the EU

We’ve been raising the alarms on extra-territoriality for several months now.

With the GDPR finalized, we can say with certainty the law applies to your company even if it merely markets goods or services in the EU zone.

In other words, if you don’t have a formal presence in the EU zone but collect and store the personal data of someone in the EU (not just citizens!), the long arm of the GDPR can reach out to you.

As many have pointed out, the extra-territoriality requirement (article 3) is especially relevant to ecommerce companies.

Social media forums, online apartment sharing services, artisanal craft sites, or beers of the world clubs: you’ve been warned!

Other Resources

All the permutations of the GDPR and how it applies are just too complex to cover in a few blog posts. Of course, your Data Protection Officer is the go-to person for advice, along with outside legal experts.

Speaking of law firms and attorneys, they are understandably all over this.

Thankfully, they do offer some very practical and free information on their public-facing sites.

Varonis has data governance and data protection solutions that will help keep you in GDPR compliance. Learn more today!

 


Privacy by Design Cheat Sheet


Privacy by Design (PbD) has been coming up more and more in data security discussions. Alexandra Ross, the Privacy Guru, often brings it up in her consultations with her high-tech clients. Its core principles have been adopted by U.S. government agencies and others as de facto best-practice policies.

PbD is about 20 years old and is the brainchild of Ann Cavoukian, formerly the Information & Privacy Commissioner of Ontario, Canada. Why haven’t we all heard more about it? PbD has been accused of being vague, too consumer-oriented, and not technical. Sure, it’s not a formal technical standard like ISO 27001 or PCI DSS.

Think of PbD as good solid advice to help guide your data security decisions. The security standards, as complex as some of them are, can’t cover every possible security scenario, and that’s where PbD can step in: it’s like having a data-security-savvy friend you go to when you’re stuck on a problem.

The Seven Principles

Here are the PbD principles with some brief words on what they really mean:

1. Proactive not Reactive; Preventative not Remedial

The key idea behind this first principle is that you should think about data privacy at the beginning of the data security planning process —not after a data breach. Consider this principle as a kind of a mood setter for the rest of PbD.  Always be thinking privacy (ABTP)!

2. Privacy as the Default Setting

This is the hardest one for companies, especially in the high-tech world, to get their heads around. You’re supposed to give consumers the maximum privacy protection as a baseline: for example, explicit opt-in, safeguards to protect consumer data, restricted sharing, minimized data collection, and retention policies in place. Privacy by Default therefore directly lowers the data security risk profile: the less data you have, the less damaging a breach will be.
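
As a rough illustration of what this looks like in code, here is a minimal sketch in which every sharing option starts off and only explicit consent turns anything on. All of the setting names are hypothetical:

```python
# Privacy-as-the-default sketch: a new account begins at maximum
# privacy; nothing is shared until the user explicitly opts in.
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    share_with_partners: bool = False   # off unless the user opts in
    public_profile: bool = False        # profiles start private
    analytics_tracking: bool = False    # no tracking by default
    retention_days: int = 30            # shortest sensible default

settings = PrivacySettings()            # a brand-new user: full privacy
settings.analytics_tracking = True      # flipped only on explicit consent
```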

3. Privacy Embedded into Design

This is another tough one, especially for rapidly growing high-tech startups. Privacy is supposed to be embedded into the design of IT systems and business practices. Talk to a typical software developer, and he’s most worried about completing core functionality for the product. Data security techniques such as encryption and authentication are usually put on the backburner in the rush to get features online. And testing for the most common hackable vulnerabilities in software—typically injection attacks—is also often neglected. This principle tells designers that they should think about privacy as a core feature of the product.
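
As one concrete example of building security in rather than bolting it on, here is a minimal sketch of the standard defense against injection attacks: parameterized queries instead of string concatenation. The table and data are illustrative:

```python
# SQL injection defense sketch: let the driver bind user input as a
# value so it can never rewrite the query itself.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, email TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'jane@example.com')")

user_input = "jane@example.com' OR '1'='1"  # a classic injection attempt

# Unsafe pattern (never do this): building SQL by concatenation.
#   conn.execute("SELECT * FROM users WHERE email = '" + user_input + "'")

# Safe pattern: the ? placeholder binds user_input as a plain value.
rows = conn.execute("SELECT * FROM users WHERE email = ?", (user_input,)).fetchall()
print(rows)  # [] -- the injection attempt matches nothing
```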

4. Full Functionality – Positive-Sum, Not Zero-Sum

The idea here is that PbD will not compromise business goals. Basically, you can have privacy, revenue, and growth. You’re not sacrificing one for the other. Think of this one as helping to establish a PbD culture in your organization.

5. End-to-End Security – Full Lifecycle Protection

Privacy protections follow the data, wherever it goes. The same PbD principles apply when the data is first created, shared with others, and finally archived. Appropriate encryption and authentication should protect the data until the very end, when it is finally deleted.

6. Visibility and Transparency – Keep it Open

This is the principle that helps build trust with consumers. Information about your privacy practices should be out in the open and written in non-legalese. There should be a clear redress mechanism for consumers, and lines of responsibility in the organization need to be established.

7. Respect for User Privacy – Keep it User-Centric

This final principle just makes it very clear that consumers own the data. The data held by the organization must be accurate, and the consumer must be given the power to make corrections. The consumer is also the only one who can grant and revoke consent on the use of the data.