Tag Archives: gdpr

How Varonis Helps with the California Consumer Privacy Act (CCPA)

The California Consumer Privacy Act (CCPA) is set to go into effect on January 1, 2020. It not only gives ownership and control of personal data back to the consumer but holds companies accountable for protecting that data.

What is the California Consumer Privacy Act?

The CCPA gives California residents four basic rights in relation to how companies collect and store their personal information:

  • Transparency: the right to know what personal information a company is collecting about them, where that data came from (including third parties), how it’s used, whether or not it’s being sold, and with whom that data is being shared. This will likely be disclosed via privacy policies (updated at minimum once a year) and on demand via consumer request.
  • Opt-out: the right to refuse a company the ability to sell their personal data to third parties.
  • Right to be forgotten: the right to have a company delete their personal information.
  • No penalties for privacy: the right to receive equal service and pricing from a company, regardless of whether or not they exercise their privacy rights.

The CCPA requires that companies be able to identify what personal data they’re collecting from individuals, define why they’re collecting the data, and disclose how that data is used.

They’ll need to be able to delete or quarantine that information – and in a relatively short amount of time: companies will need to disclose any requested information within 45 days of the original request.

The CCPA underscores that security of consumer data is a priority, requiring companies to “safeguard California consumers’ personal information and holding them accountable if such information is compromised as a result of a security breach arising from the business’s failure to take reasonable steps to protect the security of consumers’ sensitive information.”1

How does the California Consumer Privacy Act define personal information?

The CCPA defines personal information more broadly than many regulations, including the GDPR, which will likely have significant effects on business models from targeted advertising to data brokerage.

Broadly, it’s defined as information that can be used to identify a specific individual.

That includes not only personal identifiers like name, email address, postal address, IP address, license number, etc., but extends to biometric data, browsing history, geolocation, and more. The CCPA even includes any inferences drawn from any of the aforementioned data in the definition of personal information.

Who will be held accountable?

The CCPA applies to for-profit companies that collect California residents’ personal information, do business in the State of California, and meet at least one of the following thresholds:

  • have annual gross revenues in excess of $25 million;
  • receive or disclose the personal information of 50,000 or more California residents, households, or devices on an annual basis;
  • derive 50 percent or more of their annual revenues from selling California residents’ personal information.

What are the penalties?

Companies that don’t comply may be liable for penalties enforced by the California attorney general: up to $2,500 per violation that isn’t addressed within a 30-day window, and/or up to $7,500 per intentional violation.

Additionally, consumers have a right of action (private claim or class action) if their personal information is compromised in a data breach, no proof of harm necessary.

How does Varonis help with the CCPA?

In order to comply with the CCPA, companies need to be able to identify and discover personal information, fulfill data subject access requests, and protect consumer data:

  • Automatically discover and classify CCPA affected data
    Varonis can automatically discover, identify, and classify CCPA eligible data on-premises and in the cloud, and gives context around that data – so that you can more easily locate personal information, create reports with advanced classification criteria, and remediate security vulnerabilities.
  • Fulfill data subject access requests
    Search for data related to a data subject to fulfill public access requests: Varonis helps you locate relevant files, pinpoint exactly who has access, and enforce policies to move, quarantine, or delete personal information.
  • Protect consumer data
    Varonis protects data first, not last: combining data classification and access governance with UEBA and security analytics. With Varonis, companies can not only identify and monitor consumer data, but track who’s accessing it, spot unusual activity, and report on suspicious behavior on regulated and sensitive data.
  • Build a CCPA security policy to meet compliance
    Varonis helps companies build and enforce a data-centric security policy to help meet compliance, protect sensitive data, and prepare for the CCPA.
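The discovery-and-classification step described above is, at its core, pattern matching over content. Purely as a rough illustration (this is not Varonis’s actual engine, and the patterns below are toy examples that a real classifier would far exceed), a naive personal-information scanner might look like:

```python
import re

# Toy patterns for a few personal-data categories. A production
# classifier would use validated rules, context, and proximity checks.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def classify(text):
    """Return the set of personal-data categories detected in `text`."""
    return {label for label, pattern in PII_PATTERNS.items()
            if pattern.search(text)}

# Detects the "email" and "ssn" categories in this sample string:
hits = classify("Contact: jane@example.com, SSN 123-45-6789")
```

A real discovery pipeline would run rules like these across file shares and cloud stores, then combine the matches with access and activity context before flagging a file as regulated.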

Varonis helps companies meet CCPA compliance requirements and build a unified data security strategy to protect consumer data.

Are you ready for the CCPA? Get a 1:1 demo and see how Varonis can help you discover, manage, and protect your CCPA data.

1https://www.caprivacy.org/

[Transcript] Attorney Sara Jodka on the GDPR and HR Data

In reviewing the transcript of my interview with Sara Jodka, I realize again how much great information she freely dispensed. Thanks Sara! The employee-employer relationship under the GDPR is a confusing area. It might be helpful to clarify a few points Sara made in our conversation about the legitimate interest exception to consent, and the threshold for Data Protection Impact Assessments (DPIAs).

The core problem is that to process personal data under the GDPR you need to have freely given consent. If you can’t get that, you have a few other options, which are covered in the GDPR’s Article 6. For employees, consent cannot be given freely, and so employers will most likely need to rely on the “legitimate interest” exception referred to in that article.

There’s a little bit of paperwork required to prove that the employer’s interest overrides the employee’s rights. In addition, employers will have to notify employees about what data is being processed. Sara refers to the ICO, the UK’s data protection authority, which has published informal guidance on the legitimate interest process that is worth reading.

Since the data collected by the employer is also from a vulnerable subject (the employee) and contains a special class of sensitive personal data (health, payroll, union membership, etc.), it meets the threshold set by GDPR regulators — see this guidance — for performing a DPIA. As we know, DPIAs require companies to conduct a formal risk analysis of their data and document it.

Sara reminds us that some US companies, particularly service-oriented firms, may be surprised to learn about the additional work they’ll need to undertake in order to comply with the GDPR. In short: employees, like consumers, are covered by the new EU law.

 

Inside Out Security: Sara Jodka is an attorney with Dickinson Wright in Columbus, Ohio. Her practice covers data privacy and cybersecurity issues. Sara has guided businesses through compliance matters involving HIPAA, Gramm-Leach-Bliley, FERPA, and COPPA, and most importantly for this podcast, certification under the US-EU Privacy Shield, which, of course, falls under the General Data Protection Regulation or GDPR.

A lot of abbreviations there! Welcome, Sara.

Sara Jodka: Thank you for having me.

IOS: I wanted to get into an article that you had posted on your law firm’s blog. It points out an interesting subcategory of GDPR personal data which doesn’t get a lot of attention, and that is employee HR records. You know, of course it’s going to include ethnic, payroll, 401(k), and other information.

So can you tell us, at a high level, how the GDPR treats employee data held by companies?

Employee Data Covered By the GDPR

SJ: Whenever we look at GDPR, there are 99 articles, and they’re very broad. There’s not a lot of detail on the GDPR regulations themselves. In fact, we only have one that actually carves employment data out, and that’s Article 88  — there’s one in and of itself.

Whenever we’re looking at it, none of the articles say that all of these people have these rights. All these individuals have rights! None of them say, “Well, these don’t apply in an employment situation.” So we don’t have any exclusions!

We’re led to “Yes, they do apply.” And so we’ve been waiting on, and we have been working with, guidance that we’re receiving, you know, from the ICO, with respect to the consent obligation, notice obligation, and portability requirements in the employee context. Because it is going to be a different type of relationship than the consumer relationship!

IOS: It’s kind of interesting that people, I think, or businesses, probably are not aware of this … except those who are in the HR business.

So I think there’s an interesting group of US companies that would find themselves under these GDPR rules that probably would not have initially thought they were in this category because they don’t collect consumer data. I’m thinking of law firms, investment banking, engineering, professional companies.

US Professional Service Companies Beware!

SJ: I think that’s a very good point! In fact, that’s where a lot of my work is actually coming from. A lot of the GDPR compliance is coming from EU firms that specialize with EU privacy. But a lot of U.S. companies didn’t realize that this is going to cover their employment aspects that they had with EU employees that are in the EU!

They thought, “Well, because we don’t actually have a physical location in the EU, it doesn’t actually cover us.” That’s not actually at all true.

The GDPR covers people that are working in the EU, people who reside in the EU, so to the extent that a U.S. company has employees that are working in the EU, it is going to cover that type of employee data. And there’s no exception in the GDPR around it. So it’s going to include those employees.

IOS: So I hadn’t even thought about that. So their records would be covered under the GDPR?

SJ: Yeah, the one thing about the definition of a data subject under the GDPR is it doesn’t identify that it has to be an EU resident or it has to be an EU citizen. It’s just someone in the EU.

When you’re there, you have these certain rights that are guaranteed. And that will cover employees that are working for U.S. companies but they’re working in the EU.

IOS: Right. And I’m thinking perhaps of U.S. citizens who come there for some assignment and are working out of the office; they would be covered under these rules.

SJ: And that’s definitely a possibility, and that’s one thing that we’ve been looking for. We’ve been looking for guidance from the ICO to determine … the scope of what this is going to look like, not only in an employment situation, but where we’re dealing with an immigration situation, somebody on a work visa, and also in the context of schools, as we have, you know, different students coming over to the United States or going abroad. And what protection the GDPR applies to those kinds of in-transition relationships, those employees or students.

With a lot of my clients, we are trying to err on the side of caution and so do things ahead of time, rather than beg forgiveness if the authorities come knocking at our door.

GDPR’s Legitimate Interest Exception is Tricky

IOS: I agree that’s probably a better policy, and that’s something we recommend in dealing with any of these compliance standards.

In that article, you mentioned that the processing of HR records has additional protections under the GDPR … An employee has to give explicit consent freely and not as part of an employer-employee contract.

GDPR’s Article 6 says there are only six lawful ways to process data. If you don’t obtain freely given consent, then it gets tricky.

Can you explain this? And then, what does an employer have to do to process employee data  especially HR data?

SJ: Well, when we’re looking at the reasons that we’re allowed to process data, we can do it by consent, and we can also do it if we have a lawful basis.

A number of the lawful bases are going to apply in the employer context. One of those is if there is going to be an agreement. You know, in order to comply with the terms of a contract, like a collective bargaining agreement or like an employment agreement. So hire/fire payroll data would be covered under that, also if there is … a vital interest of an employee.

There’s speculation that that exception might actually be, or that legitimate basis might be used to obtain vital information regarding, like, emergency contact information of employees.

And there’s also one of the other lawful bases: if the employer has a greater, you know, interest in the data that isn’t outweighed by the rights of the data subject, the employee.

The issue, though, is most of what we talk about is consumer data, and we’re looking a lot at consent and what consent actually looks like in terms of express consent, you know, having them, you know, check the box or whatever.

In an employee situation, the [UK’s] ICO has come out with guidance with respect to this. And they have expressly said in an employee-employer relationship, there is an inherent imbalance of bargaining power, meaning an employee can never really consent to giving up their information because they have no bargaining power. They either turn it over, or they’re not employed. The employer is left to rely only on the other lawful bases to process data, excluding consent, so the contract allowance and some of the others.

But the issue I have with that is, I don’t think that that’s going to cover all the data that we actually collect on an employee, especially employees who are operating outside the scope of a collective bargaining agreement.

In a context of, say, an at-will employee where there is that … where that contract exception doesn’t actually apply. I think there will be a lot of collection of data that doesn’t actually fall under that. It may fall into the legitimate interest, if the employer has the forethought to actually do what’s required, which is to actually document the process of weighing the employer’s interest against the interest of the employee, and making sure that that is a documented process. [ Read the UK’s ICO guidelines on the process of working out legitimate interest.]

When employers claim a legitimate interest exception to getting employee consent, they have more work to do. [Source: UK ICO]

But also what comes with that is the notice requirement, and the notice requirement is not something that can be waived. So employers, if they are doing that, are going to have to  — and this is basically going to cover every single employer — they’re going to have to give their employees notice of the data that they are collecting on them, at a minimum.

IOS: At a minimum. I think to summarize what you’re saying is it’s just so tricky or difficult to get what they call freely given consent, that most employers will rely on legitimate interest.

Triggers for Data Protection Impact Assessments (DPIAs)

IOS: In the second part of this interview, we joined Sara Jodka as she explains what triggers a data protection impact assessment, or DPIA when processing employee data.

SJ: I think that’s required when we’re doing requirements for sensitive data, and we’re talking about sensitive HR data. A DPIA has to be performed when two of the following exist, and there’s like nine things that can be there in order for a DPIA to have to be done. But you bring up a great point because the information that an employer is going to have is going to necessarily trigger the DPIA. [See these Working Party 29 guidelines for the nine criteria that Sara refers to.]

The DPIA isn’t triggered by us doing the legitimate basis analysis and having to document that process. It’s actually triggered because we process sensitive data. You know, their trade union affiliation, their religious data, their ethnicity. We have sensitive information, which is one of the nine things that can trigger it, and all you need is two to require a DPIA.

Another one that employers always get is they process data of a vulnerable data subject. A vulnerable data subject includes employees.

IOS: Okay. Right.

SJ:  I can’t imagine a situation where an employer wouldn’t have to do a DPIA. The DPIA is different than the legitimate interest outweighing [employee rights] documentation that has to be done. They’re two different things.

 

IOS: So, they will have to do the DPIAs? And what would that involve?

SJ: Well, it’s one thing that’s required for high-risk data processing and that, as we just discussed, includes the data that an employer has.

Essentially, what a DPIA is: it’s a process designed to describe what processing the employer is doing, assess its necessity and proportionality, and help manage the risks to the rights and freedoms of natural persons resulting from the processing of personal data, by assessing and determining the measures to address the data and the protections around it.

It’s a living document, so one thing to keep in mind about DPIAs is that they’re never done. They are going to be your corporation’s living document of the high-risk data you have and what’s happening with it, to help you create tools for accountability and to comply with the GDPR requirements including, you know, notice to data subjects, their rights, and then enforcing those rights.

It’s basically a tracking document … of the data, where the data’s going, where the data lives, and what happens with the data and then what happens when somebody asks for their data, wants to erase their data, etc.
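Sara describes the DPIA as a living tracking document of what high-risk data you have and where it goes. Purely as a hypothetical sketch (there is no standard schema here; these field names are invented for illustration), an entry in such a register might capture fields like:

```python
from dataclasses import dataclass, field

# Hypothetical DPIA register entry; the field names are illustrative,
# not drawn from any official template or regulator guidance.
@dataclass
class DPIARecord:
    processing_activity: str   # what processing is happening
    data_categories: list      # e.g. ["health", "union membership"]
    lawful_basis: str          # e.g. "legitimate interest"
    risks: list = field(default_factory=list)        # identified risks to data subjects
    mitigations: list = field(default_factory=list)  # measures addressing each risk
    last_reviewed: str = ""    # DPIAs are living documents: reviewed regularly

# One entry in the register, before any risks have been assessed:
record = DPIARecord(
    processing_activity="EU employee payroll",
    data_categories=["payroll", "bank details"],
    lawful_basis="legitimate interest",
)
```

The point of structuring it this way is that each record can be revisited as processing changes, which matches Sara’s description of a document that is never finished.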

GDPR Surprises for US Companies

IOS: Obviously, these are very tricky things and you definitely need an attorney to help you with it. So, can you comment on any other surprises U.S. companies might be facing with GDPR?

SJ: I think one of the most interesting points, whenever I was doing my research, to really drill down, from my knowledge level, is you’re allowed to process data so long as it’s compliant with a law. You know, there’s a legal necessity to do it.

And a lot of employers, U.S employers specifically, look at this and thought, “Great, that legal requirement takes the load off of me because I need, you know, payroll records to comply with the Fair Labor Standards Act and, you know, state wage laws. I need my immigration information to comply with the immigration control format.”

You know, they were like, “We have all these U.S. laws of why we have to retain information and why we have to collect it.” Those laws don’t count, and I think that’s a big shock when I say, well, those laws don’t count.

We can’t rely on U.S. laws to process EU data!

We can only rely on EU laws and that’s one thing that’s brought up and kind of coincides with Article 88, which I think is an interesting thing.

If you look at Article 88 when they’re talking about employee data, what Article 88 does is it actually allows member states to provide for more specific rules to ensure that the protections and the freedoms of their data are protected.

These member states may be adding on more laws and more rights than the GDPR already provides! Another thing is, not only do we have to comply with an EU law, but we also are going to have to comply with member states’ other specific laws that may be more narrow than the GDPR.

Employers can’t just look at the GDPR, they’re going to also have to look at if they know where a specific person is. Whether it’s Germany or Poland. They’re going to have to look and see what aspects of the GDPR are there and then what additional, more specific laws that member state may have also put into effect.

Interviewer: Right!

SJ: So, I think that there are two big legal issues hanging out there that U.S. multinational companies…

IOS: One thing that comes to my mind is that there are fines involved when not complying to this. And that includes, of course, doing these DPIAs.

SJ: The fines are significant. I think the easiest way to put it is that the fines are astronomical. I mean, they’re not fines that we’re used to seeing. There are two levels of fines depending on the violation, and they can be up to 4% of a company’s annual global turnover, or 20 million euros. If you look at it in U.S. dollar terms, you’re looking at, like, $23 million at this point.

For some companies that could be, that’s a game changer, that’s a company shut down. Some companies can withstand that, but some can’t. And I think any time you’re facing a $23 million penalty, the cost of compliance is probably going to outweigh the potential penalty.

Especially because these aren’t necessarily one-time penalties, and there’s nothing that’s going to stop the Data Protection Authority from coming back on you and reviewing again and assessing another penalty if you aren’t in compliance and you’ve already been fined once.

I think the issue is going to be how far the reach is going to be for U.S. companies. I think for U.S. companies that have, you know, brick-and-mortar operations in a specific member state, I think enforcement is going to be a lot easier for the DPA.

There’s going to be a greater disadvantage to, actually, enforcement for, you know, U.S. companies that only operate on U.S. soil.

Now, if they have employees that are located in the EU, I think that enforcement is going to be a little bit easier, but if they don’t and they’re merely just, you know, attracting business via their website or whatever to EU, I think enforcement is gonna be a little bit more difficult, so it’s going to be interesting to see how enforcement actually plays out.

IOS: Yeah, I think you’re referring to the territorial scope aspects of the GDPR. Which, yeah, I agree that’s kind of interesting.

SJ: I guess my parting advice is this isn’t something that’s easy, it’s something that you do need to speak to an attorney. If you think that it may cover you at all, it’s at least worth a conversation. And I’ve had a lot of those conversations that have lasted, you know, a half an hour, and we’ve been very easily able to determine that GDPR is not going to cover the U.S. entity.

And we don’t have to worry about it. And some we’ve been able to identify that the GDPR is going to touch very slightly and we’re taking eight steps, you know, with the website and, you know, with, you know, on site hard copy documents to make sure that proper consent and notice is given in those documents.

So, sometimes it’s not going be the earth-shattering compliance overhaul of a corporation that you think the GDPR may entail, but it’s worth a call with a GDPR attorney to at least find out so that you can at least sleep better at night because this is a significant regulation, it’s a significant piece of law, and it is going to touch a lot of U.S. operations.

IOS: Right. Well, I want to thank you for talking about this somewhat overlooked area of the GDPR.

SJ: Thank you for having me.

What Experts Are Saying About GDPR

You did get the memo that GDPR goes into effect next month?

Good! This new EU regulation has a few nuances and uncertainties that will generate more questions than answers over the coming months. Fortunately, we’ve spoken to many attorneys with deep expertise in GDPR. To help you untangle GDPR, the IOS staff reviewed the old transcripts of our conversations, and pulled out a few nuggets that we think will help you get ready.

Does the GDPR cover US businesses? Is the 72-hour breach notification rule strict? Do you need a DPO?  We have the answers below!  If you have more time, listen to our podcasts for deeper insights.

Privacy By Design Raised the Bar

Inside Out Security: Tell us about GDPR, and its implications on Privacy by Design.

Dr. Ann Cavoukian: For the first time, right now the EU has the General Data Protection Regulation, which passed for the first time ever. It has the words, the actual words, “Privacy by Design” and “Privacy as the default” in the statute.

What I tell people everywhere that I go to speak is that if you follow the principles of Privacy by Design, which in itself raised the bar dramatically from most legislation, you will virtually be assured of complying with your regulations, whatever jurisdiction you’re in.

Because you’re following the highest level of protection. So that’s another attractive feature about Privacy by Design is it offers such a high level of protection that you’re virtually assured of regulatory compliance, whatever jurisdiction you’re in.

 

Leave a review for our podcast & we'll send you a pack of infosec cards.


US Businesses Also Need To Prepare for GDPR

Inside Out Security: What are some of the concerns you’re hearing from your clients on GDPR?

Sue Foster: When I speak to my U.S. clients, if they’re a non-resident company that promotes goods or services in the EU, including free services like a free app, for example, they’ll be subject to the GDPR. That’s very clear.

Also, if a non-resident company is monitoring the behavior of people who are located in the EU, including tracking and profiling people based on their internet or device usage, or making automated decisions about people based on their personal data, the company is subject to the GDPR.

 



Is the 72-hour rule as strict as it sounds?

Inside Out Security: What we’re hearing from our customers is that the 72-hour breach rule for reporting is a concern. And our customers are confused, and after looking at some of the fine print, we are as well! So I’m wondering if you could explain the breach reporting in terms of thresholds: what needs to happen before a report is made to the DPAs and consumers?

Sue Foster: So you have to report the breach to the Data Protection Authority as soon as possible, and where feasible, no later than 72 hours after becoming aware of the breach.

How do I know if a breach is likely to ‘result in a risk to the rights and freedoms of natural persons’?

There is actually a document you can look at to tell you what these rights and freedoms are. But you can think of it basically in common sense terms. Are the person’s privacy rights affected, are their rights and the integrity of their communications affected, or is their property affected?

If you decide that you’re not going to report after you go through this full analysis and the DPA disagrees with you, now you’re running the risk of a fine of up to 2% of the group’s global turnover … or gross revenue around the world.

But for now, and I think for the foreseeable future, it’s going to be about showing your work, making sure you’ve engaged, and that you’ve documented your engagement, so that if something does go wrong, at least you can show what you did.

 



What To Do When You Discover A Breach

Inside Out Security: What are some of the most important things you would do when you discover a breach? I mean, if you could prioritize it in any way. How would you advise a customer about how to have a breach response program in a GDPR context?

Sheila FitzPatrick: Yeah. Well first and foremost, you do need to have in place, before a breach even occurs, an incident response team that’s not made up of just IT. Because normally organizations have an IT focus. You need to have a response team that includes IT and your chief privacy officer. And normally a CPO would sit in legal; if he doesn’t sit in legal, you want a legal representative in there as well. You need someone from PR, communications, who can actually be the public-facing voice for the company. You need to have someone from Finance and Risk Management who sits on there.

So the first thing to do is to make sure you have that group in place that goes into action immediately. Secondly, you need to determine what data has potentially been breached, even if it hasn’t. Because under GDPR, it’s not… previously it’s been if there’s definitely been a breach that can harm an individual. The definition is if it’s likely to affect an individual. That’s totally different than if the individual could be harmed. So you need to determine okay, what data has been breached, and does it impact an individual?

So, as opposed to if company-related information was breached, there’s a different process you go through. If individual employee or customer data has been breached, is it likely to affect the individual? So that’s pretty much anything. That’s a very broad definition. If someone gets a hold of their email address, yes, that could affect them. Someone could email them who is not authorized to email them.

So, you have to launch into that investigation right away and then classify any data that has had an intrusion into it: what is that data classified as?

Is it personal data?

Is it personal sensitive data?

And then rank it based on is it likely to affect an individual?

Is it likely to impact an individual? Is it likely to harm an individual?

So there could be three levels.

Based on that, what kind of notification? So if it’s likely to affect or impact an individual, you would have to let them know. If it’s likely to harm an individual, you absolutely have to let them know and the data protection authorities know.

 



Do we need to hire a DPO?

Inside Out Security: An organization must appoint a data protection officer (“DPO”) if, among other things, “the core activities” of the organization require “regular and systematic monitoring of data subjects on a large scale.”  Many Varonis customers are in the B2B space, where they do not directly market to consumers. Their customer lists are perhaps in the tens of thousands of recipients up to the lower six-figure range. First, does the GDPR apply to personal data collected from individuals in a B2B context? And second, when does data processing become sufficiently “large scale” to require the appointment of a DPO?

Bret Cohen and Sian Rudgard with Hogan Lovells: Yes, the GDPR applies to personal data collected from individuals in a B2B context (e.g., business contacts).  The GDPR’s DPO requirement, however, is not invoked through the maintenance of customer databases.

The DPO requirement is triggered when the core activities of an organization involve regular and systematic monitoring of data subjects on a large scale, or the core activities consist of large scale processing of special categories of data (which includes data relating to health, sex life or sexual orientation, racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, or biometric or genetic data).

“Monitoring” requires an ongoing tracking of the behaviors, personal characteristics, or movements of individuals, such that the controller can ascertain additional details about those individuals that it would not have known through the discrete collection of information.

Therefore, from what we understand of Varonis’ customers’ activities, it is unlikely that a DPO will be required, although this is another area on which we can expect to see guidance from the DPAs, particularly in the European Member States where having a DPO is an existing requirement (such as Germany).

Whether or not a company is required to appoint a DPO, if the company will be subject to the GDPR, it will still need to be able to comply with the “Accountability” record-keeping requirements of the Regulation and demonstrate how it meets the required standards. This will involve designating a responsible person or team to put in place and maintain appropriate policies and procedures, including data privacy training programs.

 

Top Podcast Episodes

Computational Biologist and Founder of Protocols.io, Lenny Teytelman (Part two)

We continue our conversation with Protocols.io founder Lenny Teytelman. In part two of our conversation, we learn more about his company and the use cases that made it possible. We also learn about the pros and cons of mindless data collection, what to do when data isn’t leading you in the right direction, and his experience as a scientist amassing enormous amounts of data.

 

Leave a review for our podcast & we'll send you a pack of infosec cards.


Geneticist and Founder of Protocols.io, Lenny Teytelman (Part one)

Reminder: it’s not “your data”.

It’s the patients’ data
It’s the taxpayers’ data
It’s the funder’s data
—————–
If you’re in industry or self-fund the research & don’t publish, then you have the right not to share your data. Otherwise, it’s not your data.
— Lenny Teytelman (@lteytelman) July 16, 2018

A few months ago, I came across Protocols.io founder Lenny Teytelman’s tweet on data ownership. Since we’re in the business of protecting data, I was curious what inspired Lenny to tweet out his value statement, and I wanted to learn how academics and science-based businesses approach data analysis and data ownership. We’re in for a real treat, because it’s rare that we get to hear what scientists think about data in the search for discoveries and innovations.

 



Data & Ethics Expert Dr. Gemma Galdon-Clavell: On the Breach of Trust (Part Two)

Dr. Gemma Galdon-Clavell is a leading expert on the legal, social, and ethical impact of data and data technologies. As founding partner of Eticas Research & Consulting, she traverses this world every day, working with innovators, businesses, and governments who are considering the ethical and societal ramifications of implementing new technology in our world.

We continue our discussion with Gemma. In this segment, she points out the significant contribution Volvo made when they opened their seat belt patent. Their aim was to build trust and security with drivers and passengers.

Gemma also reminds us to be mindful of the long-term drawbacks of a data breach or a trust issue – unfortunately, you’re going to lose credibility as well.

 



Gemma Galdon-Clavell: The Legal, Social, and Ethical Impact of Data and Data Technologies (Part One)

I wanted to better understand how to manage our moral and business dilemmas, so I enlisted data & ethics expert Dr. Gemma Galdon-Clavell to speak about her leadership in this space. As founding partner of Eticas Research & Consulting, she traverses this world every day, working with innovators, businesses, and governments who are considering the ethical and societal ramifications of implementing new technology in our world.

In the first part of our interview, Gemma explains why we get ethics fatigue. Unfortunately, those who want to improve our world are consistently told that they’re not doing enough. She also gives us great tips on creating products that have desirability, social acceptability, ethics, and good data management practices.

 



Allison F. Avery: Diversity and Inclusion

Data breaches keep happening, and information security professionals are in demand more than ever. Did you know that there is currently a shortage of one million infosec pros worldwide? But the solution to this “man-power” shortage may be right in front of and around us. Many believe we can find more qualified workers by investing in Diversity & Inclusion programs.

I wanted to learn more about the benefits of a D&I program, and especially how to create a successful one. So I called Allison F. Avery, Senior Organizational Development & Diversity Excellence Specialist at NYU Langone Medical Center, to get the details from a pro.

She clarified common misconceptions about Diversity & Inclusion (D&I) and offered a framework and methodology to implement D&I. She reminded me, “You should not be doing diversity for diversity’s sake.”

 



Allison F. Avery: How Infosec Can Implement Diversity & Inclusion Programs to Address Workforce Shortage and Make More Money Too

Creating a more diverse workplace isn’t about window dressing. It makes your company more profitable, notes Ed Lazowska, a Professor of Computer Science and Engineering at the University of Washington-Seattle. “Engineering (particularly of software) is a hugely creative endeavor. Greater diversity — more points of view — yields a better result.”

According to research from the Center for Talent Innovation, companies with a diverse management and workforce are 45 percent more likely to report growing market share, and 70 percent likelier to report that their companies captured a new market.

I wanted to learn more about the benefits of a D&I program, and especially how to create a successful one. So I called Allison F. Avery, Senior Organizational Development & Diversity Excellence Specialist at NYU Langone Medical Center, to get the details from a pro.

In part one of our interview, Ms. Avery sets the foundation for us by describing what a successful diversity & inclusion program looks like, explaining unconscious bias and her thoughts on hiring based on one’s social network.

 



Cyber & Tech Attorney Camille Stewart: Discerning One’s Appetite for Risk (Part 2)

We continue our conversation with cyber and tech attorney Camille Stewart on discerning one’s appetite for risk. In other words, how much information are you willing to share online in exchange for something free?

It’s a loaded question, and Camille takes us through the questions one should ask before taking a fun quiz or survey online. As always, there are no easy answers or shortcuts to achieving the state of privacy-savvy nirvana.

There’s also risk in mapping laws made for the physical world onto cyberspace. Camille warns that if we draw comparisons because a connection appears similar at face value but in reality isn’t, we may set ourselves up to truly stifle innovation.

 



Cyber & Tech Attorney Camille Stewart: The Tension Between Law and Tech (Part 1)

Many want the law to keep pace with technology, but what’s taking so long?

A simple search online and you’ll find a multitude of reasons why the law is slow to catch up with technology – lawyers are risk averse, and the legal world is intentionally slow and a late adopter of technology. Can this all be true? Or is it simply hearsay?

I wanted to hear from an expert who has experience in the private and public sector. That’s why I sought out the expertise of Camille Stewart, a cyber and technology attorney.

In part one of our interview, we talk about the tension between law and tech. And as it turns out, laws are built in the same way a lot of technologies are built: in the form of a framework. That way, it leaves room and flexibility so that technology can continue to evolve.

 



Attorney Sara Jodka on GDPR and Employee Data, Part II

Sara Jodka is an attorney at Columbus-based Dickinson Wright. Her practice covers both data privacy and employment law, putting her in a perfect position to help US companies understand how the EU General Data Protection Regulation (GDPR) handles HR data. In the second part of our interview, Sara talks about the relationship between HR data and Data Protection Impact Assessments (DPIAs). Most companies will likely have to take the extra step and perform these DPIAs, and there are specific triggers that Sara delves into.

 



Attorney Sara Jodka on GDPR and Employee Data, Part I

Sara Jodka is an attorney at Columbus-based Dickinson Wright. Her practice covers both data privacy and employment law, putting her in a perfect position to help US companies understand how the EU General Data Protection Regulation (GDPR) handles HR data. In this first part of the interview, we learn from Sara that some US companies will be in for a surprise when they discover that all the GDPR security rules apply to internal employee records. The GDPR’s consent requirements, though, are especially tricky for employees.

 



Varonis CFO & COO Guy Melamed: Preventing Data Breaches and Reducing Risk, Part Two

In part two of my interview with Varonis CFO & COO Guy Melamed, we get into the specifics with data breaches, breach notification and the stock price.

What’s clear from our conversation is that you can no longer ignore the risks of a potential breach. There are many ways to reduce risk; if you choose not to take action, at least have a conversation about it.

Also, around 5:11, I asked a question about IT pros who might need some help getting budget. There’s a story that might help.

 



Varonis CFO & COO Guy Melamed: Preventing Data Breaches and Reducing Risk, Part One

Recently, the SEC issued guidance on cybersecurity disclosures, asking public companies to report data security risks and incidents that have a “material impact” – information that reasonable investors would want to know about.

How does the latest guidance impact a CFO’s responsibility in preventing data breaches? Luckily, I was able to speak with Varonis’ CFO and COO Guy Melamed on his perspective.

In part one of my interview with Guy, we discuss the role a CFO has in preventing insider threats and cyberattacks and why companies might not take action until they see how vulnerable they are with their own data.

An interview well worth your time: by the end of the podcast, you’ll have a better understanding of what IT pros, finance, legal and HR have on their minds.

 



Dr. Wolter Pieters on Information Ethics, Part Two

In part two of my interview with Delft University of Technology’s assistant professor of cyber risk, Dr. Wolter Pieters, we continue our discussion on transparency versus secrecy in security.

We also cover ways organizations can present themselves as trustworthy. How? Be very clear about managing expectations. Declare your principles so that end users can trust that you’ll execute on the principles you advocate. Lastly, have a plan for what to do when something goes wrong.

And of course there’s a caveat: Wolter reminds us that there’s also a very important place in this world for ethical hackers. Why? Not all security issues can be solved during the design stage.

 



Dr. Wolter Pieters on Information Ethics, Part One

In part one of my interview with Delft University of Technology’s assistant professor of cyber risk, Dr. Wolter Pieters, we learn about the fundamentals of ethics as they relate to new technology, starting with the trolley problem. A thought experiment on ethics, it’s an important lesson in the world of self-driving cars and the course of action the computer on wheels would have to take when faced with potentially life-threatening consequences.

Wolter also takes us through a train of thought on the potential for power imbalances when some stakeholders have far more access to information than others. That led us to wonder: is technology morally neutral? Where and when does one’s duty to prevent misuse begin and end?

 



Privacy Attorney Tiffany Li and AI Memory, Part II

Tiffany C. Li is an attorney and Resident Fellow at Yale Law School’s Information Society Project. She frequently writes and speaks on the privacy implications of artificial intelligence, virtual reality, and other technologies. Our discussion is based on her recent paper on the difficulties with getting AI to forget. In this second part, we continue our discussion of GDPR and privacy, and then explore some cutting edge areas of law and technology. Can AI algorithms own their creative efforts? Listen and learn.

 



Privacy Attorney Tiffany Li and AI Memory, Part I

Tiffany C. Li is an attorney and Resident Fellow at Yale Law School’s Information Society Project. She frequently writes and speaks on the privacy implications of artificial intelligence, virtual reality, and other technologies. Our discussion is based on her recent paper on the difficulties with getting AI to forget. In this first part, we talk about the GDPR’s “right to be forgotten” rule and the gap between technology and the law.

 



Rita Gurevich, CEO of SPHERE Technology Solutions

Long before cybersecurity and data breaches became mainstream, founder and CEO of SPHERE Technology Solutions Rita Gurevich built a thriving business on the premise of helping organizations secure their most sensitive data from within, instead of securing the perimeter from outside attackers.

And because of her multi-faceted experience interacting with the C-Suite, technology vendors, and others in the business community, we thought listening to her singular perspective would be well worth our time.

What stood out in our podcast interview? While others worry about limited security budgets, Gurevich envisions more hands on deck in the field of information security. The reason: more and varied threats, an oversaturated vendor marketplace, and a cybersecurity workforce shortage.

“What I see happening is that there’s going to be subject matter CISOs across the company; where there will be many people with that title that become experts in very specific domains.”

Also, while cybersecurity concerns are no longer industry specific, Gurevich recognizes that certain industries are more at risk than others.

She approaches each industry’s varying degrees of risk and threats, compliance requirements, and disparate systems strategically – by giving organizations visibility into their data and systems, what they need to protect, and how to protect it.

 



Penetration Testers Sanjiv Kawa and Tom Porter

While some regard infosec as compliance rather than security, veteran pentesters Sanjiv Kawa and Tom Porter believe otherwise. They have deep expertise in large enterprise networks, exploit development, and defensive analytics, and I was lucky enough to speak with them about the fascinating world of pentesting.

In our podcast interview, we learned what a pentesting engagement entails, how to assign budget to risk, the importance of asset identification, and much more.

 



Dr. Tyrone Grandison on Data, Privacy and Security

Dr. Tyrone Grandison has done it all. He is an author, professor, mentor, board member, and a former White House Presidential Innovation Fellow. He has held various positions in the C-Suite, including his most recent role as Chief Information Officer at the Institute of Health Metrics and Evaluation, an independent health research center that provides metrics on the world’s most important health problems.

In our interview, Tyrone shares what it’s like to lead a team of forty highly skilled technologists who provide tools, infrastructure, and technology to enable researchers to develop statistical models, visualizations and reports. He also describes his adventures wrangling petabytes of data, the promise and peril of our data economy, and what board members need to know about cybersecurity.

 



Dr. Zinaida Benenson and Phishing, Part II

Dr. Zinaida Benenson is a researcher at the University of Erlangen-Nuremberg, where she heads the “Human Factors in Security and Privacy” group. She and her colleagues conducted a fascinating study into why people click on what appears to be obvious email spam. In the second part of our interview, Benenson offers very practical advice on dealing with employee phishing and also discusses some of the consequences of IoT hacking.

 



Dr. Zinaida Benenson and Phishing, Part I

Zinaida Benenson is a researcher at the University of Erlangen-Nuremberg, where she heads the “Human Factors in Security and Privacy” group. She and her colleagues conducted a fascinating study into why people click on what appears to be obvious email spam. In the first part of our interview with Benenson, we discuss how she collected her results, and why curiosity seems to override security concerns when dealing with phish mail.

 



Roxy Dee, Threat Intelligence Engineer

Some of you might be familiar with Roxy Dee’s infosec book giveaways. Others might have met her recently at Defcon as she shared with infosec n00bs practical career advice. But aside from all the free books and advice, she also has an inspiring personal and professional story to share.

In our interview, I learned that she had a budding interest in security but lacked the funds to pursue her passion. How did she work around this financial constraint? Free videos and notes with Professor Messer! What’s more, she thrived in her first post providing tech support for Verizon Fios. With grit, discipline and volunteering at BSides, she eventually landed an entry-level position as a network security analyst.

Now she works as a threat intelligence engineer, and in her spare time she writes how-tos and shares sage advice on her Medium account, @theroxyd.

 



Attorney and GDPR Expert Sue Foster, Part 2

Sue Foster is a London-based partner at Mintz Levin. In the second part of the interview, she discusses an interesting loophole in the GDPR’s ransomware breach reporting requirements. However, there’s another EU regulation going into effect in May of 2018, the NIS Directive, which would make ransomware reportable. Foster also talks about the interesting implications of IoT devices in terms of the GDPR. Is the data collected by your internet-connected refrigerator or coffee pot considered personal data under the GDPR? Foster says it is!

 



Attorney and GDPR Expert Sue Foster, Part 1

Sue Foster is a London-based partner at Mintz Levin. She has a gift for explaining the subtleties in the EU General Data Protection Regulation (GDPR). In this first part of the interview, she discusses how US companies can get caught up in either the GDPR’s extraterritoriality rule or the e-Privacy Directive’s new language on embedded communication. She also decodes the new breach notification rules, and when you need to report to the DPA and consumers. Privacy and IT security pros should find her discussion particularly relevant.

 



Troy Hunt and Lessons from a Billion Breached Data Records

Troy Hunt is a web security guru, Microsoft Regional Director, and author whose security work has appeared in Forbes, Time Magazine and Mashable. He’s also the creator of “Have I been pwned?”, the free online service for breach monitoring and notifications.

In this podcast, we discuss the challenges of the industry, learn about his perspective on privacy and revisit his talk from RSA, Lessons from a Billion Breached Data Records as well as a more recent talk, The Responsibility of Disclosure: Playing Nice and Staying Out of Prison.

After the podcast, you might want to check out the free 7-part video course we developed with Troy on the new European General Data Protection Regulation that will be law on May 25, 2018 – changing the landscape of regulated data protection law and the way that companies collect personal data. Pro tip: GDPR will also impact companies outside the EU.

 



John P. Carlin: Emerging Threats (Part 4)

In this concluding post of John Carlin’s Lessons from the DOJ, we cover a few emerging threats: cyber as an entry point, hacking for hire and cybersecurity in the IoT era.

One of the most notable anecdotes is John’s description of how easy it was to find hacking-for-hire shops on the dark web. Reviews of the most usable usernames and passwords and the most destructive botnets are widely available to shoppers. Also, expect things to get worse before they get better: with the volume of IoT devices now on the market that were developed without security by design, we’ll need to find a way to mitigate the risks.

 



John P. Carlin: Ransomware & Insider Threat (Part 3)

We continue with our series with John Carlin, former Assistant Attorney General for the U.S. Department of Justice’s National Security Division. This week, we tackle ransomware and insider threat.

According to John, ransomware continues to grow, with no signs of slowing down. Not to mention, it is a vastly underreported problem. He also addressed the confusion on whether or not one should engage law enforcement or pay the ransom. And even though recently the focus has been on ransomware as an outside threat, let’s not forget insider threat because an insider can potentially do even more damage.

 



John P. Carlin: Economic Espionage & Weaponized Information (Part 2)

In part two of our series, John Carlin shared with us lessons on economic espionage and weaponized information.

As former Assistant Attorney General for the U.S. Department of Justice’s National Security Division, he described how nation state actors exfiltrated data from American companies, costing them hundreds of billions of dollars in losses and more than two million jobs.

He also reminded us how important it is for organizations to work with the government, taking us down memory lane with the Sony hack. He explained how destructive an attack can be when it uses soft targets, such as email, that do not require sophisticated techniques.

 



John P. Carlin: Lessons Learned from the DOJ (Part 1)

John P. Carlin, former Assistant Attorney General for the U.S. Department of Justice’s (DOJ) National Security Division, spent an afternoon sharing lessons learned from the DOJ.

And because the lessons have been so insightful, we’ll be rebroadcasting his talk as podcasts.

In part one of our series, John weaves in lessons learned from Ardit Ferizi, Hacktivists/Wikileaks, Russia, and the Syrian Electronic Army. He reminds us that the current threat landscape is no doubt complicated, requiring blended defenses, as well as the significance of collaboration between businesses and law enforcement.

John Carlin currently chairs Morrison & Foerster’s global risk and crisis management team.

 



Christina Morillo, Enterprise Information Security Expert

If you want to be an infosec guru, there are no shortcuts to the top. And enterprise information security expert, Christina Morillo knows exactly what that means.

When she worked at the help desk, she explained technical jargon to non-technical users. As a system administrator, Christina organized and managed AD, met compliance regulations, and completed entitlement reviews. Also, as a security architect, she developed a comprehensive enterprise information security program. And if you need someone to successfully manage an organization’s risk, Christina can do that as well.

In our interview, Christina Morillo revealed the technical certificates that helped jumpstart her infosec career, described work highlights, and shared her efforts in bringing a more accurate representation of women of color in tech through stock images.

 



Scout Brody, PhD: Design Thinking and IoT

By now, we’ve all seen the wildly popular internet of things devices flourish in pop culture, holding much promise and potential for improving our lives. One thing we haven’t seen is IoT devices that are not connected to the internet.

In our follow-up discussion, this is the vision Simply Secure’s executive director Scout Brody advocates, since current IoT devices don’t have a strong foundation in security.

She points out that we should consider whether putting a full internet stack on a new IoT device will actually help users, as well as the benefits of bringing design thinking to creating IoT devices.

 



Scout Brody, Ph.D. on Creating Security Systems Usable for All

With the spring just a few short weeks away, it’s a good time to clean the bedroom windows, dust off the ceiling fans, and discard old security notions that have been taking up valuable mind space.

What do you replace those security concepts with?

How about ones that treat security systems not as binary “on-off” concepts, but as a gentle gradient, and where user experiences developed by researchers create security products that actually, um, work. This new world is conceived by Scout Brody, executive director of Simply Secure, a nonprofit dedicated to leveraging user interface design to make security easier and more intuitive to use.

“UX design is a critical part of any system, including security systems that are only meant to be used by highly technical expert users,” according to Brody. “So if you have a system that helps monitor network traffic, if it’s not usable by the people who are designed to use it or it’s designed for, then it’s not actually going to help them do their jobs.”

In the first part of my interview with Scout Brody, we cover why security systems aren’t binary, the value of user interface designers, and how to cross pollinate user personas with threat models.

 



Professor Angela Sasse on the Economics of Security

In part two of my interview with Angela Sasse, Professor of Human-Centred Technology, she shared an engagement she had with British Telecom (BT).

The accountants at BT said that users were resetting passwords at a rate that overwhelmed the helpdesk’s resources, making the cost untenable. The security team believed that the employees were the problem, meanwhile Sasse and her team thought otherwise. She likened the problem of requiring users to remember their passwords to memory exercises. And with Sasse’s help, they worked together to change the security policy that worked for both the company and the user.

We also covered the complexities of choosing the right form of authentication (i.e. passwords, 2FA or biometrics?), the pros and cons of user training, and the importance of listening to your users.

 



Professor Angela Sasse FREng on Human-Centered Security

Lately, we’ve been hearing more from security experts who are urging IT pros to stop scapegoating users as the primary reason for not achieving security nirvana. After covering this controversy on a recent episode of the Inside Out Security Show, I thought it was worth having an in-depth conversation with an expert.

So, I contacted Angela Sasse, Professor of Human-Centred Technology in the Department of Computer Science at University College London, UK. Over the past 15 years, she has been researching the human-centered aspects of security, privacy, identity and trust. In 2015, for her innovative work, she was awarded a Fellowship of the Royal Academy of Engineering (FREng), recognizing her as one of the best and brightest engineers and technologists in the UK.

In part one of my interview with Professor Angela Sasse, we cover the challenges that CISOs have in managing risk while finding a way to understand what’s being asked of the user. And more importantly, why improving the usability of security can positively impact an organization’s profits.

 



Medical Privacy Expert Adam Tanner

Adam Tanner is the author of “Our Bodies, Our Data”, which tells the story of a hidden dark market in drug prescription and other medical data. In recent years hackers have been able to steal health data on a massive scale — remember Anthem? In this second part of our interview, we explore the implications of hacked medical data. If hackers get into data brokers’ drug databases and combine them with previously stolen medical insurance records, will they rule the world?

 



Medical Privacy Expert Adam Tanner

Adam Tanner is the author of “Our Bodies, Our Data”, which tells the story of a hidden dark market in drug prescription and other medical data. Adam explains how the sale of “anonymized” data is a multi-billion dollar business not covered by HIPAA rules. In this first part of our interview, we learn from Adam how the medical data brokers got started and why it’s legal.

 



More Ann Cavoukian: GDPR and Access Control

We continue our discussion with Dr. Ann Cavoukian. She is currently Executive Director of Ryerson University’s Privacy and Big Data Institute and is best known for her leadership in the development of Privacy by Design (PbD).

In this segment, Cavoukian tells us that once you’ve involved your customers in the decision making process, “You won’t believe the buy-in you will get under those conditions because then you’ve established trust and that you’re serious about their privacy.”

We also made time to cover General Data Protection Regulation (GDPR) as well as three things organizations can do to demonstrate that they are serious about privacy.

 



Dr. Ann Cavoukian: Privacy by Design

I recently had the chance to speak with former Ontario Information and Privacy Commissioner Dr. Ann Cavoukian about big data and privacy. Dr. Cavoukian is currently Executive Director of Ryerson University’s Privacy and Big Data Institute and is best known for her leadership in the development of Privacy by Design (PbD).

What’s more, she came up with PbD language that made its way into the GDPR, which will go into effect in 2018. First developed in the 1990s, PbD addresses the growing privacy concerns brought upon by big data and IoT devices.

Many worry about PbD’s interference with innovation and businesses, but that’s not the case.

When working with government agencies and organizations, Dr. Cavoukian’s singular approach is that big data and privacy can operate together seamlessly. At the core, her message is this: you can simultaneously collect data and protect customer privacy.

 



Password Expert Per Thorsheim on Biometrics and Keystroke Dynamics

Per explains, “Keystroke dynamics, researchers have been looking at this for many, many years. It’s still an evolving piece of science. But it’s being used in real life scenarios with banks. I know at least there’s one online training company in the US that’s already using keystroke dynamics to verify if the correct person is doing the online exam. What they do is measure how you type on a keyboard. And they measure the time between every single keystroke, when you are writing in password or a given sentence. And they also look for how long you keep a button pressed and a few other parameters.”
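The measurements Per describes can be sketched in a few lines of Python. This is a minimal illustration (the event data and function names are hypothetical, not from any real product) computing the two classic keystroke-dynamics features: dwell time (how long a key is held down) and flight time (the gap between releasing one key and pressing the next):

```python
# A minimal sketch of keystroke-dynamics feature extraction.
# Each event is a (key, press_ms, release_ms) tuple in typing order.

def timing_features(events):
    """Return (dwell times, flight times) in milliseconds."""
    # Dwell: how long each key stays pressed
    dwell = [release - press for _, press, release in events]
    # Flight: next key's press minus current key's release
    flight = [
        events[i + 1][1] - events[i][2]
        for i in range(len(events) - 1)
    ]
    return dwell, flight

# Hypothetical capture of a user typing "cat"
events = [("c", 0, 95), ("a", 160, 240), ("t", 310, 400)]
dwell, flight = timing_features(events)
# dwell  -> [95, 80, 90]
# flight -> [65, 70]
```

A real system would collect these vectors over many typing sessions and compare them statistically against a stored profile; the point here is only to show which raw timings get measured.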

What’s even more surprising is that it is possible to identify one’s gender using keystroke dynamics. Per says, “With 7, 8, 9 keystrokes, they would have a certainty in the area of 70% or more…and the more you type, if you go up to 10, 11, 12, 15 characters, they would have even more data to figure out if you were male or female.”

Those who don’t want to be profiled by their typing gait can try the Keyboard Privacy extension from Per Thorsheim and fellow infosec expert Paul Moore.

 



Password expert Per Thorsheim On Life After Two Factor Authentication

Based in Norway, Per Thorsheim is an independent security adviser for organizations and government. He is also the founder of PasswordsCon.org, a conference that’s all about passwords, PIN codes, and authentication. Launched in 2010, the conference gathers security professionals and academic researchers from around the world to better understand and improve security.

In part one of our conversation, Per explains – despite the risks – why we continue to use passwords, the difference between 2-factor authentication and 2-step verification, as well as the pros and cons of using OAuth.

Naturally the issue of privacy comes up when we discuss connected accounts with OAuth. So we also made time to cover Privacy by Design as well as the upcoming EU General Data Protection Regulation (GDPR).

 



Security Expert and “Hacked Again” Author Scott Schober

We continue our discussion with Scott. In this segment, he talks about the importance of layers of security to reduce the risks of an attack. Scott also points out that we should be careful about revealing personal information online. It’s a lesson he learned directly from legendary hacker Kevin Mitnick!

 



Security Expert and “Hacked Again” Author Scott Schober

Scott Schober wears many hats. He’s an inventor, a software engineer, and runs his own wireless security company. He’s also written Hacked Again, which tells of his long-running battle against cyber thieves. Scott has appeared on Bloomberg TV, Good Morning America, CNBC, and CNN.

In the first part of the interview, Scott tells us about some of his adventures in data security. He’s been a victim of fraudulent bank transfers and credit card transactions. He’s also aroused the wrath of cyber gangs, and his company’s site was a DDoS target. There are some great security tips here for both small businesses and consumers.

 



More Sheila FitzPatrick: Data Privacy and the Law

In the next part of our discussion, data privacy attorney Sheila FitzPatrick gets into the weeds and talks to us about her work in setting up Binding Corporate Rules (BCR) for multinational companies. These are actually the toughest rules of the road for data privacy and security.

What are BCRs?

They allow companies to internally transfer EU personal data to any of their locations in the world. The BCR agreement has to get approval from a lead national data protection authority (DPA) in the EU. FitzPatrick calls them a gold standard in compliance—they’re tough, comprehensive rules with a clear complaint process for data subjects.

Another wonky area of EU compliance law she has worked on is agreements for external data transfers between companies and third-party data processors. Note: it gets even trickier when dealing with cloud providers.

This is a fascinating discussion from a working data privacy lawyer.

And it’s great background for IT managers who need to keep up with the lawyerly jargon while working with privacy and legal officers in their company!

 



Sheila FitzPatrick and GDPR

We had a unique opportunity in talking with data privacy attorney Sheila FitzPatrick. She lives and breathes data security and is a recognized expert on EU and other international data protection laws. FitzPatrick has direct experience in working with and representing companies in front of EU data protection authorities (DPAs) and sits on various governmental data privacy advisory boards.

 



IoT Pen Tester Ken Munro: Probing Wireless Networks

We have more Ken Munro in this second part of our podcast. In this segment, Ken tells us how he probes wireless networks for weaknesses and some of the tools he uses.

One takeaway for me is that the PSKs, or passwords, for WiFi networks should be quite complex, probably at least 12 characters. Hackers can crack the hashes of low-entropy WiFi keys, which they scoop up with wireless scanners.
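A quick back-of-the-envelope calculation (my own illustration, not Ken's tooling) shows why that 12-character recommendation matters: the entropy of a randomly chosen key grows linearly with length, so each extra character multiplies an attacker's brute-force work.

```python
import math

def psk_entropy_bits(length, charset_size):
    """Bits of entropy for a key of `length` characters drawn
    uniformly at random from a charset of `charset_size` symbols."""
    return length * math.log2(charset_size)

# 8 random lowercase letters vs. 12 random mixed-case letters + digits
print(psk_entropy_bits(8, 26))   # ~37.6 bits: within reach of GPU crackers
print(psk_entropy_bits(12, 62))  # ~71.5 bits: far more expensive to crack
```

Note this assumes a truly random key; dictionary words or predictable patterns have far less effective entropy than the formula suggests, which is exactly what offline hash-cracking tools exploit.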

Ken also shares some thoughts on why consumer IoT devices will continue to be hackable. Keep in mind that his comments on security and better authentication carry over quite nicely to the enterprise world.

 



IoT Pen Tester Ken Munro: Security Holes

If you want to understand the ways of a pen tester, Ken Munro is a good person to listen to. An info security veteran for over 15 years and founder of UK-based Pen Test Partners, his work in hacking into consumer devices — particularly coffee makers — has earned lots of respect from vendors. He’s also been featured on the BBC News.

You quickly learn from Ken that pen testers, besides having amazing technical skills, are at heart excellent researchers.

They thoroughly read the device documentation and examine firmware and coding like a good QA tester. You begin to wonder why tech companies, particularly the ones making IoT gadgets, don’t run their devices past him first!

There is a reason.

According to Ken, when you’re a small company under pressure to get product out, especially IoT things, you end up sacrificing security. It’s just the current economics of startups. This approach may not have been a problem in the past, but in the age of hacker ecosystems and public tools such as wigle.net, you’re asking for trouble.

 



Chief Data Officer Richard Wendell: Skills to Cultivate

In this second podcast, Mr. Wendell continues where he left off last time.

He explains the skills you’ll need in order to be an effective Chief Data Officer and we learn more about MIT’s International Society of Chief Data Officers.

 



Richard Wendell: Information as an Asset

The emergence of Chief Data Officers (CDOs) demonstrates the growing recognition of information as an asset. In fact, Gartner says that 90% of large organizations will have a CDO by 2019.

To understand the CDO role more deeply, I turned to Richard Wendell.

I met Mr. Wendell last year at the Chief Data Officer Summit and thought his background and expertise would help us understand the critical role a CDO plays in managing an organization’s data.

 



Attorney and Data Scientist Bennett Borden: Find Insider Threats

In this second podcast, Bennett continues where he left off last time. Borden describes his work on developing algorithms to find insider threats based on analyzing content and metadata.

 



Attorney and Data Scientist Bennett Borden: Data Analysis Techniques

Once we heard Bennett Borden, a partner at the Washington law firm of DrinkerBiddle, speak at the CDO Summit about data science, privacy, and metadata, we knew we had to reengage him to continue the conversation.

His bio is quite interesting: in addition to being a litigator, he’s also a data scientist. He’s a sought-after speaker on legal tech issues. Bennett has written law journal articles about the application of machine learning and document analysis to e-discovery and other legal transactions.

In this first part in a series of podcasts, Bennett discusses the discovery process and how data analysis techniques came to be used by the legal world. His unique insights on the value of the file system as a knowledge asset as well as his perspective as an attorney made for a really interesting discussion.

 



Statistician Kaiser Fung: Accuracy of Algorithms

In part one of our interview with Kaiser, he taught us the importance of looking at the process behind a numerical finding.

We continue the conversation by discussing the accuracy of statistics and algorithms. With examples such as shoe recommendations and movie ratings, you’ll learn where algorithms fall short.

 



Statistician Kaiser Fung: Investigate The Process Behind A Numerical Finding

In the business world, many assume that actionable insights are found using an algorithm.

However, statistician Kaiser Fung disagrees. With degrees in engineering and statistics, and an MBA from Harvard, Fung believes that both algorithms and humans are needed, as the whole is greater than the sum of its parts.

Moreover, the worldview he suggests one should cultivate is numbersense. How? When presented with a numerical finding, go the extra mile and investigate the methodology, biases, and sources.

 



New Survey Reveals GDPR Readiness Gap


With just a few months left to go until the EU General Data Protection Regulation (GDPR) implementation deadline on May 25, 2018, we commissioned an independent survey exploring the readiness and attitudes of security professionals toward the upcoming standard.

The survey, Countdown to GDPR: Challenges and Concerns, which polled security professionals in the UK, Germany, France and U.S., highlights surprising GDPR readiness shortcomings, with more than half (57%) of professionals still concerned about compliance.

Findings include:

  • 56% think the right to erasure/”to be forgotten” poses the greatest challenge in meeting the GDPR, followed by implementing data protection by design.
  • 38% of respondents report that their organizations do not view compliance with GDPR by the deadline as a priority.
  • 74% believe that adhering to the GDPR will give them a competitive advantage over other organizations in their sector.