

[Podcast] Cyber & Tech Attorney Camille Stewart: Discerning One’s Appetite for Risk


Leave a review for our podcast & we'll send you a pack of infosec cards.

We continue our conversation with cyber and tech attorney Camille Stewart on discerning one’s appetite for risk. In other words, how much information are you willing to share online in exchange for something free?

It’s a loaded question, and Camille takes us through the questions one should ask before taking a fun quiz or survey online. As always, there are no easy answers or shortcuts to achieving privacy-savvy nirvana.

Also risky: connecting laws made for the physical world directly to cyberspace. Camille warns that if we start making comparisons because the connection appears similar at face value but in reality isn’t, we may set ourselves up to truly stifle innovation.

Choosing Convenience over Privacy

Camille Stewart

Hi, I’m Camille Stewart. I’m a cyber and technology attorney. I am currently at Deloitte working on cyber risk and innovation issues, so identifying emerging technologies for the firm to work with. Prior to that, I was a senior policy advisor at the Department of Homeland Security working on cyber, infrastructure, and foreign policy in the Office of Policy. I was an appointee in the Obama Administration. And then prior to that I was in-house at a cybersecurity company. So I’ve worked in both the public sector and the private sector on cyber issues.

Cindy Ng

Thanks, Camille. Can you talk a little bit about privacy conceptually? Everybody wants privacy, it seems like a good thing, but why aren’t people picking privacy over convenience? Convenience, yes, it’s easy but what about privacy is not getting through to people?

Camille Stewart

I don’t think people are looking at the long-term ramifications, right? I know very recently we had the genetic testing case that helped lead police to a killer, which is wonderful in that specific instance. But I doubt that anybody who sends in their genetic information, has it tested, and figures out their heritage has thought about how that data might be used otherwise, or has read the disclaimer that tells you how your data will be used: whether it’s for research, whether it will be used by the police, whether it will be used to create new things.

And if anybody remembers Henrietta Lacks, her cells were used to create all of these things that are very wonderful, but she never got any compensation for it. Not knowing how your information is used takes away all of your control, right? In a world where your data is commoditized and has a value, you should be in control of the value of your data. And whether it’s as simple as giving away our right to choose how and when we disburse our information, or a loss of privacy that leads to security implications, those things are important.

For example, you don’t care that there’s information pooled and aggregated about you from a number of different places, because you’ve posted it freely or because you traded it for a service that’s very convenient, until the moment you realize that because you took the quiz and let this information out, or because you didn’t care that your address was posted on a Spokeo-like site or somewhere else, all of the answers to your banking security questions are now easily searched on the internet and probably being aggregated by some random organization. So somebody could easily say, “Oh, what’s your mother’s maiden name? Okay. And what city do you live in? Okay. And what high school did you go to? Okay.”

And those are three pieces of information that maybe you didn’t post in the same place but you posted and didn’t care because you traded it for something or you posted it and you didn’t think it through and now they can aggregate it because you use those two things for everything and now someone has access to your bank account, they’ve got access to your email, they’ve got access to all of these things that are really important to you and your privacy has now translated into your security.
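The aggregation risk Camille describes is easy to demonstrate. Here is a minimal Python sketch (the person, sites, and data points are all invented for illustration) showing how three harmless-looking facts from different sources merge into one profile that answers common security questions:

```python
# Hypothetical data points scraped from three different public sources.
quiz_answers = {"alice@example.com": {"high_school": "Lincoln High"}}
people_search = {"alice@example.com": {"city": "Springfield"}}
social_post = {"alice@example.com": {"mothers_maiden_name": "Rivera"}}

def aggregate(*sources):
    """Merge per-person records from independent sources into one profile."""
    profiles = {}
    for source in sources:
        for person, fields in source.items():
            profiles.setdefault(person, {}).update(fields)
    return profiles

profiles = aggregate(quiz_answers, people_search, social_post)
# One merged record now answers all three common security questions.
print(profiles["alice@example.com"])
```

No single source here is sensitive on its own; the risk only appears once the records are joined on a common key.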

Cindy Ng

I was just talking to my coworkers about this, that it doesn’t come naturally to know not to answer these questions, because you can be online somewhere, let’s say as part of a community you trust, and answer these innocuous questions, and you won’t necessarily have the foresight to know that it’s gonna come back and hurt you. How did you come up with the reasoning behind, “Oh, I probably shouldn’t answer those questions?” Because you kinda have to be a little skillful and have a bit of foresight or some knowledge to even think in the way that you do.

Camille Stewart

No, you’re right, there is a level of savvy that has to happen for you to think that way and a level of, like you said, foresight or a level of reaction, right? Most people aren’t thinking that way because they knew it before it happened but now that the information’s out there, they’re taking action. And I think there are a lot of people who are neglecting that.

So just like organizations have to assess and decide on their appetite for risk, we as individuals have to do the same. And so if you are willing to take the risk because you think either, “They won’t look for me,” or, “I’m willing to take the hit because my bank will reimburse me,” or whatever decision you are making, I want you to be informed.

I’m not telling you what your risk calculus is, but I wanna encourage people to understand how information can be used, understand what they’re putting out there, and make decisions accordingly. So your answer to that might be, “Look, I don’t wanna give up Facebook quizzes or sharing information in a community that I trust on some social site, but what I will do is have a set of answers to those normal password-reset questions that are wrong, and only I know the fake answers that I’m using for them.”

So instead of your actual mother’s maiden name, you’re using something else and you’ve decided that that’s one of the ways that you will protect yourself because you really wanna still use these other tools and that might be the way you protect yourself. So I challenge people not to give up the things that they love, like I mean, I would assess whether or not certain things are worth the risk, right?

Like a quiz on Facebook that makes you provide data to an external third party when you’re not really sure how they’re using it: not likely worth it. But the quizzes where you can just kinda take them, that might be worth it. I mean, the answers you provide for those questions are still revealing about you, but maybe not in a way that’s super impactful. Maybe in a way that’s likely just for marketing, and if you’re okay with that, then take that risk, or your calculus goes the other way.
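One practical version of the fake-answer tactic Camille describes is to generate random answers and keep them somewhere safe, such as a password manager. A minimal sketch, assuming the questions below are just examples:

```python
import secrets
import string

def fake_answer(length=16):
    """Generate a random answer no attacker can look up about you."""
    alphabet = string.ascii_lowercase + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Hypothetical security questions; store the fakes in a password manager.
questions = ["mother's maiden name", "city you live in", "high school"]
vault = {q: fake_answer() for q in questions}
for question, answer in vault.items():
    print(f"{question}: {answer}")
```

Using `secrets` rather than `random` matters here: the answers are effectively secondary passwords, so they should be cryptographically unguessable.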

Artificial Intelligence and Legal Protections

Cindy Ng

I wanna talk about an article that an attorney, Tiffany Li, wrote about how AI will someday eclipse human intelligence and whether AI will have legal protections. She juxtaposed it with the case where a monkey took a photographer’s camera and took a selfie, and there was a lawsuit over whether the monkey’s case could serve as precedent for future cases such as AI. Recently, the monkey lost the lawsuit. Not the monkey, actually, but PETA. I just wanna hear from your perspective, as a lawyer, how to think about it moving forward.

Camille Stewart

I mean, it remains to be seen how things like AI will translate, especially in creative spaces. It will be hard to determine ownership if a machine creates a work. And it will come down to a final decision. We’ll have to decide about things that are created by a machine and solely by a machine, right? If there’s human input we might make one decision, versus if it’s solely created by a machine, we might say that it is in the public sphere, anybody can use it, and it doesn’t carry any kind of attributable protection.

Versus if there is human input, we would decide that that is something they can then own the product of, right, because they contributed to the making of whatever the end product is. It’s hard to speculate, but there will have to be a line drawn, and it’s likely somewhere in there, right? In the sense that there is enough human interjection, whether that comes from input into whatever creative process is happening on the machine, or from the creation of the process or program or software that is being used and then spits out some creation at the end. There will have to be a law, or I guess at least case law, that kinda dictates where that line is drawn.

But those will be the fun things, right? For Tiffany, and other lawyers like myself, I think what we enjoy most about this space is that that stuff is unclear. And as these things roll out, you get to make connections between the monkey case and AI, and between other things that have already happened and new processes, new tech, new innovations, and try to help draw those lines.

Cindy Ng

Is there anything we need to look out for that we’re not aware of? Or certain connections that are sorta in the legal space that people in the tech space aren’t aware of?

Camille Stewart

So I was gonna say, I don’t actually think it is safe, on a broad scale and without some level of assessment, to connect laws made for the physical world to cyberspace. I think it’s dangerous, because usually they’re not one for one. It is the place where most people start, because it’s the easiest proposition to compare something that we’ve seen before with something in cyber. But they don’t always compare, or don’t always compare in the way that we would think they would.

And so it’s dangerous to make those comparisons without some level of assessment. And so I would tell people to challenge those assessments when you hear them and try to poke holes in them, because bad facts make for bad law. And if we take the easy route and just start making comparisons because on their face they seem similar, we may set ourselves up to truly stifle innovation, which is exactly what we’re trying to prevent.

Cindy Ng

Can you provide us with an example of why it’s dangerous, because it feels like the natural thing to do?

Camille Stewart

No, you’re right, it does feel natural. I’m trying to think of something…I’m thinking more along the lines of likening something physical to something cyber. So let’s think about borders, right? Borders in a physical sense are very clear limitations of authority and operation. You can’t cross a physical border without a passport, a visa, things like that, and a country can control physical entry and exit at its border.

That is not the same in cyberspace. And to liken the two in the way that you make rules is not smart, right? Your first inclination is to wanna try to stop data flow at the edge of a country, at the edge of some imaginary border, but that is not realistic, because the internet by its very nature is global and interconnected and, you know, traverses the world freely, and you can’t really stop things at that line. Which is why things like GDPR are important for organizations across the world: as a company with a global reach, because you’re on the internet, you will be affected by how laws are created in different localities.

So that’s a very big example, but it happens in more discrete ways too when it comes to technology, cyberspace, and physical laws, or the physical space and the laws that operate in it. And so I would challenge people: when you hear people make a one-for-one connection very easily, without some level of assessment, question it, to make sure it really is the best way to adapt things to the given situation.

Take, for example, Tiffany’s likening of AI to the monkey case. It’s an easy connection to make, because in your head you think, “Well, the monkey is not human, it made a thing, and if it can’t own the thing, then when a machine makes a thing online, it can’t own the thing.” But it very well may not be the same analysis that needs to be made in that setting, right? The lines may be drawn very differently, because none of us could create a monkey. So if I can’t create a monkey, then it’s harder to control the output of that monkey. But I could very well create a machine that could then create an output, and shouldn’t I be the owner of that output if I created the machine that then created the output?

But that was my point: when you liken things that on their face seem the same, the lines therein might be drawn differently, or they just might be different altogether, because cyberspace and the physical space are not one for one.


[Podcast] Cyber & Tech Attorney Camille Stewart: The Tension Between Law and Tech



Many want the law to keep pace with technology, but what’s taking so long?

A simple search online and you’ll find a multitude of reasons why the law is slow to catch up with technology: lawyers are risk averse, the legal world is intentionally slow, and lawyers are late adopters of technology. Can this all be true? Or is it simply hearsay?

I wanted to hear from an expert who has experience in the private and public sector. That’s why I sought out the expertise of Camille Stewart, a cyber and technology attorney.

In part one of our interview, we talk about the tension between law and tech. And as it turns out, laws are built in the same way a lot of technologies are built: in the form of a framework. That way, it leaves room and flexibility so that technology can continue to evolve.

Frameworks Reign in Law and Tech

Camille Stewart

Hi, I am Camille Stewart. I’m a cyber and technology attorney. I’m currently at Deloitte working on cyber risk and innovation issues, so identifying emerging technologies for the firm to work with. Prior to that, I was a Senior Policy Advisor at the Department of Homeland Security working on cyber infrastructure, and foreign policy in the office of policy. I was an appointee of the Obama administration. And then prior to that, I was in-house at a cybersecurity company. I worked in both the public sector and the private sector on cyber issues.

Cindy Ng

Today, we’re gonna be talking about the tension between law and technology, where a law takes a lot of time and inquiry to create something that makes sense and hopefully is impactful for years to come, whereas technology, it’s really about ideation and creating and bringing product and service to market as quickly as possible.

Tech people, they want law to catch up with technology. Lawyers wished tech people would understand the law a little bit more. And some have even criticized that the law doesn’t move as quickly as technology, and you have a lot of experience both as a cybersecurity attorney in Washington and in the private sector.

And I’m wondering if there’s a deeper divide between the two entities, and I’m wondering if you can share your experience with us in working with lawmakers as well as your experience in the private sector.

Camille Stewart

Yeah, so, I mean, I think one misconception is you don’t want the law to keep pace with innovation. There’s no way for you to legislate for future occurrences and for the ideation and innovation we’ve talked about.

You want the law to leave room and flexibility so that technology can continue to evolve. And so that’s kind of what has to happen. It’s frustrating that there are no legal recourses when an issue comes up, but you almost have to test those boundaries to figure out a framework that fits the bill to address issues that are coming.

So even the laws that we do build tend to be frameworks, because we need to leave room for that innovation and ideation. And part of the tension between technology communities and lawyers, and between technology communities and the general public or the government, is trust. Technologists don’t trust the government with the information that they have, and the government desperately wants to build that trust so that we can leverage the resources that are at the disposal of both.

You know, the government has a lot of insight and intelligence that they can layer over the tools and capabilities in the private sector, and if they came together, it’s great, but there’s this base level of trust and understanding of what each is trying to do that if we could bridge that gap, so much more could be done.

Cindy Ng

Is there a think tank or a non-profit or some kind of institution that can bridge that gap that you’ve seen develop over the past few years?

Camille Stewart

Yeah, so there are a number that are working on this, whether it’s issue-specific, right, “So let’s talk about surveillance and bringing people together around that.” “Let’s talk about a given issue and discuss that.” Also the government is trying that.

Organizations like DHS that work with the private sector quite a bit are trying to build those bridges and find ways to share information in a way that’s valuable to both the private sector and the government through things like AIS, the Automated Indicator Sharing system. And it’s gonna be a slow process.

That trust is built slowly.

The private sector has coalesced to build trust circles with peers, people they know doing work they understand, and they’re sharing information that way. And those mechanisms have become pretty robust and helpful, but the government has to be able to be a part of that for us to really complete the picture. That’s the work being done, some through non-profit organizations and NGOs, but also through the government and the private sector starting to get into a room together.

And then, as people move back and forth across lines, right, traditionally people were govies for life, or they were in the private sector. Now there’s more movement back and forth, and that’ll help build the trust as well.
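At its core, the automated indicator sharing Camille mentions (DHS’s AIS program) boils down to matching shared threat indicators against local telemetry. This toy sketch uses plain Python dictionaries and made-up documentation-range IPs; it is not the real AIS/STIX/TAXII format, only the general idea:

```python
# Shared indicators of compromise (here, just known-bad IPs).
# 203.0.113.x and 198.51.100.x are reserved documentation ranges.
shared_indicators = {"203.0.113.7", "198.51.100.23"}

# Hypothetical local log entries to check against the shared feed.
log_entries = [
    {"src_ip": "192.0.2.10", "action": "login"},
    {"src_ip": "203.0.113.7", "action": "file_read"},
]

# Any entry whose source IP appears in the shared feed is a hit.
hits = [entry for entry in log_entries if entry["src_ip"] in shared_indicators]
for entry in hits:
    print("indicator match:", entry)
```

The real value of programs like AIS is the feed itself: the matching logic is trivial, but no single organization sees enough attacks to build the indicator list alone.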

Bridging the Gap between Law and Tech

Cindy Ng

What would you say to lawyers who need to understand technology and technologists that need to understand the law?

Camille Stewart

I would say, at a base level, do the work to understand the content. Lawyers need to take the time to understand the technology, to ask the questions, to understand what the end goal is and what the technologist is building and for which end user. And the nice thing is that a lawyer is likely the end user of many of the products they’re seeking to understand, so they can easily understand that perspective. And then do the work to understand how we got there, how the technologists built that.

And then technologists, on the other hand, need to be willing to have those conversations and those explanations, and understand that with lawyering of the past there was the perception that lawyers were just gonna say no, right? They’re risk averse, they aren’t gonna let you ideate and innovate, they’re just gonna shut it down. And that’s not really true.

My job as a lawyer, and the jobs of lawyers at companies today, especially if they deal with technology and cyber issues, is to lay out the risk, understand the organization’s risk calculus, and put the information in front of leadership so that they can make an informed decision, and then help to build a path forward that accounts for those risks, that mitigates those risks to the best of their ability, and be ready to support the company in what they’ve done.

So, with that base level of understanding and the willingness to do the work, lawyers can be great assets to technologists, because they can be translators between different communities as the company builds out and understands what its risk posture is. It’s important to have all key stakeholders as part of that discussion, and lawyers are definitely part of that group.

Cindy Ng

So you talk about trust and doing your homework, having a baseline knowledge of the other’s concepts and principles. What have you seen in your work that has worked, where you’ve seen others reach across the aisle? Are you able to provide an example? And also, what doesn’t work?

Camille Stewart

I think the biggest catalyst for change is that things happen, right? So, a breach occurs, and you watch this organization scramble to figure out how to right itself after this big occurrence and realizing that the stakeholders that you were encouraged to have in the room initially were essential when this thing exploded.

And had you accounted for more perspective on the front end in a proactive way, it would have mitigated some of the risk on the back end or you would have been able to right yourself more quickly.

And so I think watching that occur has spurred a number of organizations, and produced a number of frameworks, to help organizations get the right people in the room, and has encouraged people to do the work to figure out where different players fall in the conversations they’re having as an organization about how security is evolving and how technology will be used and integrated in the organization. But I think that outside factors, in this era of law and cyberspace evolving, have done a lot of the work to encourage the collaboration that’s needed.



[Podcast] I’m Sean Campbell, Systems Engineer at Varonis, and This is How I Work



In April of 2013, after a short stint as a professional baseball player, Sean Campbell started working at Varonis as a Corporate Systems Engineer.

Currently a Systems Engineer for New York and New Jersey, he is responsible for uncovering and understanding the business requirements of both prospective and existing customers across a wide range of verticals. This involves many introductory presentations, proof of concept installations, integration expansion discussions, and even the technical development of Varonis channel partners. Sean also leads a team of subject matter experts (SMEs) for our innovative DatAlert platform.

According to his manager Ben Lui:

“Sean Campbell is one of the most talented engineers on my team. He is the regional DatAlert SME and bridged valuable feedback from both customers and the field back to product management. Sean is also an excellent team player and excels at identifying critical data exposure during customer engagements. Overall, Sean is a key contributor to the Varonis organization.”

The fast-paced environment, the challenge of data security, and the fact that the sales cycle is far from “cookie cutter” are what Sean enjoys most about his role here. He also values the relationships he has been able to build over the years on both the Varonis and customer side.

Read on to learn more about Sean – this time, in his own words.

What would people never guess you do in your role?

I’ve done a lot of interesting behind the scenes work – from creating new hire training materials to assisting with customer data breach investigations.

How has Varonis helped you in your career development?

I didn’t begin my career in sales, so my perspective on security was pretty narrow. Varonis has broadened that tremendously. I’ve developed the skills needed to tailor a conversation to different audiences whether it be a CISO, Cloud Admin, or a room full of other Sales Professionals. My technical skills have evolved as well, from basic Windows knowledge to more complex troubleshooting skills of the different platforms we support here. Pays a little better than minor league baseball!

What advice do you have for prospective candidates?

Humanize people, no matter the job title or status. Empathetic conversations begin and sustain smoothly that way. Be clear, be concise, be quick to listen and slow to speak! Something I’m always practicing.

What do you like most about the company? 

There is a maniacal yet focused approach with everyone here. We have crazy high standards for ourselves, but a culture of togetherness. You get things done and grow! I’m always excited to see what we are innovating next!

What’s the biggest data security problem your customers/prospects are faced with?

The elusive “starting point,” or where to begin, is a huge up-front challenge. Everyone has data to protect, and everyone typically has gaps, but, similar to the NFL, it’s all the same league, just different playbooks. A successful playbook might resemble this one.

What certificates do you have?

Does birth count? Kidding, I have a security certification exam coming up next year. Wish me luck!

What’s your all-time favorite movie or tv show?

Movie is The Sandlot and there are too many TV Shows I like to just pick one.

If you could choose any place in the world to live, where would it be and why?

Just give me warm weather year round with easy access for family and friends to visit.

My wife and I are good with that. I also wouldn’t mind a golf course within walking distance.

What is the first thing you would buy if you won the lottery?

If it’s the big one, get me Richard Branson’s island broker.

What is your fave time hack?

My Bulldog, Banks. He hacks a lot of my time.

What’s your favorite quote?

Inky Johnson said something at this year’s SKO that resonated well: “It’s very easy to be busy and accomplish nothing.” A good reminder to do things with purpose.

Interested in becoming Sean’s colleague? Check out our open positions, here!


Sean Campbell: Hi. My name is Sean Campbell and I’m currently a Systems Engineer for Strategic Accounts at Varonis, and this is how I work.

Cindy Ng: Thanks, Sean, for joining us today. How long have you been with Varonis?

Sean Campbell: I’ve been with Varonis, going on five years.

Cindy Ng: And what was your background prior to working at Varonis?

Sean Campbell: Before Varonis, I was actually playing professional baseball. I was a baseball player in college while majoring in Information Technology. After college, I’d gotten hurt my senior year, so I didn’t pursue a professional career right away. I did rehab while working in Jersey City as a Security Analyst, and then I left that job to pursue my career in professional baseball and signed a contract. And then after I was released from the team, I had a little gap where I was deciding whether or not to continue playing or to sharpen my resume and start looking for opportunities in security. In the midst of all that, I was actually a counselor at the Boys and Girls Club. So I had gotten in touch with Varonis while debating continuing to play, and I decided to move forward with Varonis, and I haven’t looked back.

Cindy Ng: Sounds great. What did you learn about yourself after working at Varonis?

Sean Campbell: Well, being personable comes pretty naturally. I’m also grateful for the technical development that I’ve experienced here and the growth I’ve had in that area. But one thing I’ve learned about myself working at Varonis is that I didn’t think I’d enjoy the challenge of a sales cycle as much as I do. My first job as a security analyst had no selling at all; I was exposed to security tools and worked with the team on cybersecurity strategy. Coming to Varonis as a vendor, it was a little bit of a learning process as to what the sales cycle entails. It’s full of highs, lows, and in-betweens, but the challenge brings me back. It brings me back to my days competing on the baseball diamond, and I think I’ve grown as a professional because of it.

Cindy Ng: What do you think are the biggest data security problems our prospects are faced with?

Sean Campbell: I think collectively speaking, understanding what data is sensitive, who’s responsible for it and what’s a non-disruptive way to protect it.

Cindy Ng: So, take us through a customer’s operational journey from start to finish that you think might be helpful for our listeners to understand the important work you do.

Sean Campbell: So, I’ve done hundreds of demos and installations and worked alongside our clients from a consultative perspective, but one that stands out in my mind was a media services agency. They shared with us that they’d recently secured a large client from a competitor. It was sort of a big deal, really highlighting their growth as an organization. And while working through the terms of that contract, they decided in parallel to heighten their data security controls, which included a Varonis data risk assessment. Within 30 days of that installation, we found over 200,000 social security numbers tied to client accounts and to current and former employees of the firm. And this was amongst over 900,000 folders open to global access groups like Everyone and Domain Users, sitting on file servers.
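Finding SSN-like strings at that scale is, at its core, pattern matching over file contents. This toy sketch shows the general approach; real classification engines like the one Sean describes validate far more than a single regex, so treat this only as an illustration:

```python
import re

# Matches the common NNN-NN-NNNN layout of a US social security number.
# Word boundaries keep it from firing inside longer digit runs.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def find_ssn_like(text):
    """Return every SSN-shaped substring found in the given text."""
    return SSN_RE.findall(text)

sample = "Employee record: 123-45-6789, phone 555-1234."
print(find_ssn_like(sample))  # ['123-45-6789']
```

A production scanner would also check the number against SSA issuance rules and weigh surrounding context, since a bare nine-digit pattern produces plenty of false positives.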

Cindy Ng: Why were these problems a problem?

Sean Campbell: I mean, this is one of the things that as an engineer I really take pride in understanding. Working with them to answer that question falls back to me.

So for this particular company, they manage advertising budgets and strategy for a who’s who list of some very high-profile clients. That includes current and unreleased product marketing materials, other product strategy, and very sensitive contract and financial information, among other forms of confidential data. And they are regularly audited by their clients to ensure that this information is handled properly. A blow to the trust of their clients because of a data breach or a failed audit would mean lost business revenue.

Cindy Ng: And did they know they had problems?

Sean Campbell: They admitted to always knowing they had poor visibility into unstructured data. It was a gap they were aware of. And they knew that was our strength. So walking them through the process of identifying real areas of risk was an eye opener. It helped us build credibility as a trusted advisor. They were receptive to our operational plan, which actually stood out against the other vendors they had considered up to that point. And this project got visibility up to the CTO, so the overall response was that it was something they needed to rectify moving forward.

Cindy Ng: And what were their use cases?

Sean Campbell: So a lot of the use cases really aligned to what we do, and it almost took the words out of my mouth in some instances. Some of the early things that I jotted down and confirmed with my champion: they needed to better understand who had access to client project and HR data, and they had no efficient way to map all permissions across their file shares. They also wanted to monitor administrative changes and to be able to track both authorized and potentially malicious changes. So that meant auditing all file touches and having a way to detect or alert on something worth noting. They also mentioned that they were aware they had sensitive data on their file servers. They knew that this data likely contained PII or other sensitive project information, so they needed to understand where it lived. So identifying where this data was, was another use case.

And then lastly, they were also prepared to begin the process of implementing a retention policy: archiving data that they no longer needed to store on their file servers, and eliminating risk by just chopping off low-hanging fruit in the form of unused or stale data.
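The stale-data use case above can be sketched in a few lines: walk a share and flag files whose last modification is older than a cutoff. This is only an illustration of the idea, not how Varonis implements it, and the one-year threshold is an arbitrary assumption:

```python
import os
import time

STALE_DAYS = 365  # hypothetical retention threshold

def stale_files(root, stale_days=STALE_DAYS):
    """Return paths under root not modified in the last stale_days days."""
    cutoff = time.time() - stale_days * 86400
    stale = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getmtime(path) < cutoff:
                    stale.append(path)
            except OSError:
                continue  # file vanished or unreadable; skip it
    return stale
```

In practice, last-access times and audit trails of actual file touches give a much truer picture of staleness than modification time alone, which is part of why auditing every file touch matters.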

Cindy Ng: There’s a lot of competition out there. How did you convince them and how did they know that our methodology is the right one?

Sean Campbell: As I got to know my champion, who had been a manager in IT at the time, he knew that their department was experiencing some growing pains. As they kept hitting new revenue marks year over year, they weren't really able to control, from a data perspective, how the data was growing and the rapid pace at which it was being created. And as the business was demanding new technologies to be rolled out, keeping up with the ability to secure those very technologies once they're live and in production was a challenge. So they had really no repeatable process, and especially as it pertains to data governance, it was a huge challenge. So they'd been relying on third parties to do things like pen testing and network upgrades, just to keep the lights on and keep productivity as efficient as they could.

Unstructured data, so information that's living on file servers, email, SharePoint, for example, this was an afterthought and they didn't really have the manpower to attack it. So our methodology really handled a lot of the heavy lifting for them in a single pane of glass. The solutions they invested in initially provided them a non-intrusive, non-destructive way of securing their most out-of-control asset, which was the data living on file servers.

Cindy Ng: Ah, so you essentially told them where their data was in order to have proper data governance, you need to know where it’s at.

Sean Campbell: You got that right.

Cindy Ng: So which products did they buy?

Sean Campbell: I think this was truly a platform sale. They purchased DatAdvantage for Windows, Directory Services, our Data Classification Engine, our Data Transport Engine, DatAlert Suite and Automation Engine. And this was a sweet spot for our customer base as well, a thousand plus users.

Cindy Ng: And are you able to describe what fixing their problems looked like?

Sean Campbell: So I can tell you that each one of those solutions worked in tandem to correct a very complex problem. With that visibility into where the sensitive data is stored and where it's at risk, they were actually able to leverage our professional services to put a plan in place to fix it. So I really teed it up for our services team to work with them to do things like locking down open shares, setting up reports, identifying stale data for archival, and alerting on inadvertent or suspicious anomalies with products like the DatAlert Suite.

Cindy Ng: And so are you able to say that you helped them achieve their data security goals? How did they know that they were progressing in a way that is helpful for them?

Sean Campbell: That’s a great question, and if promotion to Senior Vice President is any indication of my champion’s ability to present the business with a solution for securing their data, I think we gave them a lot of confidence with the solutions that they invested in. Because I didn’t have as much visibility into the account within the recent, I’d say past month or so because of the territory shift, I can’t give you actual metrics, but I’m confident that they cleaned up the open shares that we helped identify and they were able to determine where they have sensitive data, who has access to it, and who shouldn’t have access to it. I can also say that it was a very straightforward process to building out that game plan and making it a repeatable process and getting them to a more manageable state. And this began with simple reporting, showing off the ease of automation through solutions like Data Transport and Automation Engine, that Data Classification Engine, the out-the-box rules gave them less work to do, quickly identifying at-risk sensitive information. We were even able to highlight things like what’s your ransomware response readiness.

So, we even simulated ransomware to show the ability of our DatAlert Suite and how it can detect, alert on, and arrest that form of malware. That really drove the overall aspect of the sale in my opinion. Single pane of glass, the automation, giving them less work in the midst of everything else that they have going on.

Cindy Ng: Ah, so they weren’t even really looking for ransomware prevention, but as sort of a byproduct of getting all the different products, you were able to help them with prevention and you’ve been able to help them find other ways to help their infrastructure be even more secure.

Sean Campbell: You got that right. I would say, as that trusted advisor that I strive to be for a lot of my clients, preaching detect, prevent and sustain has been something that some of my customers have even adopted for their own teams. And that methodology that we helped walk them through as they're, I would say, rounding up that uncontrolled, unstructured data environment really puts that process in place. So when we do things like detection, it makes it easier now to set up things like prevention. And in the long term, you're just sustaining that. And helping them along the way makes that process, again, very manageable, and it gets the state of their environment to a manageable place as well.

Cindy Ng: So after you help our customers go through the operational journey, how do you feel from a professional and personal perspective?

Sean Campbell: That actually brings me back to the first or second question you asked me, in terms of what I learned about myself. That's one of the things that continues to drive me. I use those experiences in the next engagement that I encounter, and it really helps me find my own repeatable process that I know works. Now that's not to say it's easy every single time, but when we get to that state where the outcome is evident, that our operational plan, that our methodology works, it's really gratifying in that sense, and it builds that confidence as an engineer that Varonis is truly the only solution out there that can help organizations really manage and protect their unstructured data.

And as we start to get into areas of enrichment, I'm very confident that we can add even more value gathering things like perimeter insights and things like geolocation, for example, which is really gonna arm our clients to take a more secure approach to how they protect their data at the end of the day and keep that bottom line in the black.

Cindy Ng: You mentioned you played baseball prior to working at Varonis. Do you still get a chance to play for fun?

Sean Campbell: Oh, in my head. I haven't had the time in a while, I should say, but I should get back out there, and I'm always looking for opportunities. There're a few leagues that I've had my eyes on, I've been in touch, but I do keep tabs on some friends of mine that are still playing professionally, and I always keep in touch with some of my teammates, and just talking baseball kinda keeps me close to it.

Cindy Ng: Yeah, you even used "tee up" earlier when you were describing your work, so baseball is still in your roots. Do you have a favorite book?

Sean Campbell: There’s another interesting segue back to baseball. So my favorite was written by a guy by the name of Harvey Dorfman, it’s called “The Mental ABCs of Pitching.” It was a performance guide I’d read in college and I actually read it again in professional baseball. It was supposed to help propel me into a long career in the Major League, but ironically, a lot of the principles Harvey described, I still use in my career today. Slightly different paths but the same rules apply.

Cindy Ng: What were some of the principles that really helped guide you in your work?

Sean Campbell: So Harvey breaks the book down into different chapters, and the chapters simply have a keyword and he’ll describe that aspect of that keyword as it relates to pitching, so I was a pitcher. So for example, he’s got a chapter called Discipline, he’s got a chapter called Approach, Confidence, Self-esteem, Control. These are all things where he’ll break down the mental aspect of pitching and how it relates to that keyword for that particular chapter, as a matter of fact.

So I’ll give you example from the chapter on discipline, and he starts by just defining what discipline is and its training that’s expected to produce a specified character or pattern of behavior, especially that which is expected to produce moral or mental improvement. The interesting thing about pitching, it’s a very mental aspect of baseball. You’ve gotta have mental toughness, you gotta be precise, you have to trust what you’re doing, and how that relates back into being an engineer at Varonis, a lotta times, you know, your prospects or your customers, they can see right through simply being sold, you know, especially if you’re not confident with what you’re saying, if there is a low trust into what you’re trying to sell them on, especially if it’s just buy, buy, buy the product, figure everything else out later.

As an engineer, as I prepare, a lot of times I take the same approach that I used to have in baseball in my career as a sales engineer. That means reading up on the latest technologies, reading up on the latest threats and understanding how those threats augment the value we bring to the table, and knowing that when I walk into a room, I can identify a lot of the problems that they may be challenged by, listening to what they're saying, taking back that information and formulating a plan. And that same process used to be how I would develop as a pitcher. I'd understand the team that we're about to face, the strengths and weaknesses of those hitters, and how I would attack those hitters during that game that I was scheduled to pitch. So I knew if this particular person couldn't hit a curveball, guess what, he was gonna see a lot of curveballs.

If an organization has poor visibility into who’s touching their file share data, well, guess what, we’re gonna augment where that file share data is, we’re gonna turn on auditing and we’re gonna break down for them how can we easily report on and alert on any anomalies with this information.

Or if we’re struggling to meet a regulation, you know, I’m gonna understand what that regulation is and I’m gonna put a preparation, I’m gonna put a plan of preparation in place to say, here’s how we can help you better meet that regulation, for example. So the reminders in this book a lot of times just keep me on that path of how to properly approach being a sales engineer. So it’s been very interesting. I didn’t think it could relate in that way, but again, that goes back to I never expected to enjoy the sales cycle the way I do. Because it’s not simply just straight B2B sales, “Hey, here’s a product, here’s the SKU number, here’s how much it costs. That’ll do it.”

There’s a process to it of understanding what the use cases are, like you asked me. Understanding, you know, why are these problems problem, right? And then really augmenting or walking them through our operational plan, our methodology to show how we’re best positioned as a solution.


[Podcast] I’m Brian Vecci, Technical Evangelist at Varonis, and This is H...

[Podcast] I’m Brian Vecci, Technical Evangelist at Varonis, and This is How I Work


Leave a review for our podcast & we'll send you a pack of infosec cards.

If you’ve ever seen Technical Evangelist Brian Vecci present, his passion for Varonis is palpable. He makes presenting look effortless and easy, but as we all know, excellence requires a complete devotion to the craft. I recently spoke to him to gain insight into his work and to shed light on his process as a presenter.

“When I first started presenting for Varonis, I’d have the presentation open on one half of the screen and Evernote open on the other half and actually write out every word I was going to say for each slide,” said Brian.

From there, he improvises from the script.

“I’d often change things up while presenting based on people’s reactions or questions, but the process of actually writing everything out first made responding and reacting and changing the presentation a lot easier. I still do that, especially for new presentations.”

According to Varonis CMO David Gibson:

Brian’s high energy, curiosity, and multi-faceted skills – technical aptitude, communication skills, sales acumen, and organizational capabilities -make him an exceptional evangelist.

Read on to learn more about Brian – this time, in his own words.

What would people never guess you do in your role?

I’m really lucky that my role at Varonis lets me engage with people all over the company, including Marketing, Sales, Support, Engineering, and Product Management, so I’m not sure that there’s anything anyone would never guess about what I do.

When it comes to the more public aspects of what I do, like press, Connect events, and customer meetings, I spend more time drilling and practicing what I’m going to say so that when I’m on stage or in front of a camera, I can improvise off a script rather than trying to remember what I’m supposed to be talking about.

What did you learn about yourself after working at Varonis?

That I need to spend more time listening and less time talking. One of my first trips I made at Varonis was going to a few customer meetings in California and before I left David Gibson reminded me to “make the meeting about them,” meaning the people I was meeting with. It’s still something I’m working to get better at and have to consistently remind myself of.

How has Varonis helped you in your career development?

It would be hard to come up with ways that Varonis hasn’t helped me in my career.

I’ve become way more confident in front of audiences. I’ve gotten better at confidently talking about things I know well and I’ve gotten more comfortable with saying, “I don’t know.”

I was always in technical roles before coming to Varonis and sometimes it’s hard to admit that you don’t know something when it’s your job to.

What advice do you have for prospective candidates?

Varonis more than anywhere else I’ve ever worked rewards energy, enthusiasm, and hard work.

We’re much bigger than we were when I joined back in 2010, but there’s still so many things that we’re learning how to do well as a company.

The people who succeed here are the ones that do, fail, and get better.

What do you like most about the company? 

I admire how much of our leadership has been here for so long, and I think that’s reflective of everyone having the same goal.

It’s been rare in my career before coming to Varonis to feel like a part of an organization on a mission. That’s never been an issue here.

I know what it’s like to work somewhere where the leaders have no vision, let alone the ability to execute on it.

What’s the biggest data security problem your prospects are faced with?

When I first got here we were spending a lot of time just teaching our prospects that security on file systems was possible!

Making sure the right people had access to what they were supposed to was an impossible problem to solve for so many people for so long that we had to spend a lot of time just educating people that we understood the root of their problems and could actually fix them.

These days everyone seems to know it’s a problem and the biggest challenge our prospects face is knowing how to get there.

“I get what you (Varonis) do, but tell me how we can actually get there” is something I hear a lot. That’s probably because I spend a lot of time talking about our Operational Journey these days.

What certificates do you have?

I’ve got a CISSP, which is the only certification I ever put a lot of work into.

Fave book?

I love to read and have a bunch. I read The Count of Monte Cristo every few years, so that’s up there. Dune is another one that I try and read every now and then. Gateway by Frederik Pohl as well. The book that helped me most with my job is Working with Emotional Intelligence by Daniel Goleman.

What is your fave time hack?

Adding my flights and hotels to my wife’s Gmail calendar because what do you mean you didn’t know I was going to be in London this week?

What’s your favorite quote?

Decisions are made by those who show up. I’m not sure who to attribute it to, but the first person I remember saying it to me was my father.

Interested in becoming Brian’s colleague? Check out our open positions, here!

Brian Vecci

Hi, my name is Brian Vecci and I’m currently a technical evangelist at Varonis, and this is how I work.

Cindy Ng

Thanks, Brian, for joining us today. How long have you been with Varonis?

Brian Vecci

That’s an interesting question. I’ve been with Varonis since March of 2010. But as some or many people may know, I actually left for about 10 months before coming back. I’m in my second term at Varonis, and I’ve been here now for…in my second stint for about two and a half years. But when I introduce myself I say I’ve been here since 2010.

Cindy Ng

What was your background prior to joining Varonis?

Brian Vecci

I went to college and studied computer science and music. And I came out of college and immediately went to work as a web developer. So I was an engineer, and I spent time doing web and applications development. And I discovered that I'm generally better at talking about the kinds of things that I was doing and helping other people understand the technology that I was building than actually building the technology, which probably won't surprise anybody who knows me.

So I was an engineer, an applications developer then I moved into project management. I was a project manager for a while, a systems architect. And right before I came to Varonis, I was in desktop architecture for an investment bank. And before that I had done project management at a law firm and I’d been in a publishing company. So I’d kind of been in IT and IT applications and a few different roles and hopped around a few different industries before coming to Varonis.

Cindy Ng

And how did you know that Varonis was a good fit for you?

Brian Vecci

I knew immediately that Varonis was a good fit for me because I needed a job and they offered me a job. So the fact that I got a job offer was the first big clue. But really, I connected with an old manager of mine at a law firm, Chadbourne & Parke, one of the best managers that I'd ever had up until that point. He knew I was looking for a job and introduced me to a friend of his at another law firm, who had a friend who worked for this tiny startup company called Varonis, which was looking for someone to do what they were calling technical marketing, which is something that I'd never done before.

And so I interviewed with this guy, his name is David Gibson, and he was a former SE and was looking for someone technical, and I met him and we got along great. And then a couple of days later I met a guy named Mark Wilcox and we got along really well, and a couple of days later I sat in a windowless conference room in New York City with a couple of guys named Ken Spinner and Jim O'Boyle and a few others. About 30 minutes into that meeting I met a guy named Yaki Faitelson, and every single person that I met along the way was passionate and enthusiastic and super intelligent and seemed to work really hard and really believed really strongly in what they were doing. And I had no idea what we were doing at that point. I didn't really know what Varonis did. I had some kind of inkling.

So it was less the company itself and more the people that I was about to start working with that made me pretty confident that this was gonna be a good fit and it turned out to be right.

Cindy Ng

And what did you learn about yourself after working at Varonis?

Brian Vecci

That I need to spend way less time talking and way more time listening. It’s one of the first lessons that David tried to impart on me. I remember before in one of my first trips out to do some customer meetings, he said to me, “You know, Brian, you’ve got to always remember make the meeting about them not about you.” And anybody who knows me well will hear me say that out loud and laugh at me because they realize that’s still something that I struggle with sometimes.

But learning how to shut up and listen, have a little bit of empathy and think about the people that you’re talking to and what they care about was one of the hardest lessons for me to learn because it’s something that I’m not naturally good at but it’s something that stuck with me for eight years and something that I continue to work on. I think about it as something that I’m hopefully a little bit better at than I used to be and that I continue to improve on. And every time I’m mindful and focused on listening to others I find that I get better at what I do and feel better about what I do.

Cindy Ng

And when you go to a meeting, when you talk to them, what is the biggest data security problem your prospects are faced with?

Brian Vecci

Well, I spend a lot of time in meetings these days talking about our operational journey. And that means for the prospects that I'm talking to, the biggest data security problem they face is, they know they have a big problem. They know they have a ton of data.

They may know that some of it is sensitive, they may not. They may have some idea of where it is, and they may have some sense of the scale of the problem they're facing trying to help the right people have access to the right data. But the biggest problem they face is, "All right. We know we have these huge problems, we get it. How do we get there? How do we go from the state where everything is chaos to this vision that you're talking about, where only the right people have access to just what they're supposed to, everything's monitored, and when something goes wrong we know about it?"

So the biggest problem these days is just how to get there. It’s less about a specific technical problem and more about, “I don’t know what I need to do first, second, third or fourth,” which is really different like even when you and I started here. Like seven, eight years ago the biggest problem that we faced was that our prospects had no idea that they had these problems. We spent so much time just educating people first of all, unstructured data or data on file systems is important and it was exposed and they had no idea how big of a problem they had, let alone what they needed to do to fix it. That’s changed. These days most people know that they have a big problem, they just don’t know how to get there.

So what I’m finding is when I am talking to a prospect it’s because they wanna learn about, you know, what our operational journey looks like. Those are words that we use, but what it really means is, “I know I have big problems. I have a sense that you can help me. How can we actually get to the state that you’re talking about?” If that makes some sense

Cindy Ng

Yeah. Take us through an operational journey from start to finish that you think might be helpful for our listeners to understand the important work you do. Let’s start with verticals. Do verticals matter? Does this journey apply to every company?

Brian Vecci

I think the journey applies to every company because every company has data but that doesn’t mean that verticals don’t matter. Verticals do matter because the ways a bank thinks about their data because they’re so highly regulated, because they know they’ve got, for instance, customer information, that if it was exposed or leaked improperly could result in big fines, the kinds of things highly regulated industries think about when it comes to their data are a little bit different than, for instance, a media company or somebody who’s not as regulated.

Everybody’s got the same problems but the vertical can really dictate sometimes how a prospect thinks about or even talks about their data. That said, the operational journey, it’s pretty much the same. We don’t have to change what our journey looks like depending on the vertical. Everybody gets a lot of data, and if they’ve never worked with Varonis before I’m pretty sure they don’t really have a handle on what kind of data they have, meaning what sensitive and what’s not. They really don’t have a handle on where it all is.

They’re probably not monitoring how it’s used. There’s a sound bite that I use often, you can’t catch what you can’t see, and you can’t manage what you don’t monitor, which sounds trite but are absolutely true. It’s really difficult to make decisions about something when you know nothing about it and so many companies know nothing about their data.

So the journey starts with, and this is gonna sound kind of sales-y because we spend a lot of time building content for Salesforce to learn, but turning on the lights, just helping somebody understand, "Listen, here's where your data is. Here's who's got access to it. Here's what's sensitive, here's where it's exposed, and look, here's how it's being used." And when you do that, when you just start with that, you're often so much further ahead than you were before.

The journey then kind of moves on to not only understanding what you’ve got but fixing the biggest problems. When you turn on the lights you can start to prioritize and understand where you’re exposed and where you’re at risk.

One of the things that I talk a lot about, many of the presentations that I give is that risk is a pretty simple equation. It’s how valuable is something and how likely is it that something’s gonna go wrong with that asset or that data? So how valuable is our data? What’s the likelihood that it’s gonna get lost or stolen or misused? And our operational…a big part of our operational journey is helping our prospects to quantify that.

How many folders do you have that have sensitive data that are exposed to many people, that are exposed to global access groups? That’s easy for us to put numbers behind, very hard for someone to do without Varonis. But once you understand where you’re exposed, we call it prevent. We detect and then prevent, but preventing disaster means reducing exposure, making sure only the right people have access to what they’re supposed to, locking down sensitive data, getting rid of global access, and starting to figure out who this data belongs to so that you can get them involved in making decisions.
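Brian's framing of risk as "how valuable is something times how likely is it that something goes wrong" can be sketched as a toy calculation. Everything below is purely illustrative: the folder names, user counts, and weights are hypothetical, and this is not a Varonis formula or API.

```python
# Toy sketch of the risk framing described above:
# risk = value of the data x likelihood of loss or misuse.
# All names, counts, and weights here are hypothetical.

def risk_score(is_sensitive: bool, users_with_access: int, total_users: int) -> float:
    """Score a folder as value (sensitivity) times likelihood (exposure)."""
    value = 1.0 if is_sensitive else 0.2          # sensitive data is worth more
    likelihood = users_with_access / total_users  # wider access, higher chance of misuse
    return value * likelihood

folders = [
    {"path": "/finance/payroll",  "sensitive": True,  "access": 950},
    {"path": "/marketing/assets", "sensitive": False, "access": 950},
    {"path": "/hr/reviews",       "sensitive": True,  "access": 12},
]

TOTAL_USERS = 1000

# Rank folders so the most exposed sensitive data surfaces first.
ranked = sorted(
    folders,
    key=lambda f: risk_score(f["sensitive"], f["access"], TOTAL_USERS),
    reverse=True,
)
for f in ranked:
    print(f["path"], round(risk_score(f["sensitive"], f["access"], TOTAL_USERS), 3))
```

The point of the sketch is only that a sensitive folder open to nearly everyone outranks both a non-sensitive open folder and a sensitive but tightly locked-down one, which is the prioritization Brian describes.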

Finally, the last step of the journey is to automate things like entitlement reviews. Why should somebody at the helpdesk or somebody in security or somebody in IT be making regular decisions about who should and shouldn’t have access? It’s the data owners, it’s the people who understand and have real context that should be.

So automating entitlement reviews, automating authorization workflows, automating quarantining and retention and disposition, these are all kind of technical ways of saying, “Once you understand your data and you lock it down, you can start to treat it like you would anything else that’s valuable,” and Varonis can help you do that in an automated way so that you’re not going through endless projects for annual clean-ups and things like that, which is what we see our prospects either are doing or have done in the past in trying to solve some of these problems.

Cindy Ng

So how can you turn on the lights for our customers? How do they acknowledge their problem? Do they know that they have problems? How do they respond?

Brian Vecci

Customers, or prospects I should say, who we do risk assessments for are completely shocked by what we find. I hear a lot of stories of sales teams being kicked out of the room when somebody says, "You know what? We had no idea that this much sensitive data was this exposed, that you can't see this, like we could all get in a lot of trouble, you have to leave the room." So sometimes it's really surprising.

Other times, and this is becoming more common these days, a prospect will know that they have a big problem, but they didn't realize maybe the extent of it, or they've never seen it presented in such a comprehensive way. Our risk assessments are so valuable, and it's one of the reasons we talk about our evaluations or our proofs of concept as risk assessments these days, because that's really what they are.

We can go in and give somebody a pretty clear picture of what their environment looks like without a whole lot of work. We can tell them concretely, “Here’s how much data you have, here’s how much of it is sensitive and here’s how much of it is open. Here’s literally how much risk you’re facing right now and here’s how you can kind of fix all these problems.”

So, to answer it, I think your question is, “Do they know it’s a problem?” Sometimes they do, sometimes they don’t. Oftentimes they have no idea of the real scale of the problem or even if they do know they have a big problem it’s still eye-opening for us to do a risk assessment and show them really specifically exactly where the problems are and how they can actually fix them.

Cindy Ng

So after they kick you out and hopefully they bring you back in and that you try to convince them that our methodology is the right one to follow, how do you convince them that there’s so many solutions to a problem? Why is the Varonis way the right way?

Brian Vecci

I’m going to disagree with you that there’s so many solutions to a problem because this particular problem, especially when we’re talking about a data stores like file systems that are pretty chaotic, there aren’t a lot of solutions to that problem.

What we’re very fortunate in that Varonis has technology that’s unique. Nobody else does what we do the way that we do it. And I can speak from personal experience. Having spent some time at one of our competitors, nobody else does what we do the way that we do it. So when we can come in and present not just, “Hey, look, we showed you, you have a big problem, but we showed you you have a big problem and we have the technology to help you solve it, and we have the track record and experience to show you that we’re good at actually doing this.” Our methodology, it’s not pie in the sky, it’s not in theory. We’ve got more than 6250 as our last earnings call.

That’s a lot of customers who have used Varonis to actually solve some of these problems. So our methodology is based on experience and that carries a lot of weight. There’s lots of ways to solve this problem, it’s really, in our experience, there’s very, very few ways to solve this problem, and we’re fortunate enough that if you wanna solve it you need not only a methodology to do it, you need an approach, you need technology to enable that approach to actually work.

And I speak honestly from my experience: Varonis is the only way to do it. It's a lot of fun to work for a place where you can not only identify a big problem but help people solve it, and you're the only ones that can do it. We're in a really unique situation.

Cindy Ng

What do they initially buy when they decide that Varonis is the only way?

Brian Vecci

Everybody has Windows data or CIFS data, whether it's NAS or on Windows file servers. So most commonly it's DatAdvantage for Windows, because that's what gives you the ability to not only monitor everything but map all of the identities and all of the permissions. That's pretty critical to turning on the lights. Another big part of turning on the lights is understanding where sensitive data is, so data classification. And our Data Classification Engine is kind of a no-brainer. So that's a pretty common piece of that initial package.

And then the great thing about DatAlert and the DatAlert Suite is that it becomes more powerful the more ingredients, the more behavior streams or metadata streams as we call them, that you give it. The more information DatAlert has to analyze and alert on, the more valuable it is. So with DatAdvantage for Windows you're mapping permissions, you're monitoring Windows data and access activity for the users on that data. Data classification gives you some context on what's sensitive and what's not, which is really important.

And Directory Services allows you to monitor Active Directory too, and everybody has Active Directory. So those, I think, are the most common, but I wanna be careful about prescribing what our most common package is.

Cindy Ng

And then how do you quantify the improvement so that customers know that you’ve helped them and they wanna continue the journey with you?

Brian Vecci

It’s a really excellent question, and it’s a big part of our risk assessment: quantifying what their risk profile is. We quantify that by how much data you have, how much of that is sensitive, and how much of that is open. If you just track those things: How many folders do I have? How many of those folders are open to everybody, or open to lots of people? How many of those open folders also contain sensitive information?

If you take those numbers and start tracking them over time, you can watch the number of folders that are sensitive and open go down. You can watch the number of stale folders go down because you’re deleting or archiving them. You can track the number of users who are enabled but not active, users with passwords set never to expire, file system artifacts like orphaned SIDs or individual users on access control lists, and the issues we find in Active Directory. There are lots of really specific metrics that only we can measure, and I say only we because we’re the only ones with the ability to scan every single folder and subfolder and every single SharePoint site and sub-site, and we monitor every single data touch. We’re the only ones that can really do that, especially at scale.

We can start to put really specific metrics behind, “All right. Here’s what you’ve got. Here’s where you’re at risk, and here’s how you can measure the improvement over time.” And that’s what we show our prospect in a risk assessment, and hopefully, that’s what we’re tracking as they go through our operational journey.
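The metric tracking Brian describes amounts to comparing periodic snapshots of a few counts. As a rough illustration (the field names and numbers below are hypothetical, not actual Varonis output), a quarter-over-quarter risk-trend check might look like:

```python
# Hypothetical sketch of tracking file-system risk metrics over time.
# Field names and figures are illustrative, not real Varonis output.
from dataclasses import dataclass


@dataclass
class RiskSnapshot:
    date: str
    open_folders: int             # folders open to everyone or large groups
    open_sensitive_folders: int   # open folders that also hold sensitive data
    stale_folders: int            # folders not accessed in the review window
    inactive_enabled_users: int   # enabled accounts with no recent activity


def risk_improved(earlier: RiskSnapshot, later: RiskSnapshot) -> bool:
    """True if every tracked risk metric held steady or went down."""
    return all([
        later.open_folders <= earlier.open_folders,
        later.open_sensitive_folders <= earlier.open_sensitive_folders,
        later.stale_folders <= earlier.stale_folders,
        later.inactive_enabled_users <= earlier.inactive_enabled_users,
    ])


q1 = RiskSnapshot("2018-03-31", 12000, 1400, 30000, 85)
q2 = RiskSnapshot("2018-06-30", 9500, 900, 21000, 60)
print(risk_improved(q1, q2))  # True: every metric trended down
```

The point is less the code than the discipline: pick a handful of measurable risk counts and require each one to trend downward between reviews.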

Cindy Ng

And describe what utopia would look like in a company’s file system?

Brian Vecci

I would say, here’s what utopia looks like, and this is part of a lot of the presentations I give these days: What is the Varonis vision for how you can think about your data? It’s pretty straightforward. You know where all your sensitive data is, you can make sure that only the right people have access to it, users only have access to what they’re supposed to, and everything is monitored: every time someone touches data, it’s monitored and recorded.

So, just like how a bank has a pretty good idea when your credit card is being misused because they know a lot about you, right? They know who you are, they know where you live, they know what you shop for, they know the amounts you spend and where you spend them, and, really, really critically, they watch every dollar that goes in and out of your account because that’s their business.

Well, you can start to treat data that way. If you know everything about your users, what they have access to, and where sensitive data is, and, really critically, you watch every time someone opens, creates, moves, modifies, and deletes data, you can start to treat your data like a bank treats your credit card. And that means you know when something goes wrong.

So not only do you know where your sensitive data is and can make sure the right people have access to it, but you also watch everything that every user and every service account does. You know what’s normal, so you know what’s abnormal, and if something goes wrong you can respond to it intelligently and really, really quickly. And then you can automate things like retention and disposition.

And what that means is, when you don’t need data anymore you can delete it, archive it, or move it somewhere else. If somebody puts something sensitive where it’s not supposed to be, you’ve got automation in place to quarantine it: somebody drops a sensitive file in an open share, and it automatically gets moved somewhere that’s locked down and properly protected.
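At its core, that quarantine automation is a simple rule: sensitive file in an open location, move it to a locked-down one. A minimal sketch of what such a rule might look like, with a placeholder classifier and hypothetical paths (real classification engines pattern-match file contents, not names):

```python
# Minimal sketch of an automated quarantine rule. The classifier and
# paths are hypothetical placeholders, not how any particular product
# actually works.
import shutil
from pathlib import Path


def looks_sensitive(path: Path) -> bool:
    # Placeholder classifier; a real engine scans file contents for
    # patterns (card numbers, SSNs, etc.), not just file names.
    return "payroll" in path.name.lower()


def quarantine_if_sensitive(path: Path, in_open_share: bool,
                            quarantine_dir: Path) -> bool:
    """Move a sensitive file out of an open share into a locked-down
    location; return True if the file was moved."""
    if in_open_share and looks_sensitive(path):
        quarantine_dir.mkdir(parents=True, exist_ok=True)
        shutil.move(str(path), str(quarantine_dir / path.name))
        return True
    return False
```

In practice the trigger would be the monitoring stream (a file-created event on an open share) rather than a manual call.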

You know who data belongs to and you’ve got those owners involved. So when someone needs access to data, it’s your data owners saying yes or no, and that whole process is recorded. The data owners are reviewing access on a regular basis; they’re doing access recertifications, which we call entitlement reviews.

So once a quarter your owners look at who has access to the data and make decisions about who should and shouldn’t have access. And from a compliance standpoint, not only do you know what’s happening to your data, know what’s sensitive, and can make sure it’s locked down, but when someone needs access to it you’ve got a record of who asked for it, who approved it, when they approved it, and why they approved it, and because you’ve got DatAdvantage monitoring everything, a record of every single thing they did while they had that access.

The vision is just to start treating data like a smart company treats anything else that’s valuable. And the biggest journey we’ve been on as a company in the eight years since I’ve been here is helping the rest of the world understand just how valuable this data is, and that it’s possible to put the same kind of controls, protections, and processes around file systems as they do around anything else that’s really valuable in the company.

Cindy Ng

What other byproducts have customers found while working toward this least privilege model? Were they able to find other solutions they didn’t initially realize Varonis helped them with?

Brian Vecci

As for the kinds of things companies tend to discover and the use cases that get opened up: once you start treating data this way, you can start connecting things like your SIEM to your file systems, which is really, really difficult to do unless you’ve got Varonis, by sending alerts from DatAlert off to the SIEM, for instance, or by connecting identity management to your file systems.

Cindy Ng

Outside of work when you’re not presenting or traveling to another meeting, what do you like to do?

Brian Vecci

I like to read a lot, and I spend a lot of time on planes, so I spend a lot of time reading. I play the guitar, and I’m pretty confident one of the reasons David Gibson hired me was that I was a guitar player. I have a little home studio in my basement. I recently moved from Brooklyn out to New Jersey, and I’ve been joking with a lot of people that I bought a farm. I didn’t actually buy a farm, although I looked at one, but I’m spending a lot of time learning what it’s like to own and run a house.

Having a house and a kind of big piece of property is something that’s new to me. So over the last six or eight months since I’ve done that, I’ve been learning a lot about what it means to be a homeowner, which is exciting and fun. It may sound kind of pedestrian and not as exciting as some of the other stuff I get to do, but for me it’s been really, really interesting.

Cindy Ng

Well, thank you so much, Brian. And we wish you the best.

Brian Vecci

Thank you. It’s been great talking to you. And, Cindy, it’s been great working with you for the past eight years. When did you join Varonis? You were the first person hired on our team after I joined.

Cindy Ng

It was 2010.

Brian Vecci

Yeah, 2010. So we’ve been here for a while. It’s been great working with you and I look forward to lots more in the future.


[Podcast] Varonis CFO & COO Guy Melamed: Preventing Data Breaches and Reducing Risk, Part Two

This article is part of the series "[Podcast] Varonis CFO & COO Guy Melamed: Preventing Data Breaches and Reducing Risk". Check out the rest:


Leave a review for our podcast & we'll send you a pack of infosec cards.

In part two of my interview with Varonis CFO & COO Guy Melamed, we get into the specifics with data breaches, breach notification and the stock price.

What’s clear from our conversation is that you can no longer ignore the risks of a potential breach. There are many ways you can reduce risk; if you choose not to take action, at the very least have a conversation about it.

Also, around 5:11, I asked a question about IT pros who might need some help getting budget. There’s a story that might help.

Do Data Breaches Impact the Stock Price?

Guy Melamed

My name’s Guy Melamed, CFO and COO for Varonis. I’ve been with the company since 2011, in charge of all the financial statements and execution of strategic operational plans, in charge of the legal department, and IR as well. And kind of enjoying the ride.

Cindy Ng

There’s a discrepancy online: there have been studies that say breaches don’t impact the stock price. Sure, a breach will typically lead to a one-time large expense or smaller recurring expenses, and there might be a potential decrease in revenue, but in the long term investors tend to look past the breach and focus on the strength of the business and the value of the company. What do you think about data breaches and their impact on the stock price?

Guy Melamed

I’m not so qualified to talk about statistics on how a breach would affect a stock price in the short term or the long term. What I can say is that in so many of the breaches that have taken place in the last couple of years, if you go back to those companies and ask whether they would rather have dealt with the breach or simply bought software and taken measures that could have helped them protect against, prevent, or minimize the magnitude of the breach, I think the answer is pretty obvious.

So we’ve seen companies that have gone out of business because of breaches. We’ve seen companies that will have to deal with litigation for years ahead. Where is that factored in? There are just so many components. It’s more of a philosophy: if you can do something active to try and minimize risk, why not do it?

I think companies, more from a philosophical perspective, should actively take action to minimize risk. And companies that believe it won’t affect them and that they’re going to be okay are, I think, acting slightly irresponsibly.

Data Breaches and Breach Notification

Cindy Ng

Let’s talk about breach notification. It’s said that time to discovery increases the cost of a data breach, and research says most companies take over six months to detect data breaches. If you’re in the EU, Article 33 of the GDPR says data controllers need to notify authorities of a breach within 72 hours at the latest upon learning about the exposure, if it results in a risk to consumers. If you’re already protecting, or in the process of protecting, your data, how do you reconcile the time in figuring this element out? What do companies need to do? How much are we talking about?

Guy Melamed

So the surveys we’ve been tracking show that 70% of breaches are discovered within months or years. I think a great example of a breach that affected a company years later was the Yahoo deal. That breach, and I don’t know if it was three, four, or five years old, was discovered as part of an M&A process and had an actual, quantifiable effect on the transaction price.

So a company would obviously rather identify breaches as soon as possible, so it can take action, minimize some of the cost, and be transparent with its customers, investors, and shareholders.

GDPR definitely changes the reporting requirements: if you’re breached, you have to provide that information within 72 hours. That’s a short period of time, and in order to comply with that regulation and have better tracking, you really have to have systems, programs, and personnel in place to identify this.

And the fines that come with GDPR, and I’m talking about some of the requirements and the fines related to those requirements, are 4% of global annual revenue or €20 million, whichever is greater. That’s a huge number that could affect companies in so many ways. It’s definitely something that, from our perspective, is causing a lot of interest and a lot of discussion, and companies are not ignoring the regulation because of its significance.
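To make the scale concrete: the GDPR’s top fine tier is the greater of two amounts, 4% of global annual revenue or a €20 million floor. A quick illustrative calculation (the revenue figures here are made up):

```python
# Illustrative GDPR maximum-fine calculation: the greater of 4% of
# global annual revenue or a EUR 20 million floor. Revenue figures
# below are hypothetical examples.
EUR_FLOOR = 20_000_000

def max_gdpr_fine(global_annual_revenue_eur: float) -> float:
    return max(0.04 * global_annual_revenue_eur, EUR_FLOOR)

# A company with EUR 2 billion in revenue faces up to EUR 80 million;
# a small company is still exposed to the EUR 20 million floor.
print(max_gdpr_fine(2_000_000_000))  # 80000000.0
print(max_gdpr_fine(50_000_000))     # 20000000
```

The floor is what makes the regulation bite for smaller companies: 4% of a modest revenue would be a rounding error, but €20 million is not.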

Should You Just Pay the Fine?

Cindy Ng

So when some companies have done the risk analysis of the GDPR fines, they’re resigned to paying a fine because they believe the fine isn’t that costly: let’s just pay the fine and get it over with. What’s your response?

Guy Melamed

My response is that it fits with an analogy that says if I run a red light, I know the fine is probably minimal and I can live with it. But there are so many other consequences, and first of all, the fine is pretty large when it comes to GDPR.

There are so many other components that thinking you’ll be okay just by paying the fine after being breached is definitely not the approach I would take as the company’s CFO. I would definitely try to act in a way that minimizes risk, long term and short term.

A Story that Might Help IT Pros Get Budget

Cindy Ng

And what are your tips for IT managers who are trying to get budget for a data security solution they need to help prevent a breach?

Guy Melamed

So I’m not sure I’m qualified to give tips, but I will share a story that I heard from one of our customers.

And during a discussion, he was asked, “What is the best way to get budget, in order to get the Varonis product or any other product for that matter that can protect the company in the long term?”

And his response was, “Make sure the risk assessment, the evaluation, and whatever you’re doing in that demo is done on the finance documents. If the finance personnel, if the CFO, can see how many people have access to the financial statements or any other sensitive information within his or her folders, access to information they shouldn’t have, you’ll find the budget. They’ll find the budget.”

So that’s definitely something I could relate to, because if I saw risk on files that I know team members shouldn’t have access to, we could move things around within the budget to purchase something that wasn’t necessarily budgeted initially, once I could quantify the risk in my mind.

Minimally, You Should Have a Discussion

Cindy Ng

And any final thought as CFO as it relates to the cost if you don’t invest in security?

Guy Melamed

I think no one can ignore the risk anymore. Three, four, five years ago, we would talk to companies, show them the risk assessment, show how vulnerable they are, how many sensitive files are open to everyone in the company, show them how much data is open to everyone.

And people could live with the risk. After all the breaches that have taken place and the amount of risk companies are dealing with, I don’t think people can ignore it anymore. They have to take measures, think about it, or at least have a discussion. If they decide they want to live with the risk, that decision should definitely be made after a discussion with the legal department, the HR department, the CEO, CFO, and CISO. If all parties agree that the risk isn’t worth taking any action on, then at least you had a conversation.

But if it’s decided by one person within the organization and it’s not shared between the different departments, between the different roles that would eventually be responsible, then I think that’s just not good practice.


[Podcast] Varonis CFO & COO Guy Melamed: Preventing Data Breaches and Reducing Risk, Part One

This article is part of the series "[Podcast] Varonis CFO & COO Guy Melamed: Preventing Data Breaches and Reducing Risk". Check out the rest:


Leave a review for our podcast & we'll send you a pack of infosec cards.

Recently, the SEC issued guidance on cybersecurity disclosures, requesting public companies to report data security risk and incidents that have a “material impact” for which reasonable investors would want to know about.

How does the latest guidance impact a CFO’s responsibility in preventing data breaches? Luckily, I was able to speak with Varonis CFO and COO Guy Melamed to get his perspective.

In part one of my interview with Guy, we discuss the role a CFO has in preventing insider threats and cyberattacks and why companies might not take action until they see how vulnerable they are with their own data.

An interview well worth your time: by the end of the podcast, you’ll have a better understanding of what IT pros, finance, legal, and HR have on their minds.

Data security and the CFO: Risk and Responsibility

Guy Melamed

My name is Guy Melamed, CFO and COO for Varonis. I have been with the company since 2011. In charge of all the financial statements and execution of strategic operational plans. In charge of the legal department and IR as well and I am enjoying the ride.

Cindy Ng

Sounds great. So, today we’re gonna be discussing how much it would cost if we don’t invest in data security, and let’s start with the role of a CFO.

Right now, data breaches are one of the biggest threats companies face, and as companies realize this, they’re increasingly delegating responsibilities to the CFO. According to a survey by the American Institute of CPAs, 72% of companies have asked their finance departments to take on more responsibility for dealing with data breaches and attacks. Why should the CFO be involved in protecting the organization’s most sensitive data?

Guy Melamed

I think the answer comprises a couple of components. One of them has to do with the fact that CFOs are responsible for the financial statements, and with recent events and the number of breaches that have taken place, there’s much more emphasis on the type of disclosure a company has to provide as part of the 10-K, as part of the risk factors, and even as part of the MD&A. Just to give you an example, in recent months the SEC has provided guidance on cybersecurity, board consideration, and the amount of disclosure that needs to be provided. And to give you a sense of the context: that release was provided by the SEC chairman after the breach of the EDGAR system, a system where you can log in and see the financial statements of all public companies. There was a breach in that system, and as a result the SEC had to address, from a disclosure perspective, what was taken, how they were addressing that event, and how they planned to protect against future events.

So that kind of created the guidance that was provided to all of the Big Four accounting firms, and private, and especially public, companies have to address it. That release talks about what a company is doing from a risk management perspective and how it is protecting against cybersecurity threats. It talks about the board’s role in overseeing management and any material cybersecurity risk. And it has a lot of discussion about what type of disclosure needs to be provided in which event. So when we received that publication in preparation for our 10-K filing, we had to have a discussion: where to put it, what the risk is, how we’re addressing it. A conversation like that takes place with the legal department, and even with the HR department, given some of the regulation around protecting data. So there are a lot of components to the CFO’s role in making sure we address it properly.

Cindy Ng

I actually wanna go back to all the different departments that are involved in addressing the need for preventing data breaches. How would an organization include that in a conversation if they didn’t have the structure for it?

Guy Melamed

Well, the organization first has to understand where the data resides and who has access to it. In a recent survey we published, approximately 50% of companies had more than a thousand sensitive files open to everyone in the company. That’s an unheard-of number. Think about it: if you have one sensitive file, one file with the full payroll information for an organization, and that file gets into the wrong hands, that can destroy a company. Now imagine having more than a thousand sensitive files like that. So the risk is very significant, and approximately 20% of the data, on average, is open to everyone in the company. That’s a risk a company must take action against. So step number one is to realize where your risk resides, because if you don’t know who has access to what folder, who’s opening it, and who’s deleting it, you’re flying blind.

So, I think that’s step number one. There are additional risks on a day-to-day basis. To give you an example from the finance department: if an employee is on warning and goes through a PIP, and he has access to sensitive information, you wanna make sure that the information he has access to stays within the company, and that he isn’t accessing more and more information in preparation for departure. That’s a risk that relates to the finance organization, but it relates to so many other departments as well. There’s IP that personnel within the R&D department wanna make sure is protected. There’s obviously information related to customers, payroll information, HR, legal, and the list goes on and on. So the desire is, first of all, to know what you need to protect, and then who’s protecting it, who has access to it, and to be able to see any abnormal behavior taking place within the organization.

Don’t We Have an Audit Trail?

Cindy Ng

So, you have deep expertise in risk and some technical knowledge. There was a survey among cybersecurity professionals, and 41% of them think their CFOs either have a major gap in their technical expertise in risk or don’t understand their risk at all. You’ve alluded to some of that risk. What is your recommendation to other CFOs or other individuals who wanna close that knowledge gap? Who should their trusted advisors be?

Guy Melamed

Well, first of all, I don’t think I have deep expertise on the technological side or in understanding risk, but I have been around enough to understand that the biggest gap between the finance department and the IT or security department has to do with misconceptions. Just to give an example of what we see many times in our selling process, which, for anyone that doesn’t know, is very visual: we can talk about risk with our potential customers, but the conversation doesn’t get elevated until customers see how vulnerable they are with their own data, and I guess that’s just human nature. Everyone thinks they’ll be okay until they see how much data is open to everyone in the company and how many sensitive files could be accessed by people that shouldn’t have access to them.

So, one of the things we see during a selling process is that if we sit down to show that risk assessment, or even have an initial conversation with someone from IT or a CISO, along with a legal department member or a finance member, and we ask one simple question, “If 10,000 files were deleted today, would you know about it?”, the answer from the CISO or the IT personnel is, “Absolutely not. We don’t have any ability to know if someone deleted 10,000 files.”

But if you ask a finance person, someone from the legal department, or HR personnel, their automatic reaction would be that there has to be a way, that it seems unreasonable that a company isn’t tracking whether 10,000 files got deleted today. That, I believe, is one of the gaps that has to be bridged, and the education on the finance side is making sure you know what the company is and isn’t tracking, and, if an employee is about to leave, whether you have any type of monitoring to make sure sensitive files aren’t taken and provided to a competitor, or even used in the future by what would then be an ex-employee.

So, there are a lot of components to daily operations. There are a lot of risks a company has to think about, always going through the process of what could go wrong. Maybe it hasn’t happened, and maybe everything is good now and we trust all of our employees, but what if? When you have organizations with 1,000 employees, or 20,000, or 50,000, the notion that all of the employees are ethical is a bit scary, and you have to think about how to protect the company in the best way.

Cost of a Data Breach

Cindy Ng

What’s most compelling for me is that there’s a disconnect between IT and the rest of the departments, where IT thinks, “I really wanna protect everyone’s data, but there’s no ability to do so.”

Meanwhile, finance, legal, and HR think, “Oh, hasn’t that problem already been solved? It’s a little unreasonable,” as you’ve said, “if we weren’t able to figure that out.”

So, let’s talk about the cost of a breach. It’s been said that the average cost of a data breach is about $4 million, and there are many organizations that have paid tens of millions of dollars. What are some direct and indirect costs to businesses associated with data breaches?

Guy Melamed

So, a data breach, from a quantifiable perspective, depends on what was taken, when it was taken, who took it, and who it was provided to. There are a lot of components, and I think it would be very hard for me to throw out a number. But what I would say is that a breach is a disruption to the business on so many levels: finding out what was taken, the risk of that information being provided to your competitors, even the risk of financial information being taken and released before it was published.

What I would ask is whether a CFO, or a COO for that matter, would be comfortable with their financial statements being provided to a competitor two weeks before they were published. Obviously the answer is no, and there could be detrimental consequences to that type of breach.

But the breach isn’t just about financial information. There’s customer information, there’s payroll information. There are just so many sensitive files sitting there that people within the organization have access to, and it doesn’t necessarily mean they would break bad. It could be a situation where someone from the outside takes control of an employee’s credentials and starts using that access in the wrong way. One of the most interesting phenomena we’ve seen as a company is that some of the breaches that took place in 2014 generated a knee-jerk reaction, and there was significant IT spend at the beginning of 2015. But that spend was mostly on perimeter defense security. The notion was that if you’re protecting the border, you’ll be okay. And I think what’s been proven, day in and day out, is that perimeter defense security is absolutely important, but the notion that it’s the only type of defense you need has been thrown out the window.

And if you use the same analogy of border patrol, or protecting a country, the fact that you have protection on the border doesn’t mean you don’t need other measures and other organizations protecting you from the inside, because at some point someone will be able to get past that border. And beyond that, how are you protecting your organization, or your country, from people on the inside? What we’ve seen in the last couple of years is that the number of breaches has increased significantly, the magnitude has increased significantly, and the implications for those companies have increased significantly.

And I know there was an article a couple of years ago arguing that you shouldn’t buy any software, that you can just deal with a breach. That notion has been thrown out the window. The consequences of a breach are obvious; we see them on the news and on the front pages of “The Wall Street Journal” and “The Financial Times.” It’s happening at rates we haven’t seen before, and I don’t see that going away.


Continue reading the next post in "[Podcast] Varonis CFO & COO Guy Melamed: Preventing Data Breaches and Reducing Risk"

[Podcast] Dr. Wolter Pieters on Information Ethics, Part Two



Leave a review for our podcast & we'll send you a pack of infosec cards.

In part two of my interview with Delft University of Technology’s assistant professor of cyber risk, Dr. Wolter Pieters, we continue our discussion on transparency versus secrecy in security.

We also cover ways organizations can present themselves as trustworthy. How? Be very clear about managing expectations. Declare your principles so that end users can trust you’ll execute on the principles you advocate. Lastly, have a plan for what to do when something goes wrong.

And of course there’s a caveat: Wolter reminds us that there’s also a very important place in this world for ethical hackers. Why? Not all security issues can be solved during the design stage.

Transparency versus Secrecy

Wolter Pieters

My name is Wolter Pieters. I have a background in both computer science and philosophy of technology. I’m very much interested in studying cyber security from an angle that either goes a bit more towards the social sciences, so, why do people behave in certain ways in the cyber security space, or more towards philosophy and ethics, so, what would be reasons for doing things differently in order to support certain values.

Privacy, but then again, I think privacy is a bit overrated. This is really about power balance, because everything we do in security gives some people access and excludes other people, and that’s a very fundamental thing. It’s basically about the power balance that, through security, we embed into technology. And that is what fundamentally interests me about security and ethics.

Cindy Ng

How do we live in a world where you just don’t know whether organizations or governments are behaving in a way that’s trustworthy?

Wolter Pieters

You know, transparency versus secrecy is a very important debate within the security space. This starts out very fundamentally from a question like, “Should methods for protecting information be publicly known, or should they be kept secret because otherwise we may be giving too much information away to hackers?” So this is a very fundamental thing, and in terms of encryption there’s the principle that encryption algorithms should be publicly known, because otherwise we can’t even tell how well our information is being protected by means of that encryption, and only the keys used in encryption should be kept secret. This is called Kerckhoffs’s principle. It’s very old in information security, and a lot of current encryption algorithms actually adhere to it. We’ve also seen encryption algorithms that don’t.

So, algorithms that were secrets, trade secrets, and so on, being broken the very moment the algorithm became known. In that sense, I think most researchers would agree this is good practice. On the other hand, it seems there’s also a certain limit to what we want to be transparent about, in terms of security controls: we’re not giving away every single thing governments do for security online. So there is some level of security by obscurity there, and, more generally, to what extent is transparency a good thing? This again ties in with who the threat is. I mean, we had the whole WikiLeaks endeavor, and some people will say, “Well, this is great. The government shouldn’t be keeping all that stuff secret, so it’s great for trust that this is now all out in the open.” On the other hand, you could argue that all this is actually a threat to trust in the government, so this form of transparency would be very bad for trust.

So, there’s clearly a tension there. Some level of transparency may help people trust in the protections embedded in the technology and in the actors that use those technologies online. But on the other hand, if there’s too much transparency, all the nitty-gritty details may actually decrease trust. You see this all over the place. We’ve seen it with electronic voting as well. If you provide some level of explanation of how certain technologies are being secured, that may help. If you provide too much detail, people won’t understand it and it will only increase distrust. There is a kind of golden middle in terms of how much explanation you should give to make people trust in certain forms of security, encryption, etc. And again, in the end people will have to rely on experts. With physical forms of security, physical ballot boxes, it’s possible to explain how these work and how they are being secured; with digital systems that becomes much more complicated, and most people will have to trust the judgment of experts that these forms of security are actually good, if the experts believe so.
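Kerckhoffs’s Principle, as Wolter describes it, can be sketched with a toy cipher whose algorithm is entirely public and whose security rests only in the secrecy of the key. This is an illustrative, standard-library-only sketch (an HMAC-SHA256 keystream XORed with the plaintext), not a production cipher:

```python
import hashlib
import hmac
import secrets

# The algorithm below is fully public, in the spirit of Kerckhoffs's
# Principle: anyone may inspect it, and security rests entirely in the key.

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from the secret key and a public nonce."""
    out = b""
    counter = 0
    while len(out) < length:
        block = hmac.new(key, nonce + counter.to_bytes(8, "big"),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    nonce = secrets.token_bytes(16)  # public, but unique per message
    ct = bytes(p ^ k for p, k in zip(plaintext,
                                     keystream(key, nonce, len(plaintext))))
    return nonce, ct

def decrypt(key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext,
                                       keystream(key, nonce, len(ciphertext))))

key = secrets.token_bytes(32)  # the only secret in the whole scheme
nonce, ct = encrypt(key, b"attack at dawn")
assert decrypt(key, nonce, ct) == b"attack at dawn"
```

Publishing `keystream`, `encrypt`, and `decrypt` gives nothing away: without the key, the ciphertext is opaque, which is exactly the property that lets outsiders evaluate how well the scheme protects them.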

What Trustworthy Organizations Do Differently

Cindy Ng

What’s something an organization can do in order to establish themselves as a trustworthy, morally-sound, ethical organization?

Wolter Pieters

I think the most important thing companies can do is be very clear in terms of managing expectations. A couple of examples: if as a company you decide to provide end-to-end encryption for communications, the people that use your chat app to exchange messages get the assurance that the messages are encrypted between their device and the device of the one they’re communicating with. And that is a clear statement: “Hey, we’re doing it this way.” That also means you shouldn’t have any backdoors or means to give this communication away to the intelligence agencies anyway, because if this is your standpoint, people need to be able to trust in that. Similarly, if you are running a social network site and you want people to trust in your policies, then you need to be crystal clear.

Not only that it’s possible to change your privacy settings, to regulate the access that other users of the social networking service have to your data, but at the same time you need to be crystal clear about how you as a social network operator are using that data. Because sometimes I get the impression that the big internet companies are offering all kinds of privacy settings which give people the impression that they can do a lot in terms of their privacy. And yes, this is true for inter-user data access, but the provider still sees everything. This seems to be a way of framing privacy in terms of inter-user data access, whereas I think it’s much more fundamental what these companies can do with all the data they gather from all their users, and what that means in terms of their power and the position they get in this whole arena of cyberspace.

So, managing expectations. I mean, there are all kinds of different standpoints, based on different ethical theories, based on different political points of view, that you could take in this space. If you want to behave ethically, then make sure you list your principles, you list what you do in terms of security and privacy to adhere to those principles, and make sure that people can actually trust that this is also what you do in practice. And also make sure that you know exactly what you’re going to do in case something goes wrong anyway. We’ve seen too many breaches where the responses by the companies were not quite up to standard, for instance delaying the announcement of the breach. It’s crucial to not only do prevention in terms of security and privacy, but also know what you’re going to do in case something goes wrong.

Doomsday Scenarios

Cindy Ng

Yeah, you say, if an IoT device gets created and they get to market their product first and then they’ll fix security and privacy later, that’s too late. Is it sort of like, “We’re doomed already and we’re just sort of managing the best way we know how?”

Wolter Pieters

In a way, it’s a good thing when we are nervous about where our society is going because in history at moments where people weren’t nervous enough about where society was going, we’ve seen things go terribly wrong. So, in a sense we need to get rid of the illusion that we can easily be in control or something like that because we can’t.

The same for elections: there is no neutral space from which people can cast their vote without being influenced, and we’ve seen in recent elections that technology is playing more and more of a role in how people perceive political parties and how they make decisions in terms of voting. So it’s inevitable that technology companies have a role in those elections, and that’s something they need to acknowledge.

And then of course, and I think this is a big question that needs to be asked: can we prevent a situation in which the power of certain online stakeholders, whether those are companies or nation states or whatever, grows so much that they are able to influence our governments, either through elections or through other means? That’s a situation that we really don’t want to be in, and I’m not pretending that I have a crystal clear answer, but this is something that at least we should consider as a possible scenario.

And then there are all these doomsday scenarios, the Cyber Pearl Harbor, and I’m not sure whether doomsday scenarios are the best way to think about this. But we should also not be naive and think that all of this will blow over, because maybe we have indeed already given away too much power in a sense. So, what we should do is fundamentally rethink the way we think about security and privacy, away from “Oh, damn, my photos are, I don’t know, in the hands of whoever.” That’s not the point. It’s about the scale at which certain actors either get their hands on data or are able to influence lots of individuals. So, again, scale comes in there. It’s not about our individual privacy, it’s about the power that these stakeholders get by having access to the data or by being able to influence lots and lots of people, and that’s what the debate needs to be about.

Cindy Ng

Whoever has the data has power, is what you’re getting at.

Wolter Pieters

Whoever has the data and in a sense that data can then, again, be used also to influence people in a targeted way. If you know that somebody’s interested in something, you can try to influence their behavior by referring to the thing that they’re interested in.

Cindy Ng

That’s only if you have data integrity.

Wolter Pieters

Yes. Yes, of course. But on the other hand, a little bit of noise in the data doesn’t matter too much, because if you have data that’s more or less correct, you can still achieve a lot.

Ethical Hackers Have An Important Role

Cindy Ng

Anything that I didn’t touch upon that you think is important for our listeners to know?

Wolter Pieters

The one thing that I think is critically important is the role that ethical hackers can have in keeping people alert, in a way maybe even changing the rules of the game. Because in the end, I also don’t think that all security issues can be solved in the design of technology, and it’s critically important that when technologies are being deployed, people keep an eye on issues that may have been overlooked in the design stage. We need some people that are paying attention and will alert us to issues that may emerge.

Cindy Ng

It’s a scary role to be in, though, if you’re an ethical hacker, because what if the government comes around and accuses you of being an unethical hacker?

Wolter Pieters

Yeah. I think that’s an issue, but if that’s going to happen, if people are afraid to play this role because legislation doesn’t protect them enough, then maybe we need to do something about that. If we don’t have people that point us to essential weaknesses in security, then those issues will be kept secret and they will be misused in ways that we don’t know about, and I think that’s a much worse situation to be in.


What Experts Are Saying About GDPR

You did get the memo that GDPR goes into effect next month?

Good! This new EU regulation has a few nuances and uncertainties that will generate more questions than answers over the coming months. Fortunately, we’ve spoken to many attorneys with deep expertise in GDPR. To help you untangle GDPR, the IOS staff reviewed the old transcripts of our conversations, and pulled out a few nuggets that we think will help you get ready.

Does the GDPR cover US businesses? Is the 72-hour breach notification rule strict? Do you need a DPO?  We have the answers below!  If you have more time, listen to our podcasts for deeper insights.

Privacy By Design Raised the Bar

Inside Out Security: Tell us about GDPR, and its implications on Privacy by Design.

Dr. Ann Cavoukian: For the first time ever, the EU’s General Data Protection Regulation has the words, the actual words, “Privacy by Design” and “Privacy as the default” in the statute.

What I tell people everywhere that I go to speak is that if you follow the principles of Privacy by Design, which in itself raised the bar dramatically from most legislation, you will virtually be assured of complying with your regulations, whatever jurisdiction you’re in.

Because you’re following the highest level of protection. That’s another attractive feature of Privacy by Design: it offers such a high level of protection that you’re virtually assured of regulatory compliance, whatever jurisdiction you’re in.


US Businesses Also Need To Prepare for GDPR

Inside Out Security: What are some of the concerns you’re hearing from your clients on GDPR?

Sue Foster: When I speak to my U.S. clients, if they’re a non-resident company that promotes goods or services in the EU, including free services like a free app, for example, they’ll be subject to the GDPR. That’s very clear.

Also, if a non-resident company is monitoring the behavior of people who are located in the EU, including tracking and profiling people based on their internet or device usage, or making automated decisions about people based on their personal data, the company is subject to the GDPR.


Is the 72-hour rule as strict as it sounds?

Inside Out Security: What we’re hearing from our customers is that the 72-hour breach reporting rule is a concern. Our customers are confused, and after looking at some of the fine print, we are as well! So I’m wondering if you could explain breach reporting in terms of thresholds: what needs to happen before a report is made to the DPAs and consumers?

Sue Foster: So you have to report the breach to the Data Protection Authority as soon as possible, and where feasible, no later than 72 hours after becoming aware of the breach.

How do I know if a breach is likely to ‘result in a risk to the rights and freedoms of natural persons’?

There is actually a document you can look at to tell you what these rights and freedoms are. But you can think of it basically in common sense terms. Are the person’s privacy rights affected, are their rights and the integrity of their communications affected, or is their property affected?

If you decide that you’re not going to report after you go through this full analysis, and the DPA disagrees with you, you’re running the risk of a fine of up to 2% of the group’s global turnover, or gross revenue around the world.

But for now, and I think for the foreseeable future, it’s going to be about showing your work, making sure you’ve engaged, and that you’ve documented your engagement, so that if something does go wrong, at least you can show what you did.


What To Do When You Discover A Breach

Inside Out Security: What is one of the most important things you would do when you discover a breach, if you could prioritize in any way? How would you advise a customer to build a breach response program in a GDPR context?

Sheila FitzPatrick: Yeah. Well, first and foremost, you need to have in place, before a breach even occurs, an incident response team that’s not made up of just IT, because normally organizations have an IT focus. You need a response team that includes IT and your chief privacy officer. Normally a CPO would sit in legal; if they don’t, you want a legal representative in there as well. You need someone from PR and communications who can be the public-facing voice for the company. And you need someone from Finance and Risk Management sitting on that team as well.

So the first thing to do is make sure you have that group in place so it goes into action immediately. Secondly, you need to determine what data has potentially been breached, even if you’re not yet sure it has been. Because under GDPR the threshold has changed: previously it was whether there had definitely been a breach that could harm an individual. The definition now is whether it’s likely to affect an individual, which is totally different from whether the individual could be harmed. So you need to determine: what data has been breached, and does it impact an individual?

If company-related information was breached, there’s a different process you go through. If individual employee or customer data has been breached, the question is: is it likely to affect the individual? And that’s pretty much anything; it’s a very broad definition. If someone gets hold of an email address, yes, that could affect them. Someone unauthorized could email them.

So, you have to launch into that investigation right away, classify the data that has seen any intrusion, and determine what that data is classified as.

Is it personal data?

Is it personal sensitive data?

And then rank it based on is it likely to affect an individual?

Is it likely to impact an individual? Is it likely to harm an individual?

So there could be three levels.

Based on that, what kind of notification? So if it’s likely to affect or impact an individual, you would have to let them know. If it’s likely to harm an individual, you absolutely have to let them know and the data protection authorities know.
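Sheila’s triage flow, classify the breached data, rank the likely effect on individuals, and derive who must be notified, can be sketched as a small decision function. The category names and the shortcut for sensitive data are illustrative assumptions based on her description, not legal advice:

```python
# A hedged sketch of breach-notification triage as described above.
# data_class: "company", "personal", or "personal_sensitive"
# likelihood: "affect" | "impact" | "harm"  (lowest to highest)

def notification_targets(data_class: str, likelihood: str) -> set[str]:
    """Return who must be told about the breach, per the triage described."""
    if data_class == "company":
        return set()  # company-related data follows a different, internal process
    targets: set[str] = set()
    if likelihood in ("affect", "impact"):
        # Likely to affect or impact an individual: the individual must be told.
        targets.add("individuals")
    if likelihood == "harm" or data_class == "personal_sensitive":
        # Likely harm (or sensitive personal data, an assumption here):
        # notify both the individuals and the data protection authorities.
        targets.update({"individuals", "data_protection_authority"})
    return targets

# Even a leaked email address can "affect" someone:
assert notification_targets("personal", "affect") == {"individuals"}
```

The point of expressing it this way is Sheila’s own: the ranking (affect, impact, harm) is decided before the notification decision, not after.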


Do we need to hire a DPO?

Inside Out Security: An organization must appoint a data protection officer (“DPO”) if, among other things, “the core activities” of the organization require “regular and systematic monitoring of data subjects on a large scale.”  Many Varonis customers are in the B2B space, where they do not directly market to consumers. Their customer lists are perhaps in the tens of thousands of recipients up to the lower six-figure range. First, does the GDPR apply to personal data collected from individuals in a B2B context? And second, when does data processing become sufficiently “large scale” to require the appointment of a DPO?

Bret Cohen and Sian Rudgard with Hogan Lovells: Yes, the GDPR applies to personal data collected from individuals in a B2B context (e.g., business contacts).  The GDPR’s DPO requirement, however, is not invoked through the maintenance of customer databases.

The DPO requirement is triggered when the core activities of an organization involve regular and systematic monitoring of data subjects on a large scale, or the core activities consist of large scale processing of special categories of data (which includes data relating to health, sex life or sexual orientation, racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, or biometric or genetic data).

“Monitoring” requires an ongoing tracking of the behaviors, personal characteristics, or movements of individuals, such that the controller can ascertain additional details about those individuals that it would not have known through the discrete collection of information.

Therefore, from what we understand of Varonis’ customers’ activities, it is unlikely that a DPO will be required, although this is another area on which we can expect to see guidance from the DPAs, particularly in the European Member States where having a DPO is an existing requirement (such as Germany).

Whether or not a company is required to appoint a DPO, if the company will be subject to the GDPR, it will still need to be able to comply with the “Accountability” record-keeping requirements of the Regulation and demonstrate how it meets the required standards. This will involve designating a responsible person or team to put in place and maintain appropriate policies and procedures, including data privacy training programs.
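The DPO test the attorneys describe combines several conditions: the processing must be a *core* activity, it must be large scale, and it must involve either regular and systematic monitoring or special categories of data. A minimal sketch of that logic, with the caveat that “large scale” has no fixed numeric threshold and awaits DPA guidance:

```python
# A hedged sketch of the GDPR DPO-requirement test described above.
# The boolean inputs are illustrative simplifications of a legal analysis.

def dpo_required(core_activity: bool,
                 large_scale: bool,
                 systematic_monitoring: bool,
                 special_categories: bool) -> bool:
    """DPO needed only when core activities involve large-scale processing
    that is either regular/systematic monitoring or special-category data."""
    if not (core_activity and large_scale):
        return False
    return systematic_monitoring or special_categories

# A B2B vendor merely maintaining a large customer contact database:
# maintaining the list is not its core activity, so no DPO is triggered.
assert dpo_required(core_activity=False, large_scale=True,
                    systematic_monitoring=False,
                    special_categories=False) is False
```

This mirrors the attorneys’ conclusion: a big customer list alone does not invoke the DPO requirement; it is the nature of the core processing that does.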


[Podcast] I’m Elena Khasanova, Professional Services Manager at Varonis, and This is How I Work


Prior to Varonis, Elena Khasanova worked in backend IT for large organizations. She did a bit of coding, database administration, project management, but was ready for more responsibility and challenges.

So seven years ago, she made the move to New York City from Madison, Wisconsin and joined the professional services department at Varonis.

With limited experience speaking with external customers and basic training, Varonis entrusted her to deploy products as well as present to customers. Elena recalls, “Not every company will give you a chance to talk to external customers without prior experience….But it was Varonis that gave me that chance.”

According to her manager, Ken Spinner:

Over the last 6 years, I’ve had the pleasure of working with Elena, first as a coworker in different departments, and most recently as the leader of our Remediation Team in our Professional Services department.  Elena was uniquely qualified to lead the team as she had significant experience performing project management prior to planning and completing our first remediation projects.  Elena’s knowledge was instrumental in defining the essence of the Varonis  Data Risk Assessment, the process used by PS to perform remediation, as well as providing practical insight to Engineering during the development of the Automation Engine.

Read on to learn more about Elena – this time, in her own words.

What would people never guess you do in your role?

Not only am I involved in professional services, I also spend a lot of time on sales calls.

What did you learn about yourself after working at Varonis?

I am pretty good at selling concepts and ideas.

How has Varonis helped you in your career development?

Prior to Varonis, I only worked in internal IT. Varonis gave me a chance to work with external customers and exposed me to sales and product management.

What advice do you have for prospective candidates?

Pour your heart and soul into Varonis products. If you are smart and hard-working, it will be noticed right away.

What do you like most about the company? 

Despite being a publicly traded company, it kept its startup spirit and passion.

What’s the biggest data security problem your customers/prospects are faced with?

Company files are often accessible by every employee regardless of their roles. How can we fix that without someone losing access to work they really need access to?

What certificates do you have?


What is your favorite book?

Big Magic by Elizabeth Gilbert

What is your favorite time hack?

I assign values in my to-do list by urgency: importance (not always urgent, but important in the long run), speed, and reluctance.

Things I’m most reluctant to do, I try to do in the beginning of the day when my willpower is still high.

What’s your favorite quote?

“It would not be much of a universe if it wasn’t home to the people you love.”

– the greatest scientist, Stephen Hawking


Elena Khasanova: My name is Elena Khasanova, and I’m currently a Professional Services Manager at Varonis. And this is how I work.

Cindy Ng: How long have you been working at Varonis?

Elena Khasanova: It will be seven years in June. So, pretty long time.

Cindy Ng: And what was your background prior to working at Varonis?

Elena Khasanova: Prior to Varonis, I worked for fairly large organizations, on the customer side, I would say. I worked in back-end IT, and my customers were the internal teams, internal departments, and my coworkers. I worked in very technical disciplines: I did some coding, I did some database administration, and then I switched toward project management, geared more and more toward the IT security area of overall IT. And then I ended up taking a number of Microsoft certifications and the CISSP. So I became a bit of an expert on projects within the IT security industry.

Cindy Ng: And you’ve gotten certificates, you’ve held many different roles. How did you end up deciding that Varonis was a good fit for you?

Elena Khasanova: So, at that point, prior to Varonis, I had actually lived in Wisconsin for eight years. Madison, Wisconsin, to be precise. And that town was getting pretty small for me, and I wanted to work for a smaller company, and also one that sells IT as a service. So I wanted to be on the front end of the revenue: not quite in sales, but working with external customers.

You know, not every company will give you a chance to go and talk to external customers without prior experience. There is more on the line when working with external customers.

While working with internal customers, of course, you need to deliver to the highest level of satisfaction, but nevertheless, they’re your coworkers. If you serve the HR team as an IT team, the HR team doesn’t have the ability to go to another vendor, another IT team. You don’t compete with another IT team. So you kind of have an internal monopoly in delivering the services.

With external customers, there’s real revenue on the line, not just an internal transfer of company budgets between departments. It’s real money that supports the company and the shareholders. And customers, of course, always have a choice to go with somebody else.

So, I think there’s so much more at stake when dealing with external customers. Then, you know, the risks, and therefore, the rewards are so much more elevated.

So, one day, a recruiter from New York City called me and said, “Oh, this is a little bit of a more technical position, but this is exactly what you wanted, so give it a try.” And I had an interview over the phone, and they flew me in. And I was amazed that Varonis, immediately after a brief training, trusted me enough to go and deploy our product and train our customers and interact with very large companies on behalf of Varonis.

Varonis gave me a chance, and I really enjoyed it. I think I even surprised myself with how much fun I had. Right now I can put very extensive experience working with external customers on my resume, but it was Varonis that gave me that chance and exposed me to this area. I mean, I loved it from day one. Like I said, it’s been almost seven years now and I’m still here and still loving it.

Cindy Ng: That’s great. So, you’re involved in professional services at Varonis. What does professional services entail, and what was the catalyst to create the professional services department at Varonis?

Elena Khasanova: Right now, professional services performs a wide variety of tasks, but that was not always the case. It started as a supplement to the support department. The Varonis support department existed from the very beginning of the company, and if customers had issues, they would call or email the support team. However, at some point it became clear that it wasn’t sufficient to just give customers instructions on how to install the products and then have them deal with support as needed.

So, the first person was hired to create one team, a professional services department, that rapidly grew, based on customers’ needs, into a team specializing in initial installation and training for customers. Later, other technical tasks were added, such as upgrades, or migrations as customers needed to move from one service to another.

At some point, the customers asked us to do more reporting on the issues within the environment. The issues with data permissions. So that was added to the list of tasks that professional services performed.

And then later, the customers started asking us to not only report on the issues, but actually fix them. And that’s when the remediation services branch was born.

So, it was very organic growth, very much driven by customer demand, and driven as well by our customers becoming larger enterprise companies. As we got more and more international companies around the world, there was a need to provide more than just installation services: project management as well, and business analysis, and other things.

So, at this point, we do anything from a simple installation to very large, wide-scale rollouts around the world, as well as multi-year engagements. It’s a very wide variety of projects and engagements, and some of them can be very large.

Cindy Ng: So professional services is wide in scope. So, are you engaged with other teams within Varonis to coordinate?

Elena Khasanova: As we collect more and more feedback from the field, and as professional services department itself is reaching, I believe, seven years of age, we interact more and more with other teams to make sure that the feedback from the field goes back into, for example, product management. Product management is one of our biggest collaborators here in Varonis. So, we provide feedback from the customers as well as feedback from professional services on how to make the product more stable, more customer-friendly, more user-friendly, and to shape the future of the product.

Other teams are sales, of course. I personally spend up to 30% of my time on sales calls, because some of the remediation engagements are fairly large and complex and it helps to have somebody on the call to talk to the customer about the best practices, the pitfalls to avoid, and so on and so forth. It’s just, for sales, it’s hard to have that level of experience and interactions from previous projects, so it helps to have a professional services representative on a call or we even go on-site to help sales close the deal.

Cindy Ng: How would you break down your day?

Elena Khasanova: It’s about 25% with sales, about 15% with product management and marketing, and the remaining 60% is pure project management within professional services.

Cindy Ng: The term “remediation” is thrown around a lot. What exactly is it…?

Elena Khasanova: From Varonis’ perspective, it’s remediation of data access. The reason companies need it is that almost every external breach, at some point, becomes an internal breach. Companies surround themselves with heavy layers of firewalls and secure their perimeter as best they can. However, with so many companies outsourcing and subcontracting so many IT activities to other vendors, and those vendors in turn subcontracting to more and more vendors, it’s almost impossible to fully protect internal data with a firewall.

Once somebody, whether it’s a malicious employee or malware, gets inside the company and starts accessing data they shouldn’t be accessing, that breach spreads internally like water through the Titanic’s compartments. This is why companies need to secure the internal data in addition to securing the perimeter.

And this is the term “remediation” that we use: securing data within those compartments inside the company, so that if there is a breach, it will be limited to that compartment and will not, say, impact the operational areas of the company.

Cindy Ng: And how would customers know if their internal permissions are overexposed? For instance, if all finance folders are open to everyone in the organization, there’s a huge disconnect. Correct me if I’m wrong, but I think C-levels probably assume the problem has already been corrected, that with all the amazing technology we’ve created, it’s no longer an issue. Meanwhile, IT knows that if you try to fix global group access manually, it is a very hard problem.

Elena Khasanova: You’re very correct on this one. Not only is it hard to fix, but many customers are not even aware there is a problem in the environment to begin with. We actually call it “turning on the light in the basement to discover the dead bodies,” because most native operating systems do not actually provide any interface or information on the exposure of data internally. It’s really tough, without a specialized product such as Varonis, to even see that, unless you suspect there is a problem and write some kind of scripts to scan your internal permissions. But if you don’t know there is a problem, you don’t know what you don’t know, so you don’t even think about it; you’re looking at other areas.

I love the shocked look on customers’ faces when we perform our risk assessment. We can really quickly scan the environment, and within 48 hours produce an amazing wealth of information. And I love going to customer meetings and showing them: “Look, 40% of your data is exposed to everyone in the company. Oh, and you have 20,000 accounts in your company, and any one of these accounts can access 40% of your data. Did you know that?”

We even ask customers to write the numbers down on paper, kind of as a bet. Not a bet, maybe, but to show the gap between perception and reality: “Hey, how many folders do you think are exposed to everyone?” And the customers, virtually every time, will say, I don’t know, 4, 5, maybe 10 or 50. And then we run the scan and it turns out it’s 50,000.

Cindy Ng: What about prospects who think they can write a script?

Elena Khasanova: Well, a script technically could be written, but because our product is optimized for this, and we’ve been doing it for more than 10 years now, we will scan even a very large environment within a matter of 48 hours, while a manual script could take months to go through each and every folder, collect those permissions, and then report them in a meaningful manner.

We have a beautiful interface; we can immediately show you the pie charts, and all the graphics, and trends. You can click around, drill down, come back out, and so on. Even if you write a script, and it takes that script months to scan the permissions, the result will be just a huge text log file in which it’s really hard to differentiate between different departments and different company branches, to drill down, come back out, and so on.

So we’re giving you such a good interface with everything at a glance, and our product re-scans constantly, so if any permissions change, we will immediately produce a snapshot of that.

Cindy Ng: Let’s go back to the finance folder that’s open to everyone. If I remove the global group access from the finance folders, which steps, if any, do I need to take next?

Elena Khasanova: So, removing access from Everyone is an excellent first step. How we do it is we look at the activity in the back end. If an account, whether it’s an individual account or a service account, had any activity, we will keep that account’s access while disconnecting the Everyone group. In many cases, that is enough. However, if it’s truly sensitive data, and, you know, I would argue that all data is sensitive if it comes to that, the software itself cannot differentiate between legitimate and illegitimate access; only human eyes can.
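The automatic remediation step Elena describes, dropping the global Everyone entry while preserving access for any account the audit trail shows actually touched the folder, can be sketched roughly as follows. The data shapes and names here are hypothetical, not Varonis’s actual API:

```python
def remediate_global_access(acl, activity_log):
    """Remove the 'Everyone' entry from a folder's access list, but
    grant direct access to every account that shows up in the folder's
    activity log, so nobody loses access they actually use."""
    active_accounts = {event["account"] for event in activity_log}
    new_acl = [entry for entry in acl if entry != "Everyone"]
    for account in sorted(active_accounts):
        if account not in new_acl:
            new_acl.append(account)
    return new_acl
```

Note that this logic would also keep the crawler’s service account from Elena’s anecdote, since it had recorded activity; only the human entitlement review that follows can flag such access as illegitimate.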

We’ve had situations where somebody was running a so-called crawler from their workstation, and that software would just go and hit every open server and try to scan what was inside. As a result, it produced a lot of activity. So when we performed the automatic removal of the Everyone group, it kept that account, because the account had activity.

It was only with the next step, the access certification performed by a human data owner, that we were able to actually expose that account and ask, “Why is this account in this group? Why did it have any activity to begin with? It should never have been able to perform that.”

So the second step, a human access re-certification, which we call entitlement review, really takes the access down to a state of least privilege. You need both steps: first the automatic software removal, and then the human entitlement review.

Cindy Ng: If you’re a company, you have data and you feel like the data keeps growing and there’s not enough IT staff, what are your recommendations for that company?

Elena Khasanova: We recommend handing off a lot of the responsibility and accountability for who should be accessing data to so-called data owners. Those are the people from the business side who know what type of data is inside and who should have access to it. It’s simply not feasible for the IT department, especially in mid- and large-sized companies, to know exactly who should have access to each folder. “Should Mary have access to the finance, stocks, or bonds folder or not? Or maybe Mary moved positions, or her job description changed slightly and she no longer needs that access.” Only business owners know that.

And so, our recommendation is… and it seems like more and more customers not only buy into this but actually come to us asking to implement it. Our recommendation is to define data owners for all of your data, even outside of IT, and then implement some kind of very user-friendly process that allows data owners, within minutes, to make decisions on who should and should not have access to the data.

Cindy Ng: And when it comes to stale data, what are your recommendations?

Elena Khasanova: With stale data, we try to follow company policies. First of all, as long as you’re actually doing something with stale data, you’re already doing well. The potential actions that can be applied to stale data are: quarantine, where we strip out all the permissions so that maybe only the storage team or internal audit has access to it; moving it to cheaper storage, so maybe it will be a bit slower for your customers and applications, but nobody’s accessing it anyway and it costs the company less; or simply deleting it.

Cindy Ng: But if you’re in an organization that doesn’t care about the cost of storage, why would it be worth figuring out what to do with stale data?

Elena Khasanova: Cost of storage is only one of the aspects that comes into the decision of what to do with stale data. And, by the way, when you think about it, the cost of storage is not just the active data sitting on a disk, but also the countless backups and, often, the data duplication that also cost the company money.

However, let’s say that’s not a factor. Stale data can still be an area of risk to the company because it creates liability. If you remember the case with Sony, when emails from more than 10 years ago resurfaced with Angelina Jolie being, you know, called some names, it cost the company a loss of reputation, and the actress may now never do business with Sony again. Those are the liability issues that should drive a company to delete stale data the moment it passes the regulated retention period.

You simply don’t know what’s stored in there. And if there’s a lawsuit and you’re subpoenaed, you will have to present all the data stored on your company’s servers. If it was deleted, then you cannot, and don’t have to, present it. Of course, you have to retain it, as I’m sure your legal department knows, for X number of months or years, but past that period, too many companies just continue keeping that data, and again, it represents a liability issue from that perspective.

Cindy Ng: GDPR is coming up and I’m wondering if you’re getting any questions about that.

Elena Khasanova: GDPR is an interesting regulation because, in the U.S., many companies still don’t think they fall under its jurisdiction. GDPR covers any company that has at least one customer or one employee with EU citizenship.

I’ve been to meetings with customers in the U.S. where we would bring up GDPR and hear back, “Oh, no, we are not impacted. We don’t have any customers or employees with that citizenship.” And then a person in the meeting literally raised his hand and said, “I have a dual passport. I am an EU citizen.”

It was very funny to see that, but at the same time, I think we need to do more awareness training to demonstrate to U.S. companies that they simply may not know, but it’s actually very likely. If you are a large U.S. company, even if you don’t do business specifically in the European Union, if you are big enough, it’s very likely that one of your employees or customers does have that citizenship, and then you will fall under the umbrella coverage of GDPR.

Also, it’s very important to know how strict GDPR is and how severe the penalties are. And, I think, while in Europe many companies are scrambling to make sure they are ready for the rollout in May, for U.S. companies it’s still something on the back burner, something they’re not actively thinking about. So, it’ll be interesting to see this transition.

Cindy Ng: You’ve been working in the InfoSec space for a long time. A lot of people say that InfoSec is really just about compliance and dismiss the potential value of security.

Elena Khasanova: What we see is more and more companies that do not consider themselves IT companies actually becoming IT companies, in the sense that so much of their business is built on technology working well and being stable, without downtime and so on. And then, with that comes security. I mean, you hear a lot now about Facebook and how well, or not so well, they actually kept customers’ data. That makes more and more companies, at the business level, at the CEO level, think about, “Okay, how is my IT working? Do we have backups? Do we have a redundant network so that, if an outage happens, the airline will continue to function? And how secure is all of that, with so many companies being hit with data security breaches recently?”
And finally, we see CEOs actually losing their jobs because of that.

I think the role of InfoSec has been greatly elevated, and now InfoSec representatives participate in very high-level business discussions, because otherwise, if you ignore that, you will have an impact on your core business.

Cindy Ng: So it sounds like you’ve been keeping really busy with lots of data breaches and fixing global group access. Outside of work, what do you enjoy doing?

Elena Khasanova: I jokingly say on my LinkedIn profile that I bore people to death at parties with data security conversations. And maybe, yes, that does happen. I also love to travel around the world. I used to travel a lot more before I had kids, but we are trying to involve our kids. Well, I have one toddler, 16 months old, and one on the way. Very little, but she already has her Global Entry, and she’s been on, I think, 12 flights now. So, yes, we actually had to take her to the airport for a clearance interview with immigration officials, and she did very well. She can now bypass the immigration lines on entry to the country. And yeah, we hope to maintain, somewhat, our travel lifestyle.

I used to travel so much that I actually ran out of space in my passport within two years of getting it, and I had to send it back to the officials to add more pages.

Cindy Ng: What were the last three places you went to?

Elena Khasanova: So, the last three places. We just went to Bermuda with the whole family. Before that, we went to Guadeloupe, in the French West Indies, which wasn’t on my radar until “The Wall Street Journal” ran an article about it and Norwegian Airlines opened a direct flight there. I highly recommend it. It’s one of those areas that doesn’t have any five-star resort hotels, but it’s a great place to go, and in four hours from New York City, anyway, you’ll be there. And prior to that, it was probably… oh, it was Scotland. Also a new country for me, but I went there for work. Every time I go for work, I do try to sneak out and find a few hours to at least go for a walk or a hike, or explore a castle, like I did in Scotland. So that was a lot of fun.

Cindy Ng: Oh, beautiful. Yeah, I hear the landscapes there are gorgeous. Sounds like you’ve had a very rewarding career thus far, and I wish you much success.

Elena Khasanova: Thank you very much.

Support for the Inside Out Security Show and the following message come from Varonis. A Varonis Data Risk Assessment doesn’t take long. A 90-minute software install lets you map access to your directory services, classify files to discover what’s sensitive, and start monitoring and analyzing user behavior. If you want to turn on the lights, Varonis can help. Visit info.varonis.com/podcast and get a free data risk assessment.

[Podcast] Dr. Wolter Pieters on Information Ethics, Part One




In part one of my interview with Delft University of Technology’s assistant professor of cyber risk, Dr. Wolter Pieters, we learn about the fundamentals of ethics as they relate to new technology, starting with the trolley problem. A thought experiment on ethics, it’s an important lesson in the world of self-driving cars and the course of action a computer on wheels would have to take when faced with potentially life-threatening consequences.

Wolter also takes us through the potential for power imbalances when some stakeholders have far more access to information than others. That led us to ask: is technology morally neutral? Where and when does one’s duty to prevent misuse begin and end?


Wolter Pieters: My name is Wolter Pieters. I have a background in both computer science and philosophy of technology. I’m very much interested in studying cyber security from an angle that either goes a bit more towards the social science, so, why do people behave in certain ways in the cyber security space. But also more towards philosophy and ethics, so, what would be reasons for doing things differently in order to support certain values.

Privacy, but then again, I think privacy is a bit overrated. This is really about power balance, because everything we do in security will give some people access and exclude other people, and that’s a very fundamental thing. It’s basically a power balance that we embed into technology through security. And that is what fundamentally interests me in relation to security and ethics.

Cindy Ng: Let’s go back first and start with philosophical, ethical, and moral terminology. The trolley problem: it’s where you’re presented with a dilemma. You’re the conductor, and you see the trolley going down a track where it has the potential to kill five people. But if you pull a lever, you can make the trolley go onto another track, where it would kill one person. And that really is about: what is the most ethical choice, and what does ethics mean?

Wolter Pieters: Right. So, ethics generally deals with protecting values. And values, basically, refer to things that we believe are worthy of protection. So, those can be anything from health, privacy, biodiversity. And then it’s said that some values can be fundamental, others can be instrumental in the sense that they only help to support other values, but they’re not intrinsically worth something in and of themselves.

Ethics aims to come up with rules, guidelines, and principles that help us support those values in what we do. You can do this in different ways. You can try to look only at the consequences of your actions. In that case, clearly, in relation to the trolley problem, it’s better to kill one person than to kill five. If you simply do the calculation, you could say, “Well, I pull the switch and thereby reduce the total consequences.” But you could also argue from certain rules, such as “you shall not kill someone,” which would be violated if you pull the switch. If you don’t do anything, then five people would be killed, but you haven’t explicitly done anything, whereas if you pull the switch, you explicitly kill someone. From that angle, you could argue that you should not pull the switch.

So, this is very briefly an outline of different ways in which you could reason about what actions would be appropriate in order to support certain values, in this case, life and death. Now, this trolley problem is these days often cited in relation to self-driving cars, which also would have to make decisions about courses of action, trying to minimize certain consequences, etc. So, that’s why this has become very prominent in the ethics space.

Cindy Ng: So, you’ve talked about a power imbalance. Can you elaborate and provide an example of what that means?

Wolter Pieters: What we see in cyberspace is that there are all kinds of actors, stakeholders, that gather lots of information. There are governments interested in doing types of surveillance in order to catch the terrorists amongst the innocent data traffic. There are content providers that give us all kinds of nice services, but at the same time we pay with our data, and they make profiles out of it and offer targeted advertisements, etc. And at some point, some companies may be able to make better predictions than even our governments can. So, what does that mean? In the Netherlands, today actually, there’s a referendum regarding new powers for the intelligence agencies to do types of online surveillance, so there’s a lot of discussion about that.

So, on the one hand, we all agree that we should try to prevent terrorism. On the other hand, this is also a relatively easy argument for claiming access to data: “Hey, we can’t allow these terrorist attacks, so we need all your data.” It’s very political. And this also makes it possible to leverage security as an argument to claim access to all kinds of things.

Cindy Ng: I’ve been drawn to ethics and the dilemmas of our technology, and because I work at a data security company, you learn about privacy regulations: GDPR, HIPAA, SOX compliance. At their core, they are about ethics and a moral standard of behavior. Can you address the tension between ethics and technology?

The best thing I read lately was a Bloomberg subhead that said that ethics don’t scale. Ethics is such a core value, but at the same time, technology is what drives economies, and then add the element of a government overseeing it all.

Wolter Pieters: There are a couple of issues here. One that’s often cited is that ethics and law seem to lag behind our technological achievements. We always have to wait for new technology to get out of hand before we start thinking about ethics and regulation. In a way, you could argue that’s the case for internet-of-things developments, where manufacturers have been making their products smart for quite a while now. We suddenly realized that all of these things have security vulnerabilities, and they can become part of botnets of cameras that can then be used to launch distributed denial-of-service attacks on our websites, etc. Only now are we starting to think about what is needed to make sure these devices are securable at some level. Can they be updated? Can they be patched? In a way, it already seems to be too late. So, that is the argument that ethics is lagging behind.

On the other hand, there’s also the point that ethics and norms are always, in a way, embedded in technologies. Again, in the security space, however you design a technology, it will always enable certain kinds of access and disable other kinds of access. So, there’s always this inclusion and exclusion going on with new digital technologies. In that sense, ethics is always already present in a technology. And I’m not sure whether it should be said that ethics doesn’t scale. Maybe the problem is rather that it scales too well, in the sense that, when we design a piece of technology, we can’t really imagine how things are going to work out if the technology is being used by millions of people. This holds for a lot of these elements.

The internet, when it was designed, was never conceived as a tool that would be used by billions. It was a network for research purposes, to exchange data and so on. The same goes for Facebook: it was never designed as a platform for an audience like this. Which means, in a sense, that the norms initially embedded into those technologies do scale. And if, for example, you don’t embed security into the internet from the beginning and then you scale it up, it becomes much more difficult to change later on. So, ethics does scale, but maybe not in the way we want it to.

Cindy Ng: So, you mentioned Facebook. And Facebook is not the only tech company that designs systems allowing data to flow to so many third parties. When people use that data in a nefarious way, the tech company can respond by saying, “It’s not a data breach. It’s how things were designed to work, and people misused it.” Why does that response feel so unsettling? I also like what you said in the paper you wrote, that we’re tempted to consider technology as morally neutral.

Wolter Pieters: There’s always this idea of technology being kind of a hammer, right? I need a hammer to drive in the nail, so it’s just a tool. Now, it has been discussed for a while that information technology will always have some kind of side effects. We’ve learned that technologies pollute the environment, that technologies cause safety hazards, nuclear incidents, etc. And in all of these cases, when something goes wrong, there are people who designed or operate the technology who could potentially be blamed for those things going wrong.

Now, in the security space, we’re dealing with the intentional behavior of third parties. They can be hackers; they can be people who misuse the technology. And then suddenly it becomes very easy for those designing or operating the technology to point to those third parties as the ones to blame. You know, like, “Yeah, we just provide the platform. They misused it. It’s not our fault.” But the point is, if you follow that line of reasoning, you wouldn’t need to do any kind of security. Just say, “Well, I made a technology that has some useful functions, and, yes, there are these bad guys that misuse my functionality.”

On the one hand, it seems natural to blame the bad guys or the misusers. On the other hand, if you only follow that line of reasoning, then nobody would need to do any kind of security. So, this means you can’t really get away with that argument in general. Then, of course, with specific cases it becomes more of a gray area: where does your duty to prevent misuse stop? And then you get into the question, okay, what is an acceptable level of protection and security?

But also, of course, the business models of these companies involve giving access to some parties, which the end users may not be fully aware of. And this has to do with security always being about: who are the bad guys? Who are the threats? Some people have different ideas about who the threats are than others. So, if a company gets a request from the intelligence services, like, “Hey, we need your data because we would like to investigate this suspect,” is that acceptable, or do some people see that as a threat as well? So, the labeling of the threats, are the terrorists the threats? The intelligence agencies? The advertising companies? All of this matters in terms of what you would consider acceptable or not from a security point of view.

Within that space, it is often not very transparent to people what can or cannot be done with their data. European legislation, in particular, is trying to require people’s consent in order to process their data in certain ways. In principle, that seems like a good idea. In practice, consent is often given without paying much attention to the exact privacy policies, because people can’t be bothered to read all of that. And in a sense, maybe that’s the rational decision, because it would take too much time.

So, that also means that, if we try to solve these problems by letting individuals give consent to certain ways of processing their data, this may lead us to a situation where, individually, everybody just clicks away the messages because, for them, it’s rational: “Hey, I want this service, and I don’t have time to be bothered with all this legal stuff.” But on a societal level, we are creating a situation where certain stakeholders on the internet get a lot of power because they have a lot of data. This is the space in which decisions are being made.

Cindy Ng: We rely on technology. A lot of people use Facebook. We can’t just say goodbye to IoT devices, to Facebook, to any piece of technology, because, as you’ve said in one of your papers, technology will profoundly change people’s lives and our society. So instead of saying goodbye to these wonderful things we’ve created, how do we go about living our lives and conducting ourselves with integrity, with good ethics and morals?

Wolter Pieters: Yeah. That’s a good question. What currently seems to be happening is that, indeed, a lot of this responsibility is being allocated to the end users. You decide whether you want to join social media platforms or not. You decide what to share there. You decide whether to communicate with end-to-end encryption or not, etc. So, this means that a lot of pressure is being put on individuals to make those kinds of choices.

And the fundamental question is whether that approach makes sense, whether that approach scales, because the more technologies people are using, the more decisions they will have to make about how to use them. Now, of course, there are certain basic principles you can try to adhere to when doing your stuff online. On the security side: watch out for phishing emails, use strong passwords, etc. On the privacy side: don’t share stuff from other people that they haven’t agreed to, etc.

But all of that requires quite a bit of effort on the side of the individual. And at the same time, there seems to be pressure to share more and more stuff, for example, pictures of children who aren’t even able to consent to whether they want their pictures posted or not. So, in a sense, there’s a high moral demand on users, maybe too high. And that’s a great question.

In terms of acting responsibly online, if at some point you decide that we’re putting too high a demand on those users, the question becomes: are there ways to make it easier for people to act responsibly? And then you would end up with certain types of regulation that don’t only delegate responsibility back to individuals, for example by asking consent, but put really very strict rules on what, in principle, is allowed or not.

Now, that’s a very difficult debate, because you usually also end up with accusations of paternalism: “Hey, you’re putting all kinds of restrictions on what can or cannot be done online. Why shouldn’t people be able to decide for themselves?” On the other hand, people are being overloaded with decisions to the extent that it becomes impossible for them to make those decisions responsibly. This tension, leaving all kinds of decisions to the individual versus making some decisions at a collective level, is going to be a very fundamental issue in the future.

[Podcast] Who is in Control? The Data or Humans?




Self-quantified trackers made possible what was once nearly unthinkable: for individuals to gather data on their own activity levels in order to manage and improve their performance. Some have remarked that self-quantified devices can verge on over-management. As we wait for more research on the right dose of self-management, we’ll have to define for ourselves the right amount of self-quantifying.

Meanwhile, it seems that businesses are also struggling with a similar dilemma: measuring the right amount of risk and harm as it relates to security and privacy.

Acting FTC Chairman Maureen Ohlhausen said at a recent privacy and security workshop, “In making policy determinations, injury matters. … If we want to manage privacy and data security injuries, we need to be able to measure them.”

A clearly defined measurement of risk and harm will become ever more important as the business world embraces deep learning and, eventually, artificial intelligence.

Other articles discussed:

Panelists: Kilian Englert, Mike Thompson, Kris Keyser