

[Podcast] Dr. Wolter Pieters on Information Ethics, Part Two





In part two of my interview with Delft University of Technology’s assistant professor of cyber risk, Dr. Wolter Pieters, we continue our discussion on transparency versus secrecy in security.

We also cover ways organizations can present themselves as trustworthy. How? Be very clear about managing expectations. Declare your principles so that end users can trust that you'll operate by the principles you advocate. Lastly, have a plan for what to do when something goes wrong.

And of course there's a caveat: Wolter reminds us that there's also a very important place in this world for ethical hackers. Why? Not all security issues can be solved during the design stage.

Transparency versus Secrecy

Wolter Pieters

My name is Wolter Pieters. I have a background in both computer science and philosophy of technology. I'm very much interested in studying cyber security from an angle that goes a bit more towards the social sciences, so: why do people behave in certain ways in the cyber security space? But also more towards philosophy and ethics, so: what would be reasons for doing things differently in order to support certain values?

Privacy, but then again, I think privacy is a bit overrated. This is really about power balance. Everything we do in security will give some people access and exclude other people, and that's a very fundamental thing. It's basically about a power balance that, through security, we embed into technology. And that is what fundamentally interests me in relation to security and ethics.

Cindy Ng

How do we live now in a world where you just don't know whether or not organizations or governments are behaving in a way that's trustworthy?

Wolter Pieters

You know, transparency versus secrecy is a very important debate within the security space. This already starts out very fundamentally from the question, "Should methods for protecting information be publicly known, or should they be kept secret because otherwise we may be giving too much information away to hackers?" So, this is a very fundamental thing, and in terms of encryption there's already the principle, "Hey, encryption algorithms should be publicly known, because otherwise we can't even tell how well our information is being protected by means of that encryption, and only the keys used in the encryption should be kept secret." This is called Kerckhoffs's principle. It is very old in information security, and a lot of the current encryption algorithms actually adhere to that principle. We've also seen encryption algorithms not adhering to that principle.

So, algorithms that were secrets, trade secrets, etc., being broken the very moment the algorithm became known. So, in that sense, I think most researchers would agree this is good practice. On the other hand, it seems that there's also a certain limit to what we want to be transparent about there. In terms of security controls, we're not giving away every single thing governments do in terms of security online. So, there is some level of security by obscurity there, and more generally: to what extent is transparency a good thing? This again ties in with who is a threat. I mean, we have the whole WikiLeaks endeavor, and some people will say, "Well, this is great. The government shouldn't be keeping all that stuff secret, so it's great for trust that this is now all out in the open." On the other hand, you could argue that all this is actually a threat to trust in the government, so this form of transparency would be very bad for trust.

So, there's clearly a tension there. Some level of transparency may help people trust in the protections embedded in the technology and in the actors that use those technologies online. But on the other hand, if there's too much transparency, all the nitty-gritty details may actually decrease trust. You see this all over the place. We've seen it with electronic voting as well. If you provide some level of explanation of how certain technologies are being secured, that may help. If you provide too much detail, people won't understand it and it will only increase distrust. There is a kind of golden middle there in terms of how much explanation you should give to make people trust in certain forms of security, encryption, etc. And again, in the end, people will have to rely on experts. With physical forms of security, physical ballot boxes, it's possible to explain how these work and how they are being secured. With digital, that becomes much more complicated, and most people will have to trust the judgment of experts that these forms of security are actually good, if the experts believe so.
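(A quick aside for developers: Kerckhoffs's principle, mentioned above, is easy to make concrete. The toy cipher below is a hypothetical sketch of ours, not anything Pieters describes. The whole algorithm is public, in line with the principle, and any security it has rests solely on the secrecy of the key. It uses HMAC-SHA256 from the Python standard library as a keystream generator and is for illustration only, not production use.)

```python
# Kerckhoffs's principle, illustrated: this entire algorithm is public;
# only the key is secret. Toy stream cipher -- illustration only.
import hmac
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from the secret key and a public nonce."""
    out = b""
    counter = 0
    while len(out) < length:
        block = hmac.new(key, nonce + counter.to_bytes(8, "big"),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    # XOR plaintext with the keystream; XORing again undoes it,
    # so decryption is the exact same operation.
    ks = keystream(key, nonce, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

decrypt = encrypt  # symmetric: same operation both ways

msg = b"only the key is secret"
key = b"a 32-byte secret key goes here!!"
nonce = b"unique-nonce"
ct = encrypt(key, nonce, msg)
assert decrypt(key, nonce, ct) == msg  # round-trips with the right key
assert ct != msg                       # and the ciphertext differs
```

Publishing this code gives an attacker nothing useful without the key, which is exactly the property Kerckhoffs argued for; ciphers whose security depended on keeping the algorithm itself secret, as Pieters notes, tended to break as soon as the algorithm leaked.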

What Trustworthy Organizations Do Differently

Cindy Ng

What's something an organization can do in order to establish itself as a trustworthy, morally sound, ethical organization?

Wolter Pieters

I think the most important thing that companies can do is be very clear in terms of managing expectations. So, a couple of examples there. If, as a company, you decide to provide end-to-end encryption for communications, the people that use your chat app to exchange messages get the assurance that the messages are encrypted between their device and the device of the one that they're communicating with. And this is a clear statement: "Hey, we're doing it this way." That also means that you shouldn't have any backdoors or means to give this communication away to the intelligence agencies anyway, because if this is your standpoint, people need to be able to trust in that. Similarly, if you are running a social network site and you want people to trust in your policies, then you need to be crystal clear.

Not only that it's possible to change your privacy settings, to regulate the access that other users of the social networking service have to your data, but at the same time you need to be crystal clear about how you, as a social network operator, are using that kind of data. Because sometimes I get the impression that the big internet companies are offering all kinds of privacy settings which give people the impression that they can do a lot in terms of their privacy. And yes, this is true for inter-user data access, but the provider still sees everything. This seems to be a way of framing privacy in terms of inter-user data access. Whereas I think it's much more fundamental what these companies can do with all the data they gather from all their users, and what that means in terms of their power and the position that they get in this whole arena of cyberspace.

So, managing expectations. I mean, there are all kinds of different standpoints, based on different ethical theories, based on different political points of view, that you could take in this space. If you want to behave ethically, then make sure you list your principles, you list what you do in terms of security and privacy to adhere to those principles, and make sure that people can actually trust that this is also what you do in practice. And also make sure that you know exactly what you're going to do in case something goes wrong anyway. We've seen too many breaches where the responses by the companies were not quite up to standard, in terms of delaying the announcement of the breach, for example. So it's crucial to not only do some prevention in terms of security and privacy but also know what you're going to do in case something goes wrong.

Doomsday Scenarios

Cindy Ng

Yeah, you say that if an IoT device gets created, and the makers get their product to market first and plan to fix security and privacy later, that's too late. Is it sort of like, "We're doomed already and we're just sort of managing the best way we know how"?

Wolter Pieters

In a way, it’s a good thing when we are nervous about where our society is going because in history at moments where people weren’t nervous enough about where society was going, we’ve seen things go terribly wrong. So, in a sense we need to get rid of the illusion that we can easily be in control or something like that because we can’t.

The same goes for elections: there is no neutral space from which people can cast their vote without being influenced, and we've seen in recent elections that technology is actually playing more and more of a role in how people perceive political parties and how they make decisions in terms of voting. So, it's inevitable that technology companies have a role in those elections, and that's also what they need to acknowledge.

And then of course, and I think this is a big question that needs to be asked: "Can we prevent the situation in which the power of certain online stakeholders, whether those are companies or nation states or whatever, gets too great? Can we prevent a situation in which they get so much power that they are able to influence our governments, either through elections or through other means?" That's a situation that we really don't want to be in, and I'm not pretending that I have a crystal clear answer there, but this is something that at least we should consider as a possible scenario.

And then there are all these doomsday scenarios with a Cyber Pearl Harbor, and I'm not sure whether these doomsday scenarios are the best way to think about this, but we should also not be naive and think that all of this will blow over, because maybe indeed we have already been giving away too much power in a sense. So, what we should do is fundamentally rethink the way we think about security and privacy, away from, "Oh, damn, my photos are, I don't know, in the hands of whoever." That's not the point. It's about the scale at which certain actors either get their hands on data or are able to influence lots of individuals. So, again, scale comes in there. It's not about our individual privacy; it's about the power that these stakeholders get by having access to the data, or by being able to influence lots and lots of people, and that's what the debate needs to be about.

Cindy Ng

Whoever has the data has power, is what you’re getting at.

Wolter Pieters

Whoever has the data, and in a sense that data can then, again, be used to influence people in a targeted way. If you know that somebody's interested in something, you can try to influence their behavior by referring to the thing that they're interested in.

Cindy Ng

That’s only if you have data integrity.

Wolter Pieters

Yes. Yes, of course. But on the other hand, a little bit of noise in the data doesn't matter too much, because if you have data that's more or less correct, you can still achieve a lot.

Ethical Hackers Have An Important Role

Cindy Ng

Anything that I didn’t touch upon that you think is important for our listeners to know?

Wolter Pieters

The one thing that I think is critically important is the role that ethical hackers can have in keeping people alert, in a way maybe even changing the rules of the game. Because, in the end, I also don't think that all security issues can be solved in the design of technology, and it's critically important that when technologies are being deployed, people keep an eye on issues that may have been overlooked in the design stage of those technologies. We need some people that are paying attention and will alert us to issues that may emerge.

Cindy Ng

It's a scary role to be in, though, if you're an ethical hacker, because what if the government comes around and accuses you of being an unethical hacker?

Wolter Pieters

Yeah. I think that's an issue, but if that's going to be happening, if people are afraid to play this role because legislation doesn't protect them enough, then maybe we need to do something about that. If we don't have people that point us to essential weaknesses in security, then what will happen is that those issues will be kept secret and that they will be misused in ways that we don't know about, and I think that's a much worse situation to be in.


Cindy Ng

Cindy is the host of the Inside Out Security podcast.

