Tag Archives: nist

NIST 800-53: Definition and Tips for Compliance


NIST sets the security standards for federal agencies and their contractors – and given the evolving threat landscape, NIST is influencing data security in the private sector as well. NIST SP 800-53 is structured as a set of security guidelines, designed to prevent the major security issues that make headlines nearly every day.

NIST SP 800-53 Defined

The National Institute of Standards and Technology – NIST for short – is a non-regulatory agency of the U.S. Commerce Department, tasked with researching and establishing standards across all federal agencies. NIST SP 800-53 defines the standards and guidelines for federal agencies to architect and manage their information security systems. It was established to provide guidance for the protection of agencies’ and citizens’ private data.


Federal agencies must follow these standards, and the private sector should follow the same guidelines.

NIST SP 800-53 breaks the guidelines up into three minimum security baselines, applied across 18 different control families.

Minimum Security Baselines:

  • High-Impact Baseline
  • Moderate-Impact Baseline
  • Low-Impact Baseline

Control Families:

  • Access Control (AC)
  • Awareness and Training (AT)
  • Audit and Accountability (AU)
  • Security Assessment and Authorization (CA)
  • Configuration Management (CM)
  • Contingency Planning (CP)
  • Identification and Authentication (IA)
  • Incident Response (IR)
  • Maintenance (MA)
  • Media Protection (MP)
  • Physical and Environmental Protection (PE)
  • Planning (PL)
  • Personnel Security (PS)
  • Risk Assessment (RA)
  • System and Services Acquisition (SA)
  • System and Communications Protection (SC)
  • System and Information Integrity (SI)
  • Program Management (PM)

What’s the Purpose of NIST SP 800-53?

NIST SP 800-53 sets basic standards for information security policies for federal agencies – it was created to heighten the security (and security policy) of information systems used in the federal government.

The overall idea is that federal organizations first determine the security category of their information system based on FIPS Publication 199, Standards for Security Categorization of Federal Information and Information Systems — rating the potential impact of a loss of confidentiality, integrity, or availability as low, moderate, or high.

NIST SP 800-53 then helps explain which standards apply to each goal – and provides guidance on how to implement them. NIST SP 800-53 does not define any required security applications or software packages, instead leaving those decisions up to the individual agency.
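The FIPS 199 categorization step lends itself to a short sketch: each security objective gets an impact rating, and the overall system category is the high-water mark of the three. This is a simplified illustration of the rule, not text from the standard:

```python
# Simplified illustration of FIPS 199 security categorization:
# the overall category is the high-water mark (maximum) of the
# impact ratings for confidentiality, integrity, and availability.
IMPACT_ORDER = {"low": 0, "moderate": 1, "high": 2}

def security_category(confidentiality: str, integrity: str, availability: str) -> str:
    """Return the overall impact level for an information system."""
    ratings = (confidentiality, integrity, availability)
    return max(ratings, key=lambda r: IMPACT_ORDER[r])

print(security_category("low", "moderate", "low"))  # moderate
print(security_category("low", "low", "high"))      # high
```

The resulting category then drives which baseline of SP 800-53 controls applies.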

NIST has iterated on the standards since their original draft to keep up with the changing world of information security, and SP 800-53 is now in its 4th revision, dated January 22, 2015. The 5th revision is currently open for comments – stay tuned for updates.

Benefits of NIST SP 800-53

NIST SP 800-53 is an excellent roadmap to covering all the basics for a good data security plan. If you establish policies, procedures, and applications to cover all 18 control families, you will be in excellent shape.

Once you have the baseline achieved, you can further improve and secure your system by adding additional software, more stringent requirements, and enhanced monitoring.

Data security, like NIST SP 800-53, is evolving rapidly. A data security team needs to constantly look for more ways to reduce the risk of a data breach and to protect their data from insider threats and malware. The Varonis Data Security Platform maps to many of the basic requirements for NIST, and reduces your overall risk profile throughout the implementation process and into the future.

NIST 800-53 Compliance Best Practices


Implement these basic data security principles to work towards NIST 800-53 compliance:

  • Discover and Classify Sensitive Data
    Locate and secure all sensitive data
    Classify data based on business policy
  • Map Data and Permissions
    Identify users, groups, folder and file permissions
    Determine who has access to what data
  • Manage Access Control
    Identify and deactivate stale users
    Manage user and group memberships
    Remove Global Access Groups
    Implement a least privilege model
  • Monitor Data, File Activity, and User Behavior
    Audit and report on file and event activity
    Monitor for insider threats, malware, misconfigurations and security breaches
    Detect security vulnerabilities and remediate
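
As a rough sketch of the “Remove Global Access Groups” item above, here’s what a global-access check boils down to (the ACL snapshot structure here is hypothetical, not a Varonis API):

```python
# Hypothetical ACL snapshot: folder path -> set of groups with access.
# Flag any folder whose ACL contains a well-known global access group.
GLOBAL_GROUPS = {"Everyone", "Authenticated Users", "Domain Users"}

def find_global_access(acls: dict) -> list:
    """Return folders exposed through global access groups, sorted by path."""
    return sorted(path for path, groups in acls.items()
                  if groups & GLOBAL_GROUPS)

acls = {
    r"\\fs01\finance": {"Finance-RW", "Everyone"},
    r"\\fs01\legal":   {"Legal-RO"},
}
print(find_global_access(acls))  # flags only the finance share
```

Every folder this kind of check flags is a candidate for tighter, purpose-built groups.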

Compliance with NIST 800-53 is a perfect starting point for any data security strategy. The new GDPR regulations coming in May 2018 shine a spotlight on data security compliance guidelines in Europe, and changes are already coming to state legislation in the US that will implement additional requirements on top of NIST 800-53. As new legislation rolls out, achieving and maintaining compliance with the current baseline will make it much easier to meet updated requirements.

NIST sets the security standards for internal agencies – building blocks for common-sense security standards. Want to learn more? See how Varonis maps to NIST 800-53 and can help meet NIST standards.

Risk Management Framework (RMF): An Overview


The Risk Management Framework (RMF) is a set of criteria that dictate how United States government IT systems must be architected, secured, and monitored. Originally developed by the Department of Defense (DoD), the RMF was adopted by the rest of the US federal information systems in 2010.

Today, the RMF is maintained by the National Institute of Standards and Technology (NIST), and provides a solid foundation for any data security strategy.

What is the Risk Management Framework (RMF)?

The elegantly titled “NIST SP 800-37 Rev. 1” defines the RMF as a 6-step process to architect and engineer a data security process for new IT systems, and lays out the best practices and procedures each federal agency must follow when enabling a new system. In addition to the primary document SP 800-37, the RMF uses the supplemental documents SP 800-30, SP 800-53, SP 800-53A, and SP 800-137.

Risk Management Framework (RMF) Steps

Let’s walk through each of the six RMF steps in further detail below.


Step 1: Categorize Information System 

The Information System Owner assigns a security role to the new IT system based on mission and business objectives. The security role must be consistent with the organization’s risk management strategy.

Step 2: Select Security Controls 

The security controls for the project are selected and approved by leadership from the common controls, and supplemented by hybrid or system-specific controls. Security controls are the hardware, software, and technical processes required to fulfill the minimum assurance requirements as stated in the risk assessment. Additionally, the agency must develop plans for continuous monitoring of the new system during this step.

Step 3: Implement Security Controls 

Simply put: put the controls selected in step 2 into action. By the end of this step, the agency should have documented and proven that they have achieved the minimum assurance requirements and demonstrated the correct use of information system and security engineering methodologies.

Step 4: Assess Security Controls 

An independent assessor reviews and approves the security controls as implemented in step 3. If necessary, the agency will need to address and remediate any weaknesses or deficiencies the assessor finds, and then update the security plan accordingly.

Step 5: Authorize Information System

The agency must present an authorization package for risk assessment and risk determination. The authorizing agent then submits the authorization decision to all necessary parties.

Step 6: Monitor Security Controls

The agency continues to monitor the current security controls and update security controls based on changes to the system or the environment. The agency regularly reports on the security status of the system and remediates any weaknesses as necessary.
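The six steps above can be sketched as a simple ordered sequence, with monitoring as the continuous final state. This is an illustrative model only, not anything prescribed by SP 800-37:

```python
from enum import IntEnum

# The six RMF steps from NIST SP 800-37 Rev. 1, in order.
class RmfStep(IntEnum):
    CATEGORIZE = 1
    SELECT = 2
    IMPLEMENT = 3
    ASSESS = 4
    AUTHORIZE = 5
    MONITOR = 6

def next_step(current: RmfStep) -> RmfStep:
    """Advance through the framework; monitoring is continuous, so it repeats."""
    return current if current is RmfStep.MONITOR else RmfStep(current + 1)

print(next_step(RmfStep.ASSESS).name)   # AUTHORIZE
print(next_step(RmfStep.MONITOR).name)  # MONITOR
```

The self-loop on the last step captures the key point: monitoring never ends.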

How Can Varonis Help You Be Compliant?

NIST regulation and the RMF (in fact, many of the data security standards and compliance regulations) have three areas in common:

  • Identify your sensitive and at risk data and systems (including users, permissions, folders, etc.);
  • Protect that data, manage access, and minimize the risk surface;
  • Monitor and detect what’s happening on that data, who’s accessing it, and identify when there is suspicious behavior or unusual file activity.

The Varonis Data Security Platform enables federal agencies to manage (and automate) many of these practices and regulations required in the RMF.

DatAdvantage and Data Classification Engine identifies sensitive data on core data stores, and maps user, group, and folder permissions so that you can identify where your sensitive data is and who can access it. Knowing who has access to your data is a key component of the risk assessment phase, defined in NIST SP 800-53.

Data security analytics helps meet the NIST SP 800-53 requirement to constantly monitor your data: Varonis analyzes monitored data against dozens of threat models that warn you of ransomware, malware, misconfigurations, insider attacks, and more.

NIST SP 800-137 establishes guidelines to protect your data and requires that the agency meet a least privilege model. DatAdvantage, Automation Engine, and DataPrivilege streamline permissions and access management, and provide a way to more easily get to least privilege and automate permissions cleanup.

While the Risk Management Framework is complex on the surface, ultimately it’s a no-nonsense and logical approach to good data security practices at its core – see how Varonis can help you meet the NIST SP 800-37 RMF guidelines today.

A Few Thoughts on Data Security Standards


Did you know that the 462-page NIST 800-53 data security standard has 206 controls with over 400 sub-controls1?  By the way, you can gaze upon the convenient XML-formatted version here. PCI DSS is no slouch either, with hundreds of sub-controls in its requirements document. And then there’s the sprawling ISO 27001 data standard.

Let’s not forget about security frameworks, such as COBIT and NIST CSF, which are kind of meta-standards that map into other security controls. For organizations in health or finance that are subject to US federal data security rules, HIPAA and GLBA’s data regulations need to be considered as well. And if you’re involved in the EU market, there’s GDPR; in Canada, it’s PIPEDA; in the Philippines, it’s this, etc., etc.

There’s enough technical and legal complexity out there to keep teams of IT security pros, privacy attorneys, auditors, and diplomats busy till the end of time.

As a security blogger, I’ve also puzzled and pondered over the aforementioned standards and regulations. I’m not the first to notice the obvious: data security standards fall into patterns that make them all very similar.

Security Control Connections

If you’ve mastered and implemented one, then very likely you’re compliant with others as well. In fact, that’s one good reason for having frameworks. For example, with, say, NIST CSF, you can leverage your investment in ISO 27001 or ISA 62443 through their cross-mapped control matrix (below).

Got ISO 27001? Then you’re compliant with NIST CSF!
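A cross-mapped control matrix is, at bottom, a lookup table. Here’s a toy sketch with a small, purely illustrative subset of mappings; don’t treat these as the official CSF informative references:

```python
# Toy subset of a cross-mapped control matrix: one CSF subcategory ->
# informative references in other standards. Illustrative only.
CROSSWALK = {
    "PR.AC-4": {"ISO 27001": ["A.9.1.2"], "NIST 800-53": ["AC-2", "AC-6"]},
    "DE.CM-1": {"ISO 27001": ["A.12.4.1"], "NIST 800-53": ["SI-4"]},
}

def mapped_controls(csf_subcategory: str, standard: str) -> list:
    """Look up which controls in another standard satisfy a CSF subcategory."""
    return CROSSWALK.get(csf_subcategory, {}).get(standard, [])

print(mapped_controls("PR.AC-4", "NIST 800-53"))  # ['AC-2', 'AC-6']
```

With a table like this, evidence gathered for one standard can be reused as evidence for another.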

I think we can all agree that most organizations will find it impossible to implement all the controls in a typical data standard with the same degree of attention — when was the last time you checked the physical access audit logs to your data transmission assets (NIST 800-53, PE-3b)?

So to make it easier for companies and the humans that work there, some of the standards groups have issued further guidelines that break the huge list of controls into more achievable goals.

The PCI group has a prioritized approach to dealing with their DSS — they have six practical milestones that are broken into a smaller subset of relevant controls. They also have a best practices guide that groups — and this is important — security controls into three broader functional areas: assessment, remediation, and monitoring.

In fact, we wrote a fascinating white paper explaining these best practices, and how you should be feeding back the results of monitoring into the next round of assessments. In short: you’re always in a security process.

NIST CSF, which itself is a pared down version of NIST 800-53, also has a similar breakdown of its controls into broader categories, including identification, protection, and detection. If you look more closely at the CSF identification controls, which mostly involve inventorying your IT data assets and systems, you’ll see that the main goal in this area is to evaluate or assess the security risks of the assets that you’ve collected.

File-Oriented Risk Assessments

In my mind, the trio of assess, protect, and monitor is a good way to organize and view just about any data security standard.

In dealing with these data standards, organizations can also take a practical short-cut through these controls based on what we know about the kinds of threats appearing in our world today — not the threat landscape the standards’ authors faced when they wrote the controls!

We’re now in a new era of stealthy attackers who enter systems undetected, often through phishing emails, leveraging previously stolen credentials, or zero-day vulnerabilities. Once inside, they can fly under the monitoring radar with malware-free techniques, find monetizable data, and then remove or exfiltrate it.

Of course it’s important to assess, protect and monitor network infrastructure, but these new attack techniques suggest that the focus should be inside the company.

And we’re back to a favorite IOS blog theme. You should really be making it much harder for hackers to find the valuable data — like credit card or account numbers, corporate IP — in your file systems, and detect and stop the attackers as soon as possible.

Therefore, when looking at how to apply typical data security controls, think file systems!

For, say, NIST 800-53, that means scanning file systems, looking for sensitive data, examining the ACLs or permissions, and then assessing the risks (CM-8, RA-2, RA-3). For remediation or protection, this would involve reorganizing Active Directory groups and resetting ACLs to be more exclusive (AC-6). For detection, you’ll want to watch for unusual file system accesses that likely indicate hackers borrowing employee credentials (SI-4).
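Those steps can be sketched in miniature. Here’s a toy scanner for the first one, the “look for sensitive data” step (the regex patterns are deliberately simplistic placeholders, nothing like a production classifier):

```python
import re
from pathlib import Path

# Toy patterns for two kinds of sensitive data. Real classifiers use
# far richer rules (checksums, context, proximity) than these regexes.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan_file(path: Path) -> dict:
    """Count sensitive-pattern matches in a single text file."""
    text = path.read_text(errors="ignore")
    return {name: len(rx.findall(text)) for name, rx in PATTERNS.items()}
```

Point `scan_file` at each file from a directory walk and you have a crude version of the CM-8/RA-2 inventory step.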

I think the most important point is not to view these data standards as just an enormous list of disconnected controls, but instead to consider them in the context of assess-protect-monitor, and then apply them to your file systems.

I’ll have more to say on a data or file-focused view of data security controls in the coming weeks.

1 How did I know that NIST 800-53 has over 400 sub-controls? I took the XML file and ran these amazing few lines of PowerShell:

# Load the XML version of the NIST 800-53 controls
[xml]$books = Get-Content 800-53-controls.xml
# Count the individual control statements (sub-controls)
$books.controls.control | ForEach-Object { $_.statement.statement.number } | Measure-Object -Line

 

Data Security Compliance and DatAdvantage, Part III:  Protect and Monitor


This article is part of the series "Data Security Compliance and DatAdvantage".

At the end of the previous post, we took up the nuts-and-bolts issues of protecting sensitive data in an organization’s file system. One popular approach, least-privileged access model, is often explicitly mentioned in compliance standards, such as NIST 800-53 or PCI DSS. Varonis DatAdvantage and DataPrivilege provide a convenient way to accomplish this.

Ownership Management

Let’s start with DatAdvantage. We saw last time that DA provides graphical support for helping to identify data ownership.

If you want to get more granular than just seeing who’s been accessing a folder, you can view the actual access statistics of the top users with the Statistics tab (below).

This is a great help in understanding who is really using the folder. The ultimate goal is to find the true users, and remove extraneous groups and users who perhaps needed occasional access but not as part of their job role.

The key point is to first determine the folder’s owner — the one who has the real knowledge and wisdom of what the folder is all about. This may require some legwork on IT’s part in talking to the users, based on the DatAdvantage stats, and working out the real chain of command.

Once you use DatAdvantage to set the folder owners (below), these more informed power users, as we’ll see, can independently manage who gets access and whose access should be removed. The folder owner will also automatically receive DatAdvantage reports, which will help guide them in making future access decisions.

There’s another important point to make before we move on. IT has long been responsible for provisioning access, without knowing the business purpose. Varonis DatAdvantage assists IT in finding these owners and then giving them the access granting powers.

Anyway, once the owner has done the housekeeping of paring and removing unnecessary folder groups, they’ll then want to put into place a process for permission management. Data standards and laws recognize the importance of having security policies and procedures as part of an ongoing program – i.e., not something an owner does once a year.

And Varonis has an important part to play here.

Maintaining Least-Privileged Access

How do ordinary users, whose job role now requires them to access a managed folder, request permission from the owner?

This is where Varonis DataPrivilege makes an appearance. Regular users will need to bring this interface up (below) to formally request access to a managed folder.

The owner of the folder has a parallel interface from which to receive these requests and then grant or revoke permissions.

As I mentioned above, these security ideas for least-privilege-access and permission management are often explicitly part of compliance standards and data security laws. Building on my list from the previous post, here’s a more complete enumeration of controls that Varonis DatAdvantage supports:

  • NIST 800-53: AC-2, AC-3, AC-5, CM-5
  • NIST 800-171: 3.1.4, 3.1.5, 3.4.5
  • PCI DSS 3.x: 7.1, 7.2
  • HIPAA: 45 CFR 164.312 a(1), 164.308a(4)
  • ISO 27001: A.6.1.2, A.9.1.2, A.9.2.3, A.11.2.2
  • CIS Critical Security Controls: 14.4
  • New York State DFS Cybersecurity Regulations: 500.07

Stale Sensitive Data

Minimization is an important theme in security standards and laws. These ideas are best represented in the principles of Privacy by Design (PbD), which has good overall advice on this subject: minimize the sensitive data you collect, minimize who gets to see it, and minimize how long you keep it.

Let’s address the last point, which goes under the more familiar name of data retention. One low-hanging fruit to reducing security risks is to delete or archive sensitive data embedded in files.

This makes perfect sense, of course. This stale data can be, for example, consumer PII collected in short-term marketing campaigns, but now residing in dusty spreadsheets or rusting management presentations.

Your organization may no longer need it, but it’s just the kind of monetizable data that hackers love to get their hands on.

As we saw in the first post, which focused on Identification, DatAdvantage can find and identify file data that hasn’t been used after a certain threshold date.

Can the stale data report be tweaked to find stale data that is also sensitive?

Affirmative.

You need to add the hit count filter and set the number of sensitive data matches to an appropriate number.
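Stripped down, that report tweak is two conditions per file: stale and sensitive. Here’s a sketch over a hypothetical set of report rows (not the actual DatAdvantage schema):

```python
from datetime import date

# Hypothetical report rows: (path, last_access_date, sensitive_hit_count).
def stale_sensitive(rows, cutoff: date, min_hits: int = 1) -> list:
    """Files untouched since `cutoff` that also contain sensitive matches."""
    return [path for path, last_access, hits in rows
            if last_access < cutoff and hits >= min_hits]

rows = [
    (r"C:\Share\pvcs", date(2016, 3, 1), 42),
    (r"C:\Share\active", date(2017, 6, 1), 7),
    (r"C:\Share\old-but-clean", date(2016, 1, 5), 0),
]
print(stale_sensitive(rows, cutoff=date(2017, 1, 1)))  # only the pvcs folder
```

The hit-count filter is what turns a plain stale-data report into a stale-sensitive-data report.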

In my test environment, I discovered that the C:\Share\pvcs folder hasn’t been touched in over a year and has some sensitive data.

The next step is to visit the Data Transport Engine (DTE) available in DatAdvantage (from the Tools menu). It allows you to create a rule that will search for files to archive and delete if necessary.

In my case, my rule’s search criteria mirrors the same filters used in generating the report. The rule is doing the real heavy-lifting of removing the stale, sensitive data.

Since the rule is saved, it can be rerun again to enforce the retention limits. Even better, DTE can automatically run the rule on a periodic basis so that you never have to worry about stale sensitive data in your file system.

Implementing data retention policies is required by the following security standards and regulations:

  • NIST 800-53: SI-12
  • PCI DSS 3.x: 3.1
  • CIS Critical Security Controls: 14.7
  • New York State DFS Cybersecurity Regulations: 500.13
  • EU General Data Protection Regulation (GDPR): Article 25.2

Detecting and Monitoring

Following the order of the NIST higher-level security control categories from the first post, we now arrive at our final destination in this series, Detect.

No data security strategy is foolproof, so you need a secondary defense based on detection and monitoring controls: effectively you’re watching the system and looking for unusual activities.

Varonis, and specifically DatAlert, has a unique role in detection because its underlying security platform is based on monitoring file system activities.

By now everyone knows (or should know) that phishing and injection attacks allow hackers to get around network defenses as they borrow existing users’ credentials, and fully-undetectable (FUD) malware means they can avoid detection by virus scanners.

So how do you detect the new generation of stealthy attackers?

No attacker can avoid using the file system to load their software, copy files, and crawl a directory hierarchy looking for sensitive data to exfiltrate. If you can spot their unique file activity patterns, then you can stop them before they remove or exfiltrate the data.

We can’t cover all of DatAlert’s capabilities in this post — probably a good topic for a separate series! — but since it has deep insight into all file system information and events, and histories of user behaviors, it’s in a powerful position to determine what’s outside the normal range for a user account.

We call this user behavior analytics or UBA, and DatAlert comes bundled with a suite of UBA threat models (below).  You’re free to add your own, of course, but the pre-defined models are quite powerful as is. They include detecting crypto intrusions, ransomware activity, unusual user access to sensitive data, unusual access to files containing credentials, and more.
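At its simplest, behavioral alerting compares current activity against a per-user baseline. Here’s a bare-bones standard-deviation test to illustrate the idea (far cruder than a real UBA threat model):

```python
from statistics import mean, stdev

def is_anomalous(history: list, today: int, threshold: float = 3.0) -> bool:
    """Flag today's event count if it sits more than `threshold` standard
    deviations above the user's historical mean."""
    if len(history) < 2:
        return False  # not enough baseline to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today > mu
    return (today - mu) / sigma > threshold

baseline = [120, 98, 110, 105, 130, 101, 115]  # daily file events per user
print(is_anomalous(baseline, 112))    # False
print(is_anomalous(baseline, 5000))   # True
```

A user who normally touches about a hundred files a day and suddenly touches thousands, as in a ransomware encryption sweep, trips this kind of test immediately.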

All the alerts that are triggered can be tracked from the DatAlert Dashboard. IT staff can either intervene and respond manually, or set up scripts to run automatically — for example, to disable accounts.

If a specific data security law or regulation requires a breach notification to be sent to an authority, DatAlert can provide some of the information that’s typically required – files that were accessed, types of data, etc.

Let’s close out this post with a final list of detection and response controls in data standards and laws that DatAlert can help support:

  • NIST 800-53: SI-4, AU-13, IR-4
  • PCI DSS 3.x: 10.1, 10.2, 10.6
  • CIS Critical Security Controls: 5.1, 6.4, 8.1
  • HIPAA: 45 CFR 164.400-164.414
  • ISO 27001: A.16.1.1, A.16.1.4
  • New York State DFS Cybersecurity Regulations: 500.02, 500.16, 500.27
  • EU General Data Protection Regulation (GDPR): Article 33, 34
  • Most US states have breach notification rules


Data Security Compliance and DatAdvantage, Part II:  More on Risk Assessment

This article is part of the series "Data Security Compliance and DatAdvantage".

I can’t really overstate the importance of risk assessments in data security standards. It’s really at the core of everything you subsequently do in a security program. In this post we’ll finish discussing how DatAdvantage helps support many of the risk assessment controls that are in just about every security law, regulation, or industry security standard.

Last time, we saw that risk assessments were part of NIST’s Identify category. In short: you’re identifying the risks and vulnerabilities in your IT system. Of course, at Varonis we’re specifically focused on sensitive plain-text data scattered around an organization’s file system.

Identify Sensitive Files in Your File System

As we all know from major breaches over the last few years, poorly protected folders are where the action is for hackers: they’ve been focusing their efforts there as well.

The DatAdvantage 4a report I mentioned in the last post is used for finding sensitive data in folders with global permissions. Varonis uses various built-in filters or rules to decide what’s considered sensitive.

I counted about 40 or so such rules, covering credit card numbers, social security numbers, and various personal identifiers that are required to be protected by HIPAA and other laws. And with our new GDPR Patterns there are now filters — over 250! — covering phone numbers, licenses, and national IDs for EU countries.

Identify Risky and Unnecessary Users Accessing Folders

We now have a folder that is a potential source of data security risk. What else do we want to identify?

Users that have accessed this folder is a good starting point.

There are a few ways to do this with DatAdvantage, but let’s just work with the raw access audit log of every file event on a server, which is available in the 2a report. By adding a directory path filter, I was able to narrow down the results to the folder I was interested in.

So now we at least know who’s really using this specific folder (and sub-folders). Oftentimes this is a far smaller pool of users than has been enabled through the group permissions on the folders. In any case, this should be the basis of a risk assessment discussion: craft more tightly focused groups for this folder and set an owner who can then manage the content.
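The comparison at the heart of that discussion is a set difference: users permitted through group membership versus users who actually show up in the access audit log. A sketch with made-up names:

```python
def unused_access(permitted: set, observed: set) -> set:
    """Users who hold permissions on a folder but never accessed it --
    candidates for removal under a least-privilege review."""
    return permitted - observed

permitted = {"alice", "bob", "carol", "dave", "eve"}  # via AD group membership
observed = {"alice", "carol"}                          # from the access audit log
print(sorted(unused_access(permitted, observed)))  # ['bob', 'dave', 'eve']
```

Everyone in the difference is a candidate to lose access when the tighter group is crafted.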

In the Review Area of DatAdvantage, there’s more graphical support for finding users accessing folders, the percentage of the Active Directory group who are actually using the folder, as well as recommendations for groups that should be accessing the folder. We’ll explore this section of DatAdvantage further below.

For now, let’s just stick to the DatAdvantage reports since there’s so much risk assessment power bundled into them.

Another similar discussion can be based on using the 12l report to analyze folders containing sensitive data but have global access – i.e., includes the Everyone group.

There are two ways to think about this very obvious risk. You can remove the Everyone access on the folder. This can and likely will cause headaches for users. DatAdvantage conveniently has a sandbox feature that allows you to test this.

On the other hand, there may be good reasons the folder has global access, and perhaps there are other controls in place that would (in theory) help reduce the risk of unauthorized access. This is a risk discussion you’d need to have.

Another way to handle this is to see who’s copying files into the folder — maybe it’s just a small group of users — and then establish policies and educate these users about dealing with sensitive data.

You could then go back to the 1A report, and set up filters to search for only file creation events in these folders, and collect the user names (below).

Who’s copying files into my folder?

After emailing this group of users with follow-up advice and information on copying, say, spreadsheets with credit card numbers, you can run the 12l report the next month to see if any new sensitive data has made its way into the folder.
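Conceptually, that filter is just a selection over the audit log. Here’s a sketch with hypothetical event records (not the actual report schema):

```python
# Hypothetical audit events: dicts with user, action, and path.
def creators(events, folder: str) -> set:
    """Distinct users who created files under `folder`."""
    return {e["user"] for e in events
            if e["action"] == "create" and e["path"].startswith(folder)}

events = [
    {"user": "alice", "action": "create", "path": r"\\fs01\finance\q3.xlsx"},
    {"user": "bob",   "action": "read",   "path": r"\\fs01\finance\q3.xlsx"},
    {"user": "carol", "action": "create", "path": r"\\fs01\legal\nda.docx"},
]
print(creators(events, r"\\fs01\finance"))  # {'alice'}
```

The resulting user set is exactly the mailing list for that follow-up note about handling sensitive data.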

The larger point is that the DatAdvantage reports help identify the risks and the relevant users involved so that you can come up with appropriate security policies — for example, least-privileged access, or perhaps looser controls but with better monitoring or stricter policies on granting access in the first place. As we’ll see later on in this series, Varonis DatAlert and DataPrivilege can help enforce these policies.

In the previous post, I listed the relevant controls that DA addresses for the core identification part of risk assessment. Here’s a list of risk assessment and policy making controls in various laws and standards where DatAdvantage can help:

  • NIST 800-53: RA-2, RA-3, RA-6
  • NIST 800-171: 3.11.1
  • HIPAA:  164.308(a)(1)(i), 164.308(a)(1)(ii)
  • Gramm-Leach-Bliley: 314.4(b),(c)
  • PCI DSS 3.x: 12.1,12.2
  • ISO 27001: A.12.6.1, A.18.2.3
  • CIS Critical Security Controls: 4.1, 4.2
  • New York State DFS Cybersecurity Regulations: 500.03, 500.06

Thou Shalt Protect Data

A full risk assessment program would also include identifying external threats — new malware, new hacking techniques. With this new real-world threat intelligence, you and your IT colleagues should go back, re-adjust the risk levels you assigned initially, and then re-strategize.

It’s an endless game of cyber cat-and-mouse, and a topic for another post.

Let’s move to the next broad functional category, Protect. One of the critical controls in this area is limiting access to only authorized users. This is easier said than done, but we’ve already laid the groundwork above.

The guiding principles are typically least-privileged access and role-based access controls. In short: give appropriate users just the access they need to do their jobs or carry out their roles.

Since we’re now at a point where we are about to take a real action, we’ll need to shift from the DatAdvantage Reports section to the Review area of DatAdvantage.

The Review Area tells me who’s been accessing the legal\Corporate folder, which turns out to be a far smaller set than has been given permission through their group access rights.

To implement least-privilege access, you’ll want to create a new AD group for just those who really, truly need access to the legal\Corporate folder. And then, of course, remove the existing groups that have been given access to the folder.

In the Review Area, you can select and move the small set of users who really need folder access into their own group.

Yeah, this assumes you’ve done some additional legwork during the risk assessment phase — spoken to the users who accessed the legal\Corporate folder, identified the true data owners, and understood what they’re using this folder for.

DatAdvantage can provide a lot of support in narrowing down who to talk to. So by the time you’re ready to use the Review Area to make the actual changes, you already should have a good handle on what you’re doing.

One other key control, which we’ll discuss in more detail next time, is managing file permissions for folders.

Essentially, that’s where you find and assign data owners, and then ensure that there’s a process going forward that allows the owner to decide who gets access. We’ll show how Varonis has a key role to play here through both DatAdvantage and DataPrivilege.

I’ll leave you with this list of least permission and management controls that Varonis supports:

  • NIST 800-53: AC-2, AC-3, AC-6
  • NIST 800-171: 3.1.4, 3.1.5
  • PCI DSS 3.x: 7.1
  • HIPAA: 164.312 a(1)
  • ISO 27001: A.6.1.2, A.9.1.2, A.9.2.3
  • CIS Critical Security Controls: 14.4
  • New York State DFS Cybersecurity Regulations: 500.07
Continue reading the next post in "Data Security Compliance and DatAdvantage"


Data Security Compliance and DatAdvantage, Part I:  Essential Reports for Risk Assessment

This article is part of the series "Data Security Compliance and DatAdvantage".

Over the last few years, I’ve written about many different data security standards, data laws, and regulations. So I feel comfortable in saying there are some similarities in the EU’s General Data Protection Regulation, the US’s HIPAA rules, PCI DSS, NIST’s 800 family of controls and others as well.

I’m really standing on the shoulders of giants, in particular the friendly security standards folks over at the National Institute of Standards and Technology (NIST), in understanding the inter-connectedness. They’re the go-to people for our government’s own data security standards: for both internal agencies (NIST 800-53) and outside contractors (NIST 800-171). And through its voluntary Cybersecurity Framework for critical infrastructure, NIST is also influencing data security ideas in the private sector.

One of their big ideas is to divide security controls, which every standard and regulation has in one form or another, into five functional areas: Identify, Protect, Detect, Respond, and Recover. In short, give me a data standard and you can map their controls into one of these categories.

The NIST big picture view of security controls.

The idea of commonality led me to start this series of posts about how our own products, principally Varonis DatAdvantage, though not targeted at any specific data standard or law, can in fact help meet many key controls and legal requirements. The out-of-the-box reporting feature in DatAdvantage is a great place to start to see how all this works.

In this first blog post, we’ll focus on DA reporting functions that roughly cover the identify category. This is a fairly large area in itself, taking in asset identification, governance, and risk assessment.

Assets: Users, Files, and More

For DatAdvantage, users, groups, and folders are the raw building blocks used in all its reporting. However, if you want to view pure file system asset information, you can go to the following three key reports in DatAdvantage.

The 3a report gives IT staff a listing of Active Directory group membership. For starters, you could run the report on the all-encompassing Domain Users group to get a global user list (below). You can also populate the report with any AD property associated with a user (email, manager, department, location, etc.).

For folders, report 4f provides access paths, size, number of subfolders, and the share path.

Beyond a vanilla list of folders, IT security staff usually wants to dig a little deeper into the file structure in order to identify sensitive or critical data. What is critical will vary by organization, but generally they’re looking for personally identifiable information (PII), such as social security numbers, email addresses, and account numbers, as well as intellectual property (proprietary code, important legal documents, sales lists).

With DatAdvantage’s 4g report, Varonis lets security staff zoom into folders containing sensitive PII data, which is often scattered across huge corporate file systems. Behind the scenes, the Varonis classification engine has scanned files using PII filters for different laws and regulations, and rated the files based on the number of hits — for example, number of US social security numbers or Canadian driver’s license numbers.

The 4g report lists these sensitive files from highest to lowest “hit” count. By the way, this is the report our customers often run first and find very eye-opening — especially if they were under the impression that there’s ‘no way millions of credit card numbers could be found in plaintext’.
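To get a feel for how hit-count classification works, here’s a rough sketch in Python. The patterns and scoring are illustrative assumptions; the real classification engine’s filters are far more precise:

```python
import re
from pathlib import Path

# Illustrative PII patterns. Real classification filters are far more
# precise (checksums, context, per-regulation rule sets, etc.).
PII_PATTERNS = {
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def pii_hits(text: str) -> int:
    """Count total PII pattern matches in a piece of text."""
    return sum(len(p.findall(text)) for p in PII_PATTERNS.values())

def rank_files_by_hits(paths):
    """Return (path, hit_count) pairs, highest hit count first."""
    scores = []
    for path in paths:
        try:
            text = Path(path).read_text(errors="ignore")
        except OSError:
            continue  # unreadable files are skipped in this sketch
        scores.append((path, pii_hits(text)))
    return sorted(scores, key=lambda s: s[1], reverse=True)
```

Ranking by hit count is what lets you triage: the folder with millions of matches gets attention before the one with three.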

Assessing the Risks

We’ve just seen how to view nuts-and-bolts asset information, but the larger point is to use the file asset inventory to help security pros discover where an organization’s particular risks are located.

In other words, it’s the beginning of a formal risk assessment.

Of course, the other major part of assessment is to look (continuously) at the threat environment and then be on the hunt for specific vulnerabilities and exploits. We’ll get to that in a future post.

Now let’s use DatAdvantage for risk assessments, starting with users.

Stale user accounts are an overlooked scenario that has lots of potential risk. Essentially, user accounts are often not disabled or removed when an employee leaves the company or a contractor’s temporary assignment is over.

For the proverbial disgruntled employee, it’s not unusual for this former insider to still have access to his account. Or for hackers to gain access to a no-longer-used third-party contractor’s account and then leverage that to hop into their real target.

In DatAdvantage’s 3a report, we can produce a list of stale user accounts based on the last logon time that’s maintained by Active Directory.
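The staleness filter itself is easy to sketch, assuming you’ve already exported (username, last-logon) pairs from AD. The 90-day threshold and the data shape are my assumptions, not Varonis defaults:

```python
from datetime import datetime, timedelta

STALE_AFTER = timedelta(days=90)  # illustrative threshold

def stale_accounts(accounts, now=None):
    """Given (username, last_logon_datetime) pairs, return users who
    haven't logged on within the staleness window, i.e. candidates
    for disabling or removal. A last_logon of None (never logged on)
    counts as stale."""
    now = now or datetime.utcnow()
    return [user for user, last_logon in accounts
            if last_logon is None or now - last_logon > STALE_AFTER]
```

In practice you’d feed this from a directory query and review the list with HR before disabling anything.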

The sensitive data report that we saw earlier is the basis for another risk assessment report. We just have to filter on folders that have “everyone” permissions.

Security pros know from the current threat environment that phishing or SQL injection attacks allow an outsider to get the credentials of an insider. With no special permissions, a hacker would then have automatic access to folders with global permissions.

Therefore there’s a significant risk in having sensitive data in these open folders (assuming there are no other compensating controls).

DatAdvantage’s 4a report nicely shows where these files are.
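Conceptually, this kind of report is a cross-reference between classification results and folder ACLs. A toy version, where the group names and data shapes are my own assumptions:

```python
def open_sensitive_folders(sensitive_hits, folder_acls):
    """Flag folders that both contain sensitive data and grant access
    to a global group like 'Everyone'.

    sensitive_hits: dict of folder path -> classification hit count
    folder_acls:    dict of folder path -> set of groups with access
    """
    # Groups that effectively mean "anyone with credentials" (illustrative)
    GLOBAL_GROUPS = {"Everyone", "Authenticated Users", "Domain Users"}
    return sorted(
        folder
        for folder, hits in sensitive_hits.items()
        if hits > 0 and folder_acls.get(folder, set()) & GLOBAL_GROUPS
    )
```

The intersection of “sensitive” and “globally accessible” is exactly where remediation effort pays off first.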

Let’s take a breath.

In the next post, we’ll continue our journey through DatAdvantage by finishing up with the risk assessment area and then focusing on the Protect and Defend categories.

For those compliance-oriented IT pros and other legal-istas, here’s a short list of regulations and standards (based on our customers’ requests) that the above reports help support:

  • NIST 800-53: IA-2, CM-8
  • NIST 800-171: 3.5.1
  • HIPAA:  45 CFR 164.308(a)(1)(ii)(A)
  • GLBA: FTC Safeguards Rule (16 CFR 314.4)
  • PCI DSS 3.x: 12.2
  • ISO 27001: A.7.1.1
  • New York State DFS Cybersecurity Regulations: 500.02
  • EU GDPR: Records of Processing (Article 30), Security of Processing (Article 32) and Impact Assessments (Article 35)


The Federal Trade Commission Likes the NIST Cybersecurity Framework (and You Should Too)

Remember the Cybersecurity Framework that was put together by the folks over at the National Institute of Standards and Technology (NIST)?  Sure you do! It came about because the US government wanted to give the private sector, specifically the critical infrastructure players in transportation and energy, a proven set of data security guidelines.

The Framework is based heavily on NIST’s own 800-53, a sprawling 400-page set of privacy and security controls used within the federal government.

To make NIST 800-53 more digestible for the private sector, NIST reorganized and condensed the most important controls and concepts.

Instead of the original’s 18 broad control categories with zillions of subcontrols, the Cybersecurity Framework — check out the document — is broken up into just five functional categories – Identify, Protect, Detect, Respond, and Recover — with a manageable number of controls under these groupings.

Students and fans of NIST 800-53 will recognize some of the same two-letter abbreviations being used in the Cybersecurity Framework (see below).


NIST Cybersecurity: simplified functional view of security controls.

By the way, this is a framework. And that means you use the Framework for Improving Critical Infrastructure Cybersecurity – the official name — to map into your favorite data security standard.

Currently, the Framework supports mappings into (not surprisingly) NIST 800-53, but also the other usual suspects, including COBIT 5, SANS CSC, ISO 27001, and ISA 62443.

Keep in mind that the Cybersecurity Framework is an entirely voluntary set of guidelines—none of the infrastructure companies are required to implement it.

The FTC’s Announcement

Since this is such a great set of data security guidelines for critical infrastructure, could the Cybersecurity Framework also serve the same purpose for everyone else—from big box retailers to e-commerce companies?

The FTC thinks so! At the end of August, the FTC announced on its blog that it has given the Cybersecurity Framework its vote of approval.

Let me explain what this means. As a regulatory agency, the FTC is responsible for enforcing powerful regulations, including Gramm-Leach-Bliley, COPPA, and FCRA, as well as its core statutory function of policing “unfair or deceptive acts or practices.”

When dealing with data security or privacy related implications of the laws, the FTC needs a benchmark for reasonable security measures. Or as they put it, “the FTC’s cases focus on whether the company has undertaken a reasonable process to secure data.”

If a company follows the Cybersecurity Framework, is this considered implementing a reasonable process?

The answer is in the affirmative according to the FTC. Or in FTC bureaucratic-speak, the enforcement actions they’ve taken against companies for data security failings “align well with the Framework’s Core functions.”

Therefore if you identify risks (Identify), put in place security safeguards (Protect), continually monitor for threats (Detect), implement a breach response program (Respond), and have a way to restore functions after an incident (Recover), you’ll likely not hear from the FTC regulators.

By the way, check out their Start with Security, a common sense guide to data security, which contains some very Varonis-y ideas.

We approve!


The Essential Guide to Identifying Your Organization’s Most Sensitive Content

What do hackers want? If you answered money — always a safe bet — then you’d be right. According to the Verizon Data Breach Investigations Report (DBIR), financial gain is still the motivation for over 75% of the incidents it investigated.

A better answer to the above question is that hackers want data — either monetizable or sensitive content — that is scattered across large corporate file systems. These are the unencrypted user-generated files (internal documents, presentations, spreadsheets) that are part of the work environment. Or if not directly created by users, these files can be exported from structured databases containing customer accounts, financial data, sales projections, and more.

Our demand for this data has grown enormously and so have our data storage assets. Almost 90% of the world’s data was created over the last 2 years alone, and by 2020 data will increase by 4,300% — that works out to lots of spreadsheets!

Challenges of Data Security

Unfortunately, the basic tools that IT admins use to manage corporate content – often those that are bundled with the operating systems — are not up to the task of finding and securing the data.

While you’ll need outside vendors to help protect your data assets, it doesn’t necessarily mean there’s been agreement on the best way to do this. Sure, you can try to lock the virtual doors through firewalls and intrusion detection systems — simply preventing anyone from getting in.

Or you can take a more realistic approach and assume the hackers will get in.

Security From the Inside Out

What we’ve learned over the last few years after a string of successful attacks against well-defended companies is that it’s impossible to build an intrusion-proof wall around your data.

Why?

Hackers are being invited in by employees – they enter through the front door. The weapon of choice is phish mail: the attacker, pretending to represent a well-known brand (FedEx, UPS, Microsoft), sends an email to an employee containing a file attachment that appears to be an invoice or other business document.

When employees click, they are in fact launching the malware payload, which is embedded in the attachment.

Verizon’s DBIR has been tracking real-world hacker attack techniques for years. Social engineering, which includes phishing as well as other pretexting methods, has been exploding (see graph).


Social attacks are on the rise, according to the Verizon DBIR.

What Happens Once They’re in?

The DBIR team also points out that hackers can get in quickly (in just a few days or less). More often than not, IT security departments then take months to discover the attack and learn what was taken.

The secret to hackers’ stealthiness once they gain a foothold is that they use fully undetectable (FUD) malware. They can fly under the radar of virus scanners while collecting employee keystrokes, probing file content, and then removing or exfiltrating data.

Or if they don’t use a phishing attack to enter, they can find vulnerabilities in public facing web properties and exploit them using SQL injection and other techniques.

And finally, hackers have been quite good at guessing passwords due to poor password creation practices — they can simply log in as the employee.

Bottom line: the attackers enter and leave without triggering any alarms.

A strategy we recommend for real-world defense against this new breed of hackers is to focus on the sensitive data first and then work out your defenses and mitigations from that point — an approach known as “inside out security”.

Three Steps to Inside Out Security

Step One: Taking Inventory of Your IT Infrastructure and Data

Before you can create an inside out security mindset, a good first step is simply to take an inventory of your IT infrastructure.

It’s a requirement found in many data standards and best practices, such as ISO 27001, the Center for Internet Security Critical Security Controls, the NIST Critical Infrastructure Cybersecurity Framework (CSF), or PCI DSS.

Many standards have controls that typically go under the name of asset management or asset categorization. The goal is to force you to first know what’s in your system: you can’t protect what you don’t know about!

Along with the usual hardware (routers, servers, laptops, file server, etc.), asset categorization must also account for the digital elements —important software, apps, OSes, and, of course, data or information assets.

For example, the US National Institute of Standards and Technology (NIST) has its Critical Infrastructure Cybersecurity Framework, a voluntary guideline for protecting the IT of power plants, transportation, and other essential services. As with all frameworks, the CSF provides an overall structure into which various specific standards are mapped.

The first part of this framework has an “Identify” component, which includes asset inventory subcategories — see ID.AM-1 to ID.AM-6 — and content categorization — see ID.RA-1 and ID.RA-2.


NIST CSF controls for asset discovery.

Or if you look at PCI DSS 3.x, there are controls to identify storage hardware and, more specifically, sensitive cardholder data – see Requirement 2.4.


PCI DSS: Inventory your assets.

Step Two: Reducing Risk With Access Rights

Along with finding sensitive content, security standards and best practices have additional controls, usually under a risk assessment category, that ask you to look at the access rights of this data. The goal is to learn whether this sensitive data can be accessed by unauthorized users, and then to make adjustments to ensure that this doesn’t happen.

Again referring to the CSF, there’s a “Protect” function that has subcategories for access controls – see PR.AC-1 to PR.AC-4.

Specifically, there is a control for implementing least privilege access (PR.AC-4), which is a way to limit authorized access by granting minimum access rights based on job functions. It’s sometimes referred to as role-based access control, or RBAC.
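A least-privilege, role-based check is simple to sketch: access defaults to denied unless the role explicitly grants the permission. The roles and permissions below are purely illustrative:

```python
# Minimal RBAC sketch: each role grants only the smallest set of
# permissions its job function needs (least privilege).
ROLE_PERMISSIONS = {
    "hr_analyst": {"read:hr"},
    "hr_manager": {"read:hr", "write:hr"},
    "engineer":   {"read:code", "write:code"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Grant access only if the role explicitly includes the permission;
    unknown roles get nothing (deny by default)."""
    return permission in ROLE_PERMISSIONS.get(role, set())
```

The deny-by-default stance is the important design choice: adding access requires a deliberate grant, rather than removal requiring a deliberate revocation.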

You can find similar access controls in other standards as well.

The more important point is that it’s highly recommended you implement a continual process of looking at the data, determining risks, adjusting access controls, and taking other security measures. This is referred to as continual risk assessment.

Step 3: Data Minimization

In addition to identifying and limiting access, standards and best practices have additional controls to remove or archive PII and other sensitive data that’s no longer needed.

You can find this retention limit in PCI DSS 3.1 – “Keep cardholder data storage to a minimum” found in Requirement 3. The new EU General Data Protection Regulation (GDPR), a law that covers consumer data in the EU zone, also calls for putting a time limit on storing consumer data — it’s mentioned in the key data protection by design and by default section (Article 25).
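A minimal retention sweep might look like this sketch, assuming a flat 365-day policy keyed off modification time (a real policy would also consider data type, legal holds, and archive-versus-delete decisions):

```python
import time
from pathlib import Path

RETENTION_DAYS = 365  # illustrative policy, e.g. driven by a PCI-style rule

def files_past_retention(root: str, now=None):
    """List files whose modification time exceeds the retention window,
    i.e. candidates for archiving or secure deletion."""
    now = now or time.time()
    cutoff = now - RETENTION_DAYS * 86400  # 86400 seconds per day
    return [p for p in Path(root).rglob("*")
            if p.is_file() and p.stat().st_mtime < cutoff]
```

You’d typically route the output to an archive job or a reviewed deletion queue rather than deleting in place.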

Ideas for minimizing both data collection and retention as a way to reduce risk are also part of another best practice known as Privacy by Design, which is an important IT security guideline.

The Hard Part: Data Categorization at Scale

The first step, finding the relevant sensitive data and categorizing it, is easier said than done.

Traditionally, categorization of unstructured data has involved a brute force scanning of the relevant parts of the file system, matching against known patterns using regular expressions and other criteria, and then logging those files that match the patterns.

This process, of course, then has to be repeated — new files are being created all the time and old ones updated.

But a brute force approach would start completely from scratch for each scan — beginning at the first file in its list and continuing until the last file on the server is reached.

In other words, the scan doesn’t leverage any information from the last time it crawled through the file system. So if your file system has 10 million files that have remained unchanged, and one new file has been added since the last scan, then – you guessed it! – the next scan would have to examine 10 million plus one files.

A slightly better approach is to check the modification file times of the files and then only search the contents of those files that have been updated since the last scan. It’s the same strategy that an incremental backup system would use — that is, check the modification times and other metadata that’s associated with the file.
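Here’s a sketch of that incremental-backup style of scanning. It still walks the whole tree, but only flags files whose modification time postdates the last scan:

```python
import os

def changed_since(root: str, last_scan_time: float):
    """Walk the tree and return files modified after the last scan.
    This still touches every file's metadata, but skips re-reading
    content for unchanged files."""
    changed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > last_scan_time:
                changed.append(path)
    return changed
```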

Even so, this is a resource-intensive process — CPU and disk accesses — and with large corporate file systems in the tens to hundreds of terabytes, it may not be very practical. The system would still have to look at every file’s modification-time metadata.

A better idea is to use true incremental scanning. This means that you don’t check each file’s modification date to see if it has changed.

Instead, this optimized technique works from a known list of changed file objects provided by the underlying OS.  In other words, if you can track every file change event—information that an OS kernel has access to – then you can generate a list of just the file objects that should be scanned.
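A sketch of the event-driven approach: instead of walking the tree, fold a change feed (on Linux, for instance, the kernel can supply such events via inotify) into the minimal worklist of files to rescan. The event format here is my own assumption:

```python
def scan_worklist(events):
    """Fold a stream of file-change events, as (action, path) pairs
    from an OS change feed, into the minimal set of paths the
    classifier must rescan. Deleted files need no rescan."""
    pending = set()
    for action, path in events:
        if action in ("create", "update", "move_in"):
            pending.add(path)
        elif action in ("delete", "move_out"):
            pending.discard(path)
    return sorted(pending)
```

The win is that the work scales with the number of changes since the last scan, not with the total size of the file system.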

This is a far better approach than a standard (but slow) crawl of the entire file system.

To accomplish this, you’ll need access to the core metadata contained in the file system — minimally, file modification time stamps but ideally other fields related to user access and group access ids.

Solutions and Conclusions

Is there a way to do true incremental scanning as part of a data-centric risk assessment program?

Welcome to the Varonis IDU (Intelligent Data Use) Classification Framework!

Here’s how it works.

The Classification Framework is built on a core Varonis technology, the Varonis Metadata Framework. We have access to internal OS metadata and can track all file and directory events – creations, updates, copies, moves, and deletes. This is not just another app running on top of the operating system; instead, the Metadata Framework integrates at a low level with the OS without adding any appreciable overhead.

With this metadata, the Classification Framework can now perform speedy incremental scanning. The Framework scans only a small subset of file objects — just the ones that have been changed or newly created — thereby allowing it to jump directly to these files rather than having to scan the complete system.

The Varonis IDU Classification Framework is then able to quickly classify the file content using either our own Varonis classification engine or classification metadata from a third-party source, such as RSA.

The IDU Classification Framework works with our DatAdvantage product, which uses file metadata to determine who is the true owner of the sensitive content.

Ultimately, the Varonis solution enables data owners — the managers really in charge of the content and most knowledgeable about proper access rights — to set appropriate file permissions that reduce or eliminate data exposure risk.