
SC Media Feature: Writing Authorization Policies to Secure Big Data

“Enforcing authorization directly at the data level can be incredibly powerful as it could mean minimal or no changes to the applications that are accessing the data itself,” says Jonas Iggbom.

In the digital age, enterprises are accumulating and storing massive amounts of data. The more data an enterprise stores, the more critical it is to secure it. A lack of data security can lead to lost customer information, which in turn can lead to customer churn, financial losses and reputational damage. With so much data at stake, financial losses from poor big data security can exceed an enterprise's worst expectations.

When it comes to authorizing secure access to data, enterprises typically enforce authorization policies at different layers of the IT stack. However, enforcing authorization directly at the data level can be incredibly powerful, as it can mean minimal or no changes to the applications accessing the data. Regardless of which application is accessing the data, authorization is systematically controlled and consistently enforced.

For traditional relational databases, once configured, authorization policies are applied consistently across all incoming connections, regardless of the application endpoint. This allows a central policy to protect multiple databases from queries sent by various applications. However, securing big data stores differs from traditional relational database processing in several respects.

The challenge of big data

Big data infrastructures often comprise multiple loosely coupled components, and big data stores typically hold massive amounts of data from numerous sources, with multiple applications accessing them simultaneously. Ensuring secure access to sensitive data while maintaining regulatory compliance is far more complicated than for relational databases.

Attribute Based Access Control (ABAC) can enforce enterprise-wide access to big data stores based on business policies and regulations. With ABAC, access rights are granted to users through policies evaluated against attributes, protecting big data stores from being accessed by unauthorized users. But how do enterprises protect large amounts of extremely sensitive information, such as personally identifiable information (PII), personal health information (PHI) and financial information, from authorized users?
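To make the ABAC model concrete, here is a minimal sketch of how a decision combines attributes of the subject, the resource and the context rather than a static role list. The attribute names and the rule itself are illustrative assumptions, not policies from the article:

```python
# Minimal ABAC sketch: the decision is computed from attributes of the
# subject, the resource, and the request context -- not from a static
# role-to-resource mapping. All attribute names and the rule below are
# illustrative assumptions.

def abac_decision(subject: dict, resource: dict, context: dict) -> str:
    # Example rule: clinicians may read patient records only for
    # patients on their own care team, and only during an active shift.
    if (
        subject.get("role") == "clinician"
        and resource.get("type") == "patient_record"
        and resource.get("care_team") == subject.get("care_team")
        and context.get("on_shift", False)
    ):
        return "Permit"
    return "Deny"

decision = abac_decision(
    subject={"role": "clinician", "care_team": "oncology-3"},
    resource={"type": "patient_record", "care_team": "oncology-3"},
    context={"on_shift": True},
)
print(decision)  # Permit
```

In a real deployment this evaluation would sit in a central policy decision point so that every application querying the store gets the same answer for the same attributes.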

The answer is to author authorization policies with ABAC utilizing both dynamic data masking and dynamic data filtering. To understand how this works, let's look at an example from an industry with a massive amount of sensitive data at its disposal: the healthcare industry.

Writing authorization policies to protect sensitive information

Imagine an organization building a medical platform capable of gathering data from multiple healthcare organizations: hospitals, clinics, insurance companies and government agencies. This platform will help healthcare providers deliver better, more preventive care to patients.

To achieve this goal, they will need to collect medical data from many patients nationwide. That data will contain a large amount of extremely sensitive information, such as PII, PHI and financial information. This wealth of information can potentially be used to save lives, but at the same time it is a huge privacy concern and a potential liability. For the medical community to benefit from mining and analyzing the data, authorization policies must be written to ensure patients' rights are not breached. This is done using dynamic data masking and dynamic data filtering.

Data masking hides confidential data such as credit card numbers, email addresses, Social Security numbers and other privacy-sensitive information. A policy can be implemented that says, "deny access to view the full Social Security number; display only the last four digits." This is an example of dynamic data masking.
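A sketch of that last-four-digits policy is below. The record layout and the entitlement flag are assumptions for illustration; in a real system the flag would come from a policy decision rather than a hard-coded argument:

```python
# Dynamic data masking sketch: the full value is stored and queried as
# usual, but is masked in the result according to the caller's
# entitlement. The entitlement check is stubbed out here; a real
# deployment would ask a policy decision point instead.

def mask_ssn(ssn: str) -> str:
    """Replace all but the last four digits, e.g. ***-**-6789."""
    return "***-**-" + ssn[-4:]

def render_record(record: dict, may_view_full_ssn: bool) -> dict:
    out = dict(record)
    if not may_view_full_ssn:
        out["ssn"] = mask_ssn(record["ssn"])
    return out

row = {"patient": "Jane Doe", "ssn": "123-45-6789"}
print(render_record(row, may_view_full_ssn=False))
# {'patient': 'Jane Doe', 'ssn': '***-**-6789'}
```

The masking is "dynamic" because the same stored record can render fully or partially masked depending on who is asking, with no change to the underlying data.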

Dynamic data filtering can be used to remove from a big data store's query result set any records the user is not authorized to see. Examples include writing a policy that says, "only allow access to view patient records in a user's home county" or "only allow disease information to be accessed if user consent has been given."
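The sketch below applies both of those example policies to a result set. The row and user attributes are hypothetical, and in practice the filter is often pushed down into the query itself (for example, as an added WHERE clause) rather than applied after the fact:

```python
# Dynamic data filtering sketch: rows the caller is not entitled to see
# are removed from the result set before it is returned. Attribute
# names are illustrative assumptions.

def filter_records(rows: list[dict], user: dict) -> list[dict]:
    return [
        r for r in rows
        if r["county"] == user["home_county"]  # "home county" policy
        and r.get("consent_given", False)      # "consent given" policy
    ]

rows = [
    {"patient": "A", "county": "Cook",   "consent_given": True},
    {"patient": "B", "county": "Cook",   "consent_given": False},
    {"patient": "C", "county": "Harris", "consent_given": True},
]
print(filter_records(rows, user={"home_county": "Cook"}))
# Only patient A satisfies both policies.
```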

ABAC policies control exactly who gains access to what, where, when, why and how, protecting sensitive information. Using ABAC for transparent, dynamic access control to a big data store shared by multiple analytics applications lets enterprises combine the benefits of a single data warehouse with those of the traditional siloed approach, where each application has access only to its own data silo. It adds a dynamic layer of security between applications and big data stores, and it facilitates information sharing so that sensitive data can be shared securely with business units, trusted partners and regulatory bodies. This enables enterprises to use big data more securely to derive insights and meet business objectives.

Contributed by Jonas Iggbom, VP of sales engineering, Axiomatics.

Read the Full Article at SC Media.


Media Contact

Samantha Berno
Corporate Communications Manager
Axiomatics
samantha.berno@axiomatics.com
