Data Masking as Part of Your GDPR Compliant Security Posture
May 08, 2018

Nick Turner

With data breaches consistently in the news over the last several years, it is no wonder that data privacy has become such a hot topic, or that the European Union (EU) has put in place the General Data Protection Regulation (GDPR), which becomes enforceable on May 25, 2018, less than a month away!

GDPR applies to any company that collects or processes the personal data of EU data subjects, whether they are EU residents or visitors. It regulates how to protect an individual's Personally Identifiable Information (PII), which includes any data that could potentially be used to identify an individual, such as a name or e-mail address. And the fines for non-compliance are severe: up to 20 million euros or 4% of worldwide annual revenue from the prior financial year, whichever is higher.

While authorities will rely on customers reporting non-compliance and will focus first on the most serious violations, it is important to identify areas of risk and take appropriate action. GDPR stresses that software handling PII follow the principles of data protection by design and by default. One appropriate technical and organizational measure to achieve this is "pseudonymization."

Pseudonymization is an overarching term for obfuscation approaches, such as data masking, that intend to secure confidential information which directly or indirectly reveals an individual's identity.

Data masking is the ability to replace or obfuscate sensitive data with a non-sensitive equivalent. For example, rather than using credentials that reflect an individual's name, such as "nturner", you would use something like "xyz9876". This approach only works if, within the same application, the masked data cannot indirectly reveal an individual's identity through association with a captured IP address or e-mail address.
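As a minimal sketch of the credential-masking idea above, a name-based username can be replaced with a stable, non-identifying pseudonym. The function and key names here are hypothetical; the point is that the secret key lives outside the application holding the masked data, so the pseudonym alone reveals nothing about the person.

```python
import hashlib
import hmac

# Hypothetical key: in practice this would be held in a separate
# secrets store, never alongside the masked data itself.
MASKING_KEY = b"keep-me-in-a-separate-secrets-vault"

def mask_username(username: str) -> str:
    """Replace a name-based credential (e.g. 'nturner') with a stable,
    non-identifying pseudonym (e.g. 'u' plus 8 hex characters)."""
    digest = hmac.new(MASKING_KEY, username.lower().encode(), hashlib.sha256)
    return "u" + digest.hexdigest()[:8]
```

Because the pseudonym is deterministic, the same user always maps to the same masked credential, so the application still functions normally, while recovering the real identity requires access to the key in the separate system.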

Only data that is truly anonymous is exempt from data protection requirements; data that retains the potential to reveal identities is classified as pseudonymized and is still considered personal data. GDPR does incentivize the use of pseudonymization as part of your security posture to satisfy data protection by design. In the case of a data breach, if the data is unintelligible to any person who is not authorized to access it, certain notification requirements no longer apply. Additionally, data access requests and disclosure requirements are relaxed when pseudonymization is leveraged.

So how does all of this pertain to the software in your infrastructure or in the cloud? For applications where PII is not required as part of using the platform, it is recommended to employ data masking for the user credentials associated with access to the software; and in scenarios where e-mail addresses are needed, to leverage group distribution lists or masked e-mail addresses. That way, in the event of a data breach, there is no direct PII available in that system, and the information is unintelligible because correlating it back to an individual would require access to additional systems.
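The e-mail recommendation above can be sketched as a small lookup service, with the class and alias domain names invented for illustration. The mapping from real addresses to masked aliases would live in a separate, access-controlled system; the application itself stores only the aliases, so a breach of that application exposes no direct PII.

```python
import secrets

class EmailMasker:
    """Hypothetical sketch: holds the alias mapping in a separate,
    access-controlled system, never in the masked application."""

    def __init__(self):
        self._mapping = {}  # real address -> masked alias

    def mask(self, email: str) -> str:
        """Return a stable, non-identifying alias for a real address."""
        if email not in self._mapping:
            alias = f"user-{secrets.token_hex(4)}@masked.example.com"
            self._mapping[email] = alias
        return self._mapping[email]

    def unmask(self, alias: str) -> str:
        """Reverse lookup, possible only with access to this system."""
        for real, masked in self._mapping.items():
            if masked == alias:
                return real
        raise KeyError(alias)
```

An attacker who compromises the application sees only aliases like `user-3f9a1c07@masked.example.com`; correlating them back to individuals requires separately breaching the system that holds the mapping.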

Of course, that is easier said than done, but considering the severity of non-compliance, the work of limiting exposure through data masking is a small price to pay, and one that will benefit your organization in the long run.

Nick Turner is Director, IT Operations, at Zenoss
